

Barbara Klugman (2009) Advocacy Analysis

Page history last edited by Alexandra Pittman 9 years, 7 months ago

Barbara Klugman. 2009. “Less is More: Thoughts on Evaluating Social Justice Advocacy.” Ford Foundation.


Barbara Klugman argues that, in order to understand the complexity of change, M&E models must stop using linear, rationalist, logframe-like models, which do not account for the changing context and actors involved in change processes, and must instead identify an organization’s contributions to change rather than claim attribution. This is particularly important: “Given that policy wins and their implementation are always unpredictable and depend on a wide range of contextual factors and diversity of stakeholders, evaluation of policy advocacy needs to look for strengthened capacity in those factors that are most likely to ensure organizational/social movement readiness and creativity to initiate and engage policy processes in the most effective ways possible” (Klugman 2009:4). Klugman encourages donors to create systems that allow grantees to explore failures and challenges, while facilitating learning processes that produce results over time and across stakeholder groups. Specifically, Klugman suggests integrating Theory of Change models with other tools to most effectively track specific social justice and advocacy outcomes. In particular, she highlights seven advocacy outcomes to measure, derived from a meta-analysis of successful advocacy efforts.[1]


Particular advocacy outcomes that donors should assess include:[2]

  • Strengthened organizational capacity
  • Strengthened base of support
  • Strengthened alliances
  • Increased data and analysis from a social justice perspective
  • Improved policies


Longer-term impacts, which cannot be attributed to a particular grant or set of grants, include: 

  • shifts in social norms 
  • changes in population-level impact (such as decreased violence against women, suicides among LGBT youth, or homelessness). 


At the donor level, Klugman focuses on ensuring that reflection happens both in donor assessment mechanisms and in the use and analysis of grantee reporting. Tools identified as potentially useful for donors’ internal reflection process include: 

  • Grantee Assessment Reports, where grantees assess a donor in comparison to others that support their work (for templates see www.effectivephilanthropy.org), and
  • Learning from Grantee Reporting, where donors create a systematic way to learn from grantee reports so that lessons learned do not go unanalyzed. (The example she presents is the Hewlett Foundation’s annual event, “Best and Worst Grants from which We Learned the Most.”)


Unique tools that Klugman identifies include:


Appreciative Inquiry

Appreciative Inquiry asks questions in ways that build trust and focus on what has worked well rather than on what is not working – for example, ‘think of a time when you were collaborating with another group and felt excited and it went well...’ Good process evaluation questions include (Reisman et al. 2007:34-35):

  • What was a peak moment when you felt best about the campaign/activity?
  • What have you learned that you would share with others doing similar work?
  • Did anything surprise you?
  • What would help you to be more successful?
  • What is one wild idea you have for improving the campaign?


The premise is that “asking questions influences thinking and behavior” (Preskill 2005, cited in Behrens and Kelly 2008:45). The process provides information on outcomes that have been achieved while building bonds among stakeholders. The resulting findings are “often qualitative and in story form, but they can be quite compelling” (Behrens and Kelly 2008:45).


The Action Learning Cycle from Barefoot Collective

The Barefoot Collective’s Planning, Monitoring and Evaluation Cycle uses the following questions to stimulate organizational learning and assessment (Mason 2009:110):

  • Action: What significant things happened? Describe the events. Who was involved, what did they do, what picture emerges? How did I/we feel?
  • Reflection: Why did it happen, what caused it? What helped, what hindered? What did we expect? What assumptions did we make? What really struck us? Do we know of any other experiences or thinking that might help us look at this experience differently?
  • Learning: What would we have done differently? What did we learn, what new insights? What was confirmed? What new questions have emerged? What other theories help us to deepen these learnings? What guidance do we get for the future?
  • Planning: So what does this mean for practice? What do we want? What do we want to do, to happen? How? What are we going to do differently? What do we have to let go of or stop doing? How will we not repeat the same mistake? What steps will we use to build these new insights into our practice?


Embedded in this methodology is attention to actions within and outside the organization, alliance, or coalition that foster or constrain strategies or change processes.


[1]  See Annex 2 for a table Klugman created to highlight specific advocacy outcomes.

[2] Klugman (2009:5) notes: “A Foundation would not expect each of its grants to deliver all of the above outcomes, but rather that its mix of grants on one advocacy issue would collectively ensure organizational capacity that supports innovativeness; build an ever-wider base of support and ever broader alliances; enable ongoing research and refining of viable policy options; and engage in policy processes that would maintain past policy gains, enable policy victories and hold government or other implementing agencies accountable.” See Appendix B for a useful example of what these outcomes would look like for a specific organization.



