Balancing expectation and reality: evolving our approach to evaluating advocacy
Monitoring and evaluating advocacy is not easy.
Advocacy is long-term, ever-changing and involves multiple actors; on top of that, policy makers are often reluctant to credit advocates with influencing their decisions.
Supporting community-led advocacy is core to the work of Frontline AIDS. How to evaluate and learn from advocacy is something we’ve been exploring for the past 15 years. Our approach was originally set out in Measuring Up: HIV-related advocacy evaluation training pack (2010). This emphasised the importance of developing a map of the advocacy strategy (a theory of change), and of negotiating with donors to design a realistic and useful advocacy evaluation.
Then, through the five-year programme PITCH (Partnerships to Inspire, Transform and Connect the HIV response), we tried out a range of approaches in a diverse set of contexts with community advocates and activists: from sex workers in Myanmar, to people who use drugs in Nigeria, to LGBT organisations in Uganda. Our learning was summarised in Measuring Up: Learning from Practice (2021), which reflected on how to juggle global and country-level theories of change, on the importance of developing realistic ‘advocacy asks’, and on the challenges advocates face in completing and regularly reflecting on advocacy logs to strengthen their work.
Monitoring in advocacy is not a typical quantitative process done by monitoring and evaluation staff. Alongside the work of external evaluators, advocates themselves need to integrate monitoring activities into their advocacy work. However, all too often data generated as part of M&E processes mainly serves the interests of international development organisations and donors. At Frontline AIDS we have worked to address this by consulting advocates on the data that they need to strengthen their advocacy and supporting their use of that data.
In the United for Prevention project in 2023, we adopted a proactive learning approach. We hired the evaluation team (Southern Hemisphere) early on so that they could accompany partners throughout the project. This meant that advocates were supported in action learning to enhance their advocacy practice, with monthly online meetings dedicated to learning.
Completing advocacy logs, which capture significant moments of change in real time, remained core to our approach in United for Prevention. We asked advocates to track both progress and setbacks; when analysed, these logs provide insights into the effectiveness of different advocacy strategies and the responses of diverse stakeholders and advocacy targets.
Many advocates are used to sharing updates like these via social media, but channelling them into a more formal system (whether online or paper-based) still requires a lot of proactive support. We have been using our online monitoring system, built in DHIS2 (the world’s largest health information management system), as a platform for these advocacy logs; DHIS2 also offers a mobile phone app, which is helpful for advocates on the go.
There are different approaches to how the logs are kept up to date – some advocates prefer the freedom to make updates at times that suit them, while others prefer to do this as part of regular online meetings with the M&E team (though this can be resource intensive). This kind of tool generates useful insights on the effectiveness of advocacy strategies, as well as evidence that can support the work of evaluators when applying an Outcome Harvesting approach, which we use more widely in our work.
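For readers who want a concrete picture of what an advocacy log entry might capture, here is a minimal illustrative sketch in Python. The field names, example values and the idea of serialising entries to JSON are our own assumptions for the purpose of illustration; they are not Frontline AIDS’s actual data model or DHIS2 configuration.

```python
from dataclasses import dataclass, field, asdict
from datetime import date
import json


@dataclass
class AdvocacyLogEntry:
    """One significant moment of change, recorded close to when it happened.

    Field names are hypothetical, not Frontline AIDS's actual configuration.
    """
    recorded_on: date
    advocate_org: str
    advocacy_ask: str                 # the realistic 'ask' this entry relates to
    target: str                       # stakeholder or decision-maker being influenced
    what_changed: str                 # the observed change, positive or negative
    is_setback: bool                  # setbacks are logged alongside progress
    strategies_used: list[str] = field(default_factory=list)
    evidence_links: list[str] = field(default_factory=list)


# A purely hypothetical example entry.
entry = AdvocacyLogEntry(
    recorded_on=date(2023, 9, 14),
    advocate_org="Example community network",
    advocacy_ask="Include an HIV prevention budget line in the national health plan",
    target="Ministry of Health planning department",
    what_changed="Ministry invited the network to comment on the draft plan",
    is_setback=False,
    strategies_used=["coalition letter", "bilateral meeting"],
    evidence_links=["https://example.org/meeting-minutes"],
)

# Serialise the entry for submission to an online monitoring system (for
# example as an event in DHIS2) or for offline analysis by the evaluation team.
print(json.dumps(asdict(entry), default=str, indent=2))
```

Recording setbacks as explicitly as progress, as in the sketch above, is what allows the analysis to compare the effectiveness of different strategies rather than only documenting wins.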
In monthly meetings, Frontline AIDS worked with the evaluation team and advocates to nurture a culture of honest reflection: generating evidence of achievements, but also identifying opportunities to learn from innovative practices and from moments when things simply didn’t go to plan. Often, setbacks sparked important discussions about obstacles such as unexpected changes in government positions, elections, or the introduction of oppressive legislation.
Civil society organisations work to influence stakeholders while navigating a political and socio-economic context that can shift dramatically. Successful advocacy usually depends more on adaptability than on sticking to the blueprint of initial plans.
However, donors’ accountability requirements and advocates’ need for adaptability are not natural bedfellows. As we look to ease this tension and align the interests of donors and advocates, a common commitment to adaptive learning and periodic re-strategising is essential. Core to this is ensuring that advocacy strategies are living documents, with changes documented and used as part of the evaluation narrative.
No matter how beautifully designed our M&E plan is, it needs to work in practice for advocates who are under pressure. High workloads, too few staff and competitiveness around funding are common themes. We see our approach to advocacy MEL as part of a longer process of deepening our understanding of how change happens, the role of different actors in that change, and what tools and approaches best support advocates to explore and learn.
To read more, see the United for Prevention learning brief.