What to consider

How do you embed monitoring, evaluation, and learning (MEL) into organization design? How do you demonstrate the value of MEL to your organization and stakeholders so that it becomes effective and effects change?  

“MEL” has become common parlance in modern, purpose-driven organizations in the social impact sector. But despite MEL tools being used far and wide, not all users understand their usefulness for achieving organizational goals. Absent this understanding, organizations often implement MEL systems that create new bureaucracies requiring great time and expense but not necessarily yielding tangible gains in performance outcomes, resulting in pushback from participants.  

In simple terms, MEL is the study of cause and effect, and acts as a feedback loop between an organization’s actions and the desired behaviours of its stakeholders. A great MEL system should therefore help you both prove and improve impact using reliable evidence. Prove by generating stories about your impact that can be used to build support to scale up areas of success. Improve by highlighting where you fall short and need a change in approach, and by identifying mistakes to avoid in the future.  

If MEL systems drive excellence in organization performance, what drives excellence of MEL systems themselves? In our work, we have discovered the following eight lessons on effective monitoring, evaluation, and learning strategies:  

  1. MEL is a mirror of your organization or program strategy. Learning from M&E systems helps you understand how well the objectives at the core of your strategy are being met. When initially setting up your MEL systems, treat this as an opportunity to refine your theory of change or program strategy. Determine which goals matter most, and which activities help you achieve them with the time and resources you have. Identify how to measure these efforts – indicators against these goals become the skeleton of your MEL framework. As you process measurement data, constantly think about the strategic implications for the program regarding both immediate improvements and long-term opportunities for scale.    
  2. Measuring social impact is not an exact science, and trial and error is important. A sound M&E strategy requires close understanding of the ground realities of programs and their causal relationship with beneficiaries, the availability of data, the availability of measurement tools, and the capacity of data collectors. Indicators that look thorough on paper need to be constantly refined to balance the trade-off between rigour and practicality. Can factor X be measured directly, or can readily available proxies reveal what you need to know indirectly? Run trials of a data collection process on the ground. Keep what works, tweak what doesn’t. Repeat until the core metrics have successfully yielded actionable insights.   
  3. Consequently, impact metrics are rarely one-size-fits-all. Identify a subset of metrics that are non-negotiable and core to the objectives and values of your philanthropic endeavors. These can be applied across all grants. Have another set that are good to have but adaptable to regional and cultural contexts. 
  4. Process is as important as the outcome. A strong strategy and impact framework is an excellent starting point for developing a shared, consistent language around impact. How each user engages with this framework – from data collection and analysis to synthesis and storytelling – predicts the long-term use of a strong M&E strategy. Ensure users have clarity on their roles and responsibilities in the chain of M&E command, and comfort with technical tools and analytic methodologies. This may necessitate an internal technical assistance budget to train and sensitize stakeholders to new processes. Check that the tools you have mandated are not too cumbersome for participants and don’t distract from their core responsibilities. Consider rewarding and/or compensating participants for the additional effort they put in. You are likely to make back this investment through performance improvements enabled by MEL. 
  5. Ensure everyone is heard through this process. A mature learning organization looks at cycles of feedback constructively, not defensively. Vocal support from top leadership is key, but so is consent and buy-in from people eventually responsible for executing your M&E strategy. Keep your MEL development process open, deliberative, and consultative so a strategy and its processes are not finalized in an organizational silo. Walk grantees through the end goal and the co-benefits of participating in M&E activities. If local agents are expected to increase their efforts in collecting information, they should understand how this helps their own work. For example, a clear impact narrative brings greater visibility to the work of grantees, which can help them raise more funding and increase the scale of their programs. 
  6. Be mindful of culture and privacy concerns. Data collection exercises must respond to cultural sensitivities around privacy, gender, identity, and power dynamics. A mindful and ethical MEL strategy can go a long way in deepening trust with stakeholders. For example, public speaking and data sharing are highly gendered activities in many settings. Ensuring the presence of women data collectors on teams can yield insights from women beneficiaries who would otherwise find it difficult to express sentiments freely. Organizations should employ security measures in data collection, minimize collection of personally identifiable information, and share only aggregated data with partners. 
  7. Good measurement should always lead to action. There is limited value in an evaluation that simply results in a report, even if it identifies areas for improvement. Instead, all measurement activity should result in greater visibility into and clarity on the impact of your work. This requires action-oriented leadership to interpret insights in ways that key stakeholders will understand, disseminate those insights, and translate them into decisions – better design, tools, processes, resourcing, and so on. The final stage of a good MEL strategy is to test it annually against the simple question: what change did it help us effect this year?  
  8. Finally, tell an honest, compelling story. Storytelling is an innately human way of transmitting information in an engaging manner. The insights derived from MEL activities are ultimately intended to inform, educate, and profile those whom your investments have touched. While statistics quantifiably measure change, they are best nested in the context of the people, places, and things your work engages with. Supplement your data with anecdotes, interviews, and audio-visual documentation to bring out the human voice and experience. Think back to the old adages we all learned that ended with “The moral of the story is…” and see if your MEL stories end with a similar “So what?” 

To understand more about how Cicero can help your organization improve its learning processes and deliver effective impact narratives, please contact Jacob Allen.