Bridging the Gap: Strengthening Emerging Evaluators in the M&E Field

The SAMEA Biennial Conference 2024 provided a crucial platform for emerging evaluators (EEs) to engage with established professionals, gain insights into the field, and reflect on their professional growth. Under the theme “VUCA-Vuka! Catalysing Change Through Monitoring and Evaluation,” the conference highlighted the need for adaptive, inclusive, and competency-driven evaluation practices that empower young evaluators. A key focus was on how emerging evaluators can transition into full-fledged evaluators, with sessions exploring networking, skills development, work immersion, and the sustainability of EE programmes.
The conference opened with an interactive session specifically for young and emerging evaluators (YEEs), “Weaving Connections,” facilitated by Dena Lomofsky (Southern Hemisphere) and Mutsa Carole Chinyamakobvu (Data Innovators). This experiential session tackled networking—an essential yet often intimidating aspect of professional development. Rather than viewing networking as a transactional exchange of contacts, the session encouraged YEEs to approach it as an intentional process of building professional relationships. The facilitators emphasised:
• Confidence in initiating conversations and maintaining professional relationships.
• Networking as a competency-building tool, connecting YEEs to mentorship and career growth opportunities.
• Positioning oneself in the field through strategic engagement with experienced evaluators.
This session was particularly impactful, as it reframed networking from an intimidating necessity to an empowering opportunity for professional positioning and growth.
A key session, chaired by Dr. Taku Chirau (CLEAR-AA), focused on the design and sustainability of Emerging Evaluator programmes. Panellists Ms. Samukelisiwe Mkhize (CLEAR-AA), Mr. Stephan Paulsen (SAMEA), and Ms. Dineo Madima (JET Education Services) shared insights into how these programmes can be strengthened for greater impact. The discussion underscored that while EE programmes have created valuable opportunities, they require intentional design improvements to enhance learning, sustainability, and career pathways:
1. Expanding Learning and Skills Development. EE programmes must move beyond general exposure to structured, competency-based learning, ensuring that EEs:
• Are actively involved in real-world evaluation projects, not just observers.
• Develop specialised technical skills through progressive learning models.
• Have access to interdisciplinary career pathways, beyond traditional M&E roles.
2. Enhancing Work Immersion for Meaningful Career Progression. Programmes must ensure that EE placements are structured for long-term skill-building, which requires:
• Extending engagement periods beyond one-year immersions to allow deeper learning.
• Ensuring progressive responsibility, enabling EEs to take on more complex roles over time.
• Implementing tiered remuneration structures that recognise skill advancement.
• Matching EEs with host organisations that provide structured mentorship and growth opportunities.
3. Strengthening Sustainability Through Co-Funding Models. The scalability and longevity of EE programmes depend on multi-stakeholder co-funding, ensuring that:
• Programmes can accommodate more EEs and expand beyond limited pilot initiatives.
• Public-private partnerships create diverse funding streams to support emerging talent.
• The programmes remain financially viable and accessible, particularly for underrepresented groups.
In a session chaired by Ms. Khumo Pule (CLEAR-AA and SAMEA), speakers highlighted the critical role of emerging evaluators in making evaluations more inclusive, participatory, and relevant. The discussion reinforced that young evaluators bring unique perspectives, innovative methodologies, and a strong equity lens that improve the effectiveness of evaluation findings. However, for their contributions to be meaningful, their involvement must go beyond token participation. This calls for:
• Embedding young evaluators in decision-making roles within evaluations.
• Ensuring their perspectives shape evaluation methodologies, particularly in participatory approaches.
• Recognising the value of intergenerational learning, where experienced evaluators mentor EEs while also learning from their fresh insights.
The co-creation session on shaping the Young Emerging Evaluators (YEE) programme, facilitated by Ilona Milner from UNICEF South Africa, sparked insightful discussions on how to strengthen and sustain these initiatives. It illuminated key areas where the programme can grow and improve to better support EEs, fostering their skills, knowledge, and careers. Below are the key learnings and insights drawn from the co-creation process, organised into four categories:
1. Not Doing, but Should Be Doing
The session underscored several vital actions that need to be prioritised for EEs to truly thrive:
• Sharing Case Studies for Collaborative Growth: There is an untapped opportunity to capture and share case studies from the M&E community with emerging evaluators. By presenting real-world examples, we can create a culture of shared learning that accelerates growth and collaboration within the evaluation space.
• Inclusion in Evaluation Teams: Emerging Evaluators should not just be passive participants but actively involved in developing commissioners’ terms of reference and evaluation plans. This inclusion empowers EEs with hands-on experience and helps them grow as valuable contributors to evaluation teams.
• Empowering Networking: While the programme connects EEs with opportunities, we must encourage them to take responsibility for their professional networks. EEs should actively create and attend events, cultivating relationships that will open doors and expand their career prospects.
2. Build – Continue/Strengthen
Several key areas were identified for strengthening existing efforts and building on them:
• Advanced Methodology Training: EEs need to be equipped with more sophisticated technical skills, particularly in advanced methodologies like Propensity Score Matching (PSM), Randomised Controlled Trials (RCTs), and Cost-Benefit Analysis (CBA). This will ensure they are prepared for more complex roles in the future.
• Involvement in Core Training: By involving EEs in training, especially in facilitation roles, we can develop their confidence and skill set, positioning them as future leaders in the field.
• Refining Recruitment: Ensuring that the right EEs are recruited for the right roles is crucial. A more tailored recruitment process that matches candidates with the appropriate level of expertise can increase the effectiveness of the programme and help EEs grow in their respective roles.
• Broadening Opportunities Beyond M&E: While the focus on M&E is important, expanding the scope of opportunities for EEs outside traditional M&E roles will allow them to explore diverse career paths and broaden their professional horizons.
3. Expanded Offerings
As the programme continues to grow, offering more diverse and comprehensive opportunities is critical:
• Online Certification and Soft Skills Training: Offering online certification programmes linked to professionalisation in the field will help EEs gain recognition and credibility. Complementing this with soft skills training, such as communication, leadership, and project management, will provide EEs with a well-rounded skill set that supports their long-term growth.
• Volunteer and Inclusive Opportunities: Creating volunteer placements and inclusive opportunities for all participants, including those with disabilities, will ensure that everyone has a chance to gain practical, hands-on experience in the field.
4. Key Objectives
To ensure the programme remains effective and responsive to the needs of EEs, several key objectives must be prioritised:
• Leadership Development: The programme should focus on cultivating leadership skills in Emerging Evaluators, ensuring they not only gain technical expertise but also the confidence and vision to lead in the evaluation field.
• Mentorship and Institutional Capacity Building: Institutions with limited M&E capacity should be paired with mentors who can provide guidance and support. Tailored programmes that align with the interests and needs of EEs will help enhance their work readiness and set them up for success.
• Continuous Improvement Through Feedback: Gathering regular feedback from EEs through exit interviews and tracer studies will help identify areas for improvement and ensure the programme is meeting its goals effectively.
• Extended Programme Duration: Extending the duration of the programme to 24 months will provide EEs with the time and space needed to develop deeper expertise and tackle more complex evaluation projects.
Several key recommendations emerged to enhance the development and impact of the YEE programme:
1. Strengthen Hands-on Experience: Design work immersion opportunities that progressively build practical skills, ensuring Emerging Evaluators (EEs) leave the programme with the confidence and capabilities to thrive in the field. This could involve active participation in core evaluation roles and applying theory to real-world projects.
2. Support Long-term Growth: Extend the duration of engagements beyond one year, ideally to 24 months, to provide stability and ample time for deeper learning and skill development. This extension will help EEs transition into sustainable careers with a stronger foundation in technical and evaluative skills.
3. Expand Networking and Mentorship: Create opportunities for EEs to connect with experienced mentors, coaches, and sponsors across various sectors. This will open doors to new opportunities, foster professional relationships, and provide personalised guidance tailored to each EE’s career goals.
4. Define Clear Career Progression: Establish a structured roadmap that guides EEs from emerging evaluator to full evaluator, ensuring clear career growth. This roadmap should be designed to bridge the gap between entry-level roles and advanced positions within the evaluation field, providing clarity on how to move forward in their careers.
5. Improve Recruitment Strategies: Ensure that the recruitment process matches EEs with host organisations that are genuinely invested in their learning and professional development. This alignment will foster meaningful growth, helping to cultivate evaluators who are both technically skilled and ready to take on diverse roles in the field.
6. Introduce Tiered Pay Structures and Flexible Programme Lengths: Acknowledge skill development through appropriate remuneration and flexible programme durations. Tiered pay based on experience and education can help motivate and retain EEs, ensuring fair compensation while also adapting the programme’s length to fit each EE’s needs and readiness.
7. Track Progress and Refine the Programme: Implement a robust system for tracking the progress of participants through exit interviews and alumni tracer studies. This feedback will provide valuable insights to continuously improve the programme, assess its effectiveness, and ensure that it meets the evolving needs of both the EEs and the sector.
8. Create More Structured Learning Opportunities: Actively involve EEs in core evaluation functions, such as developing Terms of Reference (TORs) or responding to Requests for Proposals (RFPs), and ensure they receive training in key technical competencies like data management tools and M&E frameworks. This will prepare them for diverse roles and build a solid foundation for their professional development.
The SAMEA Conference 2024 drove home an important point: for Emerging Evaluators (EEs) to truly thrive, the sector must make a conscious effort to invest in their growth. It’s not just about offering exposure—it’s about building competencies, providing mentorship, and ensuring a clear path for career progression. By focusing on these areas, we can cultivate a pipeline of skilled, confident evaluators who will drive inclusive, innovative, and transformative evaluation practices.
The discussions at SAMEA left me feeling both inspired and challenged—not only reflecting on my own journey but also on our collective responsibility to nurture and support the next generation of evaluators.
Article by Ruvimbo Gombakomba – Emerging Evaluator at SAMEA (2024)