Evaluator Competencies: Lessons from the Launch of SAMEA’s Self-Assessment Competency Tool

Blog by Dineo Madiba – Emerging Evaluator at SAMEA (2024)
Abstract
Over the past decade, South Africa has witnessed remarkable growth in the Monitoring and Evaluation (M&E) profession. The development of evaluator competencies has significantly elevated the stature of the profession within the country.
These competencies, developed by the Department of Planning, Monitoring and Evaluation (DPME), the South African Monitoring and Evaluation Association (SAMEA), and scholars, have contributed to maturing evaluation practice.
This paper reflects on the history of South Africa’s M&E sector, the development journey of evaluator competencies, and the launch and pilot testing of SAMEA’s online Evaluator Competencies Self-Assessment Tool during the SAMEA 9th Biennial Conference (2024).
Introduction and Background
The emergence of South Africa’s M&E sector traces back to the early post-Apartheid era. The new democratic government inherited a service delivery system characterised by administrative divisions, disintegrated performance systems, and fragmented data structures (Abrahams, 2015).
The White Paper on the Transformation of the Public Service (1995) first introduced M&E as a strategy to deliver equitable, quality public services. However, the sector was initially underdeveloped, with a heavy focus on monitoring and limited attention to evaluations (Podems et al., 2014).
Early momentum to develop evaluator competencies came from outside government. Founded in 2005, SAMEA became a key actor in promoting evaluation as a profession and building practitioner networks (Abrahams, 2015).
Despite early advocacy efforts, government buy-in for evaluator competencies was initially limited. Over time, the establishment of the Department of Performance Monitoring and Evaluation (DPME) and frameworks like the National Evaluation Policy Framework (NEPF) signalled growing commitment. However, challenges such as capacity constraints persisted (Goldman et al., 2018).
The subsequent collaboration between DPME, SAMEA, and CLEAR AA led to the development of South Africa’s Evaluation Competency Framework (ECF), which standardised evaluation expectations across sectors.
Academics also contributed to these efforts. Most notably, Rhoda Goremucheche, under the guidance of Dr Lauren Wildschut, developed an evaluator competencies framework through a participatory process that involved diverse stakeholders (SAMEA, 2020).
This framework reflects the South African context and the principle of Ubuntu—“I am because of who we all are”—and categorises competencies into five domains:
- Professional Practice
- Technical Practice
- Context
- Managing an Evaluation
- Interpersonal Skills
While significant progress has been made, South Africa continues to lag behind international counterparts like Australia, Canada, and the United States (Gullickson et al., 2024).
SAMEA’s Self-Assessment Competency Tool
At the SAMEA 9th Biennial Conference (October 2024), a new online Evaluator Competencies Self-Assessment Tool was launched under the strand “Competencies of M&E Practitioners as Catalysts of Change in the Time of VUCA.”
The tool aims to:
- Facilitate self-assessment of competencies
- Inform capacity building efforts
- Support professional development among M&E practitioners
Key influences for the tool’s design included the SAMEA competency framework, DPME evaluation standards, and the Australian self-assessment model.
Feedback from the Pilot Test
During the session, participants were given 15 minutes to trial the tool and then provided feedback. Key insights include:
Strengths
- User-friendly and essential: Participants found the tool intuitive and valuable for self-assessment.
- Immediate feedback: The report card feature was well received.
Areas for Improvement
- Length and Repetition: Some participants felt the tool was too lengthy and repetitive, making completion time-consuming.
- Monitoring versus Evaluation: There was concern that the tool overemphasised evaluation competencies while neglecting monitoring.
- Content Gaps: Participants highlighted missing competencies, such as:
  - Evaluative thinking
  - Gender sensitivity
  - Cultural awareness
  - Soft skills (e.g., communication and teamwork)
Concerns About the Rating Scale
Participants questioned:
- How expertise is defined
- Whether years of experience automatically translate into expertise
It was suggested that further refinement is needed to distinguish between experience and actual competency.
Recognising Heterogeneity in M&E Professions
The tool was criticised for assuming a one-size-fits-all model. Participants recommended recognising that different M&E roles (e.g., programme evaluators, data analysts) require different sets of competencies, necessitating greater flexibility in the tool.
Capacity Building Opportunities
While the tool identifies competency gaps, participants suggested linking the assessment outcomes to actionable learning pathways (e.g., targeted training programmes) to support professional growth.
Conclusion
The development and launch of SAMEA’s Evaluator Competencies Self-Assessment Tool marks a significant milestone in professionalising M&E practice in South Africa.
The feedback from pilot testing provides valuable insights for refining the tool to better reflect practitioner realities. Improving face and content validity, incorporating monitoring competencies, and acknowledging diverse career pathways will strengthen the tool’s relevance and impact.
Ultimately, the tool represents an important step towards enhancing the credibility of the M&E profession and promoting continuous professional development within South Africa’s evolving evaluation landscape.