M&E Health Check

In practice, generating relevant, accurate and timely monitoring and evaluation (M&E) information involves repeated iterations of an “M&E Data Cycle” comprising the following stages:

  • Identify: nominating/defining the nature of the information required by the various internal and external stakeholders
  • Capture: obtaining the identified data in practical and efficient ways
  • Analyse: subjecting the captured data to appropriate and meaningful analysis
  • Disseminate: ensuring that meaningful information relevant to the needs of the various stakeholders is supplied in a timely and useable form
  • Utilise: implementing responsive decision-making, critical inquiry and reflection, and transparent disclosure in order to foster accountability and learning
  • Assess: reviewing and judging the extent to which the identified M&E data has indeed fostered improved performance (i.e. M&E of M&E—or ‘meta-M&E’)

The M&E Data Cycle should be defined for each level in the project design logic (i.e. each stage of the ‘theory of change’).

Each stage of the M&E Data Cycle is potentially problematic for organisations implementing aid projects, and weakness at any stage erodes the anticipated benefits of M&E. The following assessment tool can be used to pinpoint areas of weakness at each stage of the M&E Data Cycle.

Each question is scored against the following criteria:

(1) Consistently fails to meet expectations
(2) Occasionally fails to meet expectations
(3) Consistently meets expectations
(4) Occasionally exceeds expectations
(5) Consistently exceeds expectations


Identify

1. A coherent conceptual framework (e.g. logframe) is applied to ensure that all M&E data needs are comprehensively met
2. The project implementation team is involved in defining the M&E system
3. Direct beneficiaries (‘boundary partners’) of the project are involved in defining the M&E system
4. Ultimate beneficiaries of the project are involved in defining the M&E system
5. Collaborating/strategic partners are involved in defining the M&E system
6. Data is identified to verify the quality of project deliverables
7. Data is identified to verify the efficiency of project implementation work
8. Data is identified to verify the efficacy of project interventions
9. Data is identified to verify the effectiveness of the approaches/strategies that underpin project designs
10. Data is identified to verify the social, economic and ecological sustainability of interventions
11. Data is identified to determine the prevalence and impact of risks affecting the work of the implementation team (‘management risks’)
12. Data is identified to determine the prevalence and impact of risks affecting the initial success/uptake of initiatives by direct beneficiaries (‘intervention risks’)
13. Data is identified to determine the prevalence and impact of risks affecting the overall impact of the project among the ultimate beneficiaries (‘development risks’)
14. An M&E plan identifies the sources/subjects of all M&E data
15. An M&E plan defines the persons/roles responsible for capturing the identified data
16. An M&E plan defines the nature of the analysis to which all captured data will be subjected
17. An M&E plan clearly articulates the persons/roles to which analysed data will be supplied/disseminated
18. An M&E plan defines the schedule for data capture, analysis and dissemination
19. An M&E plan clearly articulates how all M&E data will be utilised


Capture

20. Appropriate/informed ‘subjects of inquiry’ (data sources) are sampled
21. Data capture tools and protocols are unambiguous for all people involved
22. Data capture tools and protocols are simple to implement
23. All staff have been adequately trained to comply with reporting requirements
24. Staff compliance with defined data capture requirements is strong
25. The time and effort invested by staff in data capture and reporting are appropriate
26. Where data is captured directly from beneficiaries/communities, the methods used are ethical and unimposing


Analyse

27. All captured data is analysed promptly
28. Key persons/roles are assigned responsibility for data analysis
29. The persons/roles who analyse data have the necessary skills
30. Analysis is accurate
31. Analysis draws out meaningful features and trends in the data
32. Analysis includes identification of variance between planned and actual performance
33. Analysis examines both states (absolute measures) and trends (relative changes)
34. Lessons learned are systematically documented


Disseminate

35. Analysed data is disseminated to pre-defined stakeholders
36. Analysed data is disseminated promptly, within agreed timeframes
37. Disseminated data is in a relevant/accessible format
38. Reporting staff receive feedback on the data they capture
39. Feedback includes value-adding analysis, such as trend analysis
40. Analysed data is segmented to ensure relevance to data clients


Utilise

41. M&E findings inform operational decisions (project implementation)
42. M&E findings inform tactical decisions (program management)
43. M&E findings inform strategic decisions (organisational direction)
44. Key persons/roles are held accountable for utilising disseminated data
45. Individuals within the organisation reflect on M&E findings
46. Teams within the organisation collectively reflect on and debate M&E findings
47. Plans are adapted/modified in response to M&E findings
48. Project M&E findings are shared with other projects to promote organisational learning
49. Feedback of analysed data is used to inform field-based practice
50. M&E findings are used to inform the planning of future projects
51. M&E findings are used to comply with stakeholder reporting requirements
52. M&E findings are widely available to ensure transparency


Assess

53. Time and resources are set aside to periodically review the M&E system
54. The time and resources consumed by M&E processes are reviewed
55. The extent to which M&E data is serving its intended purpose is reviewed
56. The extent to which M&E findings are utilised to improve accountability and learning is reviewed
57. M&E tools and protocols are periodically revised/improved
58. The usefulness of M&E findings to each class of data client is periodically reviewed
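Once the questionnaire has been completed, the ratings can be aggregated to highlight the weakest stages of the M&E Data Cycle. The following Python sketch is one possible (hypothetical) way to do this, not part of the original tool; the question-to-stage groupings follow the sections above, and the `stage_scores` helper name is an assumption.

```python
# Hypothetical scoring helper for the M&E Health Check.
# Averages the 1-5 ratings for each M&E Data Cycle stage so that the
# weakest stages can be identified at a glance.

# Question-to-stage groupings, following the sections of the health check.
STAGE_QUESTIONS = {
    "Identify":    range(1, 20),   # Q1-19
    "Capture":     range(20, 27),  # Q20-26
    "Analyse":     range(27, 35),  # Q27-34
    "Disseminate": range(35, 41),  # Q35-40
    "Utilise":     range(41, 53),  # Q41-52
    "Assess":      range(53, 59),  # Q53-58
}

def stage_scores(ratings):
    """ratings: dict mapping question number (1-58) to a score of 1-5.
    Returns each stage's mean score, sorted weakest stage first."""
    means = {}
    for stage, questions in STAGE_QUESTIONS.items():
        scores = [ratings[q] for q in questions if q in ratings]
        if scores:
            means[stage] = sum(scores) / len(scores)
    return dict(sorted(means.items(), key=lambda kv: kv[1]))

# Example: every question rated 3, except weak dissemination (Q35-40 rated 2).
ratings = {q: 3 for q in range(1, 59)}
ratings.update({q: 2 for q in range(35, 41)})
print(stage_scores(ratings))  # "Disseminate" is listed first, with mean 2.0
```

Unanswered questions are simply skipped, so the helper also works for a partially completed questionnaire.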