- Number of slides: 77
The process of developing a monitoring and evaluation system WELCOME!! Southern Hemisphere Consultants Nana Davies, Dena Lomofsky and Andries Mangokwana Phone: +27 21 422 0205, Fax: +27 21 424 7965, PO Box 3260, CAPE TOWN, 8000, www.southernhemisphere.co.za
Workshop outcome Having completed the course the participant will have: Knowledge of the difference between monitoring and evaluation (M&E) An understanding of the importance of integrating monitoring, evaluation and reporting into project planning, and of where they fit into the project cycle Basic knowledge of how to develop objectives, outputs, activities, indicators and means of verification
Workshop outcome Basic knowledge of data collection and analysis techniques, and their relationship to monitoring and evaluation Basic knowledge of how to develop a monitoring and evaluation system Knowledge of what process to follow to develop a participatory monitoring and evaluation system Shared experience, challenges and learnings on how to develop a monitoring and evaluation system
Workshop outcome Participants will also be able to: Identify and develop “SMART” objectives and “DVARP” indicators Develop a simple monitoring and evaluation framework
The PURPOSE of monitoring & evaluation is to ASSESS… Common understanding of project activities, strategies and timelines Ongoing project activities Whether or not the project is being conducted as planned Extent to which the objectives have been met Whether these objectives are contributing towards achieving the stated Development Objective Impact of the project on individuals, organisations and communities Whether or not the project is addressing the needs of the stakeholders
Benefits of Monitoring and Evaluation are to improve or inform: Planning and implementation Evidence-based policy Decision making Learning from experience Accountability and transparency Capacity building - identifying gaps and needs for skills development, mentoring, etc.
Definitions - Monitoring & Evaluation MONITORING IS … the continuous, methodical process of data collection & information gathering throughout the life of a project so that corrective action can be taken EVALUATION IS … a systematic & objective process that periodically assesses a project against certain standards of acceptability.
Planning, monitoring & evaluation in the project cycle (diagram; stages include): IDEA, CONCEPTUALISATION, NEEDS ASSESSMENT, FEASIBILITY STUDY, PROJECT DESIGN, PROJECT DEVELOPMENT, IMPLEMENTATION, MONITORING, EVALUATION, REPORTING
What is a Monitoring & Evaluation System? Has 3 main elements: 1) M&E Plan 2) Supporting M&E documents (development of forms, reporting formats and guidelines for the M&E plan) 3) Institutional arrangements
Components of an M&E system
Elements of a monitoring plan (diagram): Objectives, Indicators, Information use, Data collection methods, Data analysis methods, Reporting methods, Budget - all feeding into the Monitoring & Evaluation Plan
The Monitoring & Evaluation Plan… Is a tool used to plan and manage the collection, analysis and reporting of data related to an indicator Consists of the following elements: Stated SMART objective(s) (relevant to project level) Clear, verifiable and reliable indicators directly related to the objective Source of the information on the indicator (MoV/form) Identification of who will be responsible for completing the MoV (who will provide it?)
Elements of an M&E plan continued… Person responsible for collecting the MoV (who will collect it?) Frequency of data collection (how often and when) Person responsible for analysing the data (who will analyse it?) Person responsible for reporting on the data (who will report?) Method of analysis (how will it be analysed?) Disaggregation? To measure whether the indicator (and ultimately the objective) has been achieved Using which analysis tools, e.g. SPSS, NVivo?
Elements of an M&E plan continued… Person responsible for reporting Reporting requirements (who will need it, for what purpose and how often?) Reporting (how will I present the information?) Usage (how will the information be used?) Budget (estimate the costs of collecting, analysing, and reporting performance data for each specific indicator or group of indicators. Specify the source of funds.) REMEMBER TO UPDATE PLANS AS NEEDED TO ENSURE SCHEDULES & ASSIGNMENTS REMAIN CURRENT & REFLECT PROGRAMME/PROJECT ACTIVITIES.
Why are Monitoring and Evaluation Plans important? Planning, managing & documenting data collection Ensures that comparable data (data that you can compare) will be collected on a regular basis Keeps information constant and available even when personnel change Timely collection of data
Why are Monitoring and Evaluation Plans important? … Assigning responsibilities Helps to think through all the steps that make up your M&E system, e.g. objectives, indicators, data gathering techniques etc. Helps to keep you ON TRACK, and make sure that the data collected is USEFUL for DECISION MAKING.
7 Components Of Good M&E Design 1. Clear statements of measurable objectives, for the project and its components, for which indicators can be defined 2. A structured set of indicators, covering outputs of goods & services generated by the project & their impact on beneficiaries 3. Provisions for collecting data and managing project records, so that the data required for indicators are compatible with existing statistics & are available at reasonable cost 4. Institutional arrangements for gathering, analysing & reporting project data, & for investing in capacity building to sustain the M&E service 5. Learning: proposals for the ways in which M&E findings will be fed back into decision making 6. Participation: transparency of decision making & relationships 7. Be aware of diversity
(Diagram: INFLUENCE AREA vs CONTROL AREA)
Results based (diagram): Goal and Outcomes form the Results Area (influence area), aligned to the strategic plan and organisational vision and mission. Outputs, Activities and Inputs form the Implementation Area (control area), aligned to Operational Plans and Annual Performance Plans.
Control and influence (diagram): Goal / Impact and Outcomes are beyond direct management control - what the project is contributing towards (influence and accountability). Outputs, Activities and Inputs are within direct management control - what the project can be reasonably accountable for achieving (control).
The LFA Matrix Columns: 1. Description 2. Verifiable Indicators 3. Means of verification 4. Assumptions Rows (project elements): Development Objective; Immediate Objective (assumptions: from Immediate objective to Development objective); Outputs (assumptions: from Outputs to Immediate objective); Activities (assumptions: from Activities to Outputs); Inputs
Setting Objectives Step 5 Goal / Impact (Also called Development Objective, Long-term or Overall Objective). A higher level objective to which the Outcomes are expected to contribute Must represent sufficient justification for the project Specifies the benefits which the beneficiaries will enjoy The beneficiary group is often defined
SMART objectives: Specific, Measurable, Achievable, Realistic, Timebound
Outcomes (Also called Purpose, Immediate Objective or Short-term Objective). The objective to which the immediate project results are expected to lead. Denotes a specific change to be achieved at and after the conclusion of the project. Expresses the action that the target group will take in order to bring about the desired change. Describes a change in the target group’s behaviour, resulting from its use of the services or products provided by the project. Must be realistic, i.e. it is likely to occur once the project outputs have been produced. Formulated as a desired state, not as a process. It must be formulated as SMART.
WHAT ARE YOUR Outcomes? Do they: Specify the changed situation you want to achieve by end of project? Specify the target group? Read as SMART… Specific, Measurable, Achievable, Relevant, Time bound
Outputs The results needed to obtain the Outcomes; can be guaranteed by the project as a consequence of its activities. Formulated as an end result, not as an action. Precisely and verifiably defined. SMART. The Outputs should be fully achieved by the project management.
Activities The necessary tasks in order to produce the desired outputs. The activities must be stated in terms of actions (to train…, to lobby…, etc.). All essential activities performed by the project should be included. All activities must be connected to an output. Inputs The project resources, i.e. goods, services and personnel necessary for the activities to take place and the outputs to be produced.
Exercise: Define the description column of the matrix for your project.
Indicators and the Project Planning Matrix (PPM) (LFA) Level of objectives - what M&E is interested in: Development Objective - Impact (BENEFITS of sustained adoption) [influence] Immediate Objective - Impact (reaction of beneficiaries to what the project is delivering, which also gauges the effects of adoption) [influence] Outputs / Results - Effectiveness (the effective delivery of project services or interventions) [control] Activities - Effectiveness, efficiency (with which the project undertakes activities) [control] Inputs - Effectiveness, efficiency [control]
Step 7 Indicators can be seen as signs or markers that tell us how we are progressing to meet our objectives. Are we on track? Indicators MUST be: Directly related to the objectives Verifiable (Prove it) Adequate (Enough but not too many) Practical to measure (simple admin and research methods) Reliable (measure what you want to measure)
When developing indicators, take each objective & output (results) & ask yourself: HOW DO I KNOW IF THINGS ARE GETTING BETTER OR WORSE? WHAT QUESTION DO I NEED TO ASK IN ORDER TO FIND OUT IF THE OBJECTIVE HAS BEEN MET? INDICATORS CAN BE: Quantitative (how much, how many) Qualitative (texture, how things are)
QUANTITATIVE EVALUATIONS TEND TO STRESS: Objective numbers that ‘speak for themselves’ Controlled measurement Quantifiable results Numbers (600 people trained), percentages (percentage of participants graduating), rate (decrease in the infant mortality rate), ratio (one teacher per 30 students) QUALITATIVE EVALUATIONS TEND TO STRESS: Observations that show changes in situations, behaviour, feelings & attitudes Observations about processes Interpretations of situations Monitoring and evaluation often use both qualitative and quantitative indicators.
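The four quantitative forms named above (number, percentage, rate, ratio) can each be computed directly. A minimal sketch in Python; every figure below is invented purely for illustration:

```python
# Hypothetical figures illustrating the four quantitative indicator forms.
trained = 600                                         # number: people trained
graduated, participants = 450, 600
pct_graduating = 100 * graduated / participants       # percentage graduating
deaths_before, deaths_after, births = 40, 30, 1000
imr_decrease = (deaths_before - deaths_after) / births * 1000  # rate: per 1,000 live births
students, teachers = 600, 20
students_per_teacher = students // teachers           # ratio: one teacher per N students

print(pct_graduating)        # 75.0
print(students_per_teacher)  # 30
```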
HOW MUCH DETAIL OR INFORMATION? When planning the information needs of a project, there is a difference between day-to-day management & monitoring of project processes (where detail is needed) and the limited number of key indicators needed to summarise overall progress.
For example During construction of village tubewells, project managers will need to keep records about: The materials purchased or consumed The labour force employed & their contracting details The specific screen & pump fitted The depth at which water was found, & the flow rate The key indicators might be: the number of wells successfully completed, their average costs, and flow rates.
Selecting indicators Not practical to use ALL indicators The key indicators should be selected on the basis of: their ability to best reflect the attainment of the objective of the evaluation the perceptions of the stakeholders You may have to adopt a process of negotiation between stakeholders to finalise indicators.
Setting Baselines A baseline is the measure on an indicator prior to the implementation of a programme/project. Why baselines? Provides an indication of the situation before an intervention Allows one to aim for a realistic target against which progress can be measured Challenges: Not always practical and appropriate to measure (e.g. for qualitative indicators) May be difficult to obtain, and may require an evaluation = more resources needed
Setting targets A target is the desired measure on an indicator that you are aiming to achieve after your programme/project has been implemented or has had the desired impact. Why set targets? Guides towards achieving a goal/outcome/impact/output Motivates you Gives details on numbers that need to be achieved within a time period When setting realistic targets one should consider: the baseline measure current resources available and what can be achieved within the constraints finding a balance between being ambitious and setting targets that are easily achievable
Recording Current Status The measure on an indicator at a specific point in time when information on the indicator is collected. Why record current status? Allows one to gauge how far along you are to achieving target Makes current status of the indicator readily available It can be used to motivate staff to see what has been achieved thus far, and how far they still have to go
What is the difference between BASELINE STUDY & SITUATIONAL ANALYSIS??? Situation analysis: broader study of the problems related to the programme intervention Baseline study: collects very specific measures on indicators before an intervention A baseline study and situation analysis can be combined
What is the difference between TARGET & BENCHMARK??? A benchmark is a standard against which one can assess achievements. This standard can be based on what has already been achieved by other similar organisations. A benchmark is thus a tool used for setting targets.
What is the difference between TARGET & MILESTONE??? Target: the desired measure on an indicator at the end of the project. Milestone: determines how far along one should be at different points in time during the project; sub-targets indicating the planned progress at different points in time.
Indicator Matrix with baseline and target Columns: Objective / Output | Indicator | Disaggregation | Baseline | Target | Current status Rows: Goal / Impact; Outcomes; Output 1; …
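The baseline, target and current-status columns of the matrix support a simple progress calculation: how much of the distance from baseline to target has been covered. A minimal sketch, with a hypothetical tubewell indicator as the example row:

```python
def progress(baseline, target, current):
    """Fraction of the baseline-to-target distance covered so far.

    Works for both rising indicators (e.g. wells completed) and falling
    ones (e.g. infant mortality rate), since the sign cancels out.
    """
    if target == baseline:
        raise ValueError("target must differ from baseline")
    return (current - baseline) / (target - baseline)

# Hypothetical row from the indicator matrix.
row = {"indicator": "tubewells completed", "baseline": 0, "target": 50, "current": 20}
print(progress(row["baseline"], row["target"], row["current"]))  # 0.4
```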
Means of Verification How are we going to collect the information on the indicator? Where can we find the indicator? Establish at least one MoV for each indicator Can the indicator be realistically measured with reasonable time, money & effort?
MoVs and data gathering techniques PROJECT DOCUMENTS (MoV) attendance registers field officer reports time sheets financial record systems mailing lists minutes of meetings test scores DATA GATHERING TOOLS Focus groups Interviews (in-depth, semi-structured, structured) Self-completion surveys Quali-quant surveys Workshops Observation Plays
Data gathering techniques Quantitative information: 1. Structured questionnaires 2. Tests 3. Existing databases When to use: need numbers (percentages, ratios); need statistically reliable information; need large sample sizes; need to track, rate, rank, score Qualitative information: 1. In-depth interviews 2. Semi-structured interviews 3. Focus groups 4. Workshops When to use: need to understand how, what, why; want stories, quotes, case studies; want to explore
Planning for data collection For each of the indicators developed in your indicator tool, write down the means of verification Columns: Indicator | How will I get the information? (MoV) | Who will provide it? | Who will collect it? | When will it be collected?
Plans for data analysis Plan in advance how data for indicators will be analysed. In the “How to analyse” column describe: The data analysis techniques & data presentation formats (e.g. SPSS, NVivo) The content of your analysis (compare objective with indicator and disaggregation) Consider if & how the following aspects of data analysis will be undertaken: Comparing disaggregated data - plan how it will be compared, displayed, & analysed (e.g. gender breakdown, age breakdown)
Comparing current performance against: past performance planned or targeted performance other relevant benchmarks Analysing relationships among indicators, e.g.: How will cause-effect relationships between an objective & a targeted results framework be analysed? How will a set of indicators (if there is more than one) for a particular objective be analysed to reveal progress? How will activities be linked to achieving objectives and results? Analysing cost-effectiveness. When practical & feasible, plan for using data to systematically compare alternative programme approaches in terms of costs as well as results.
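The disaggregation step described above (e.g. a gender or age breakdown) can be sketched with Python's standard library alone; the record fields and the "completion" indicator here are hypothetical examples, not taken from the workshop's case study:

```python
from collections import defaultdict

# Hypothetical records transcribed from an attendance register (a typical MoV).
records = [
    {"sex": "F", "age_group": "18-35", "completed": True},
    {"sex": "F", "age_group": "36+",   "completed": False},
    {"sex": "M", "age_group": "18-35", "completed": True},
    {"sex": "M", "age_group": "18-35", "completed": False},
    {"sex": "F", "age_group": "18-35", "completed": True},
]

def completion_rate_by(records, variable):
    """Percentage of participants completing, disaggregated by one variable."""
    totals, done = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r[variable]] += 1
        done[r[variable]] += int(r["completed"])
    return {group: round(100 * done[group] / totals[group], 1) for group in totals}

print(completion_rate_by(records, "sex"))        # {'F': 66.7, 'M': 50.0}
print(completion_rate_by(records, "age_group"))  # {'18-35': 75.0, '36+': 0.0}
```

The same pattern extends to comparing current against past or targeted performance: compute the disaggregated rates per reporting period and place them side by side.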
Plan for Data Analysis Columns: Indicator | Means of Verification | Who will analyse it? | How will it be analysed? (disaggregated data; variables e.g. sex, age) | When will it be analysed?
Planning for reporting To complete the cycle of data collection, the results must be communicated and reported to the relevant stakeholders. It is useful to plan for reporting in terms of: Who will need what information? Different stakeholders have different information needs, e.g. the project manager. How will it be presented to them? e.g. quarterly and annual reports for project managers, fieldworker reports, external evaluation report, etc. Who will present it to them? How often? e.g. annually, quarterly, monthly, start of project, end of project.
Planning for reporting Columns: Indicator | Means of Verification | Who will need the report? | How will it be presented? (Report format) | Who will present it? | When will it be presented?
Monitoring and evaluation system A monitoring and evaluation plan is A tool used to plan & manage the collection of data and plans for analysis, reporting & use
Elements of a monitoring plan (diagram): Objectives, Understanding of stakeholders, Definition of indicators, Data collection methods, Data analysis methods, Reporting methods, Budget - all feeding into the Monitoring Plan
Remember to update your plans & schedules as needed to ensure your assignments remain current & reflect the programme / project activities.
Supporting M&E documents Consists of forms, reporting formats and guidelines for the M&E plan Purpose is to “make the M&E Plan alive” Start by making lists of the forms & reporting formats that must be developed (from the MoV column in the M&E Plan)
Exercise: In groups look at the case study’s M&E plan and develop lists of forms and reporting formats. Step 1: Look down the MOV column and make a list of the forms as they appear Start from Outcome section, not goal Step 2: Number the forms Step 3: Write down the indicator number
Step 4: Design of forms Forms are the MoV, i.e. the source of the information on the indicator, that need to be completed When you develop the form, look at the following columns in the M&E plan: The actual indicator Who should fill in the form (from the name of the form) How the data should be analysed (disaggregation) Design a standardised header for all forms including information such as: the form title, who will gather the form, supporting documents (e.g. should anything else go with the form, or should it be attached to another form or report format?)
Exercise: In groups and based on the case study, develop 1 of the forms
Step 5: Design of reporting formats A reporting format consists of a range of combined forms Once you have developed all the forms, you need to develop the reporting formats It often helps to draw which forms lead to which reporting formats
Step 5…continued The person responsible for reporting can inform the naming of the reporting format Look at the relevant forms that lead to the reporting format
Step 6: Guideline for the M&E Plan Why is it important? The M&E Plan can be confusing, complicated to understand and difficult to access, and can be alienating for those who encounter it for the first time A guideline document ensures that those responsible for implementing the system can understand the indicators, and that data will be analysed and interpreted in standard ways What is its purpose? To ensure that the data related to the indicator is calculated and measured in the same manner every time it is reported on.
Step 6…continued What should it contain? Indicator definitions and analysis guidelines Description of tools Any other information relevant to the M&E plan that may have been developed, e.g. contact details, relevant policy / strategy documents etc.
Indicator definitions and analysis guidelines Should contain: Glossary of common terms (e.g. HIV and AIDS, Facilitator) Description of the indicator Definition of the indicator Relevance of the indicator Types of disaggregation Source of the indicator Methodology for obtaining the data Formula (or how to calculate it) Possible problems in obtaining the data or limitations of the methodology
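One way to make the "Formula" entry of a guideline unambiguous is to write it as executable code rather than prose. A minimal sketch for a hypothetical indicator, "percentage of facilitators trained" (the indicator name and figures are invented for illustration):

```python
def pct_facilitators_trained(trained: int, total: int) -> float:
    """Hypothetical indicator formula: (facilitators trained / total facilitators) x 100.

    Writing the formula this precisely ensures the indicator is calculated
    the same way every reporting period, as the guideline requires.
    """
    if total == 0:
        raise ValueError("total facilitators must be greater than zero")
    return 100 * trained / total

print(pct_facilitators_trained(45, 60))  # 75.0
```

Pinning down the denominator and the zero-case in code also documents the "possible problems" entry (e.g. what happens when no facilitators have been registered yet).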
Components of an M&E system
Step 7: Institutional Arrangements What are institutional arrangements? Roles and responsibilities Electronic database system Budgets and resources Capacity building Continuous improvement of the system
Roles and responsibilities Planning for data gathering, analysis and reporting begins with identifying roles and responsibilities These need to be formalised and included in the Key Performance Areas of individuals and units
Electronic database system Data gathered by M&E activities can be managed using an electronic database Design of the database is informed by the data gathering, analysis and reporting plan The M&E plan needs to be developed first BEFORE the electronic system can be designed Allows for systematic and speedy management of large amounts of information Can assist with generating automatic reports The M&E team needs knowledge and skills to manage and maintain the database
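A minimal sketch of such a database, using Python's built-in sqlite3 module. The table and column names are hypothetical, derived from the M&E plan columns (indicator, reporting period, disaggregation variable, value); a real system would be designed from the organisation's own plan:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # use a file path in practice
conn.execute("""
    CREATE TABLE indicator_readings (
        indicator TEXT NOT NULL,   -- indicator name from the M&E plan
        period    TEXT NOT NULL,   -- reporting period, e.g. '2024-Q1'
        sex       TEXT,            -- disaggregation variable
        value     REAL NOT NULL
    )
""")
conn.executemany(
    "INSERT INTO indicator_readings VALUES (?, ?, ?, ?)",
    [("people trained", "2024-Q1", "F", 120),
     ("people trained", "2024-Q1", "M", 90)],
)

# Automatic report: totals per indicator and period, disaggregated by sex.
for row in conn.execute(
    "SELECT indicator, period, sex, SUM(value) "
    "FROM indicator_readings GROUP BY indicator, period, sex"
):
    print(row)
```

The GROUP BY query is what "generating automatic reports" amounts to here: once readings are captured against the plan's indicators, disaggregated summaries come out of a single query.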
Budgets and Resources The organisation needs to have the correct skills for monitoring and evaluation (can be internal or external) The M&E function needs to have the following skills: working knowledge of M&E - can be acquired once appointed theoretical knowledge of M&E - acquired from short courses or formal qualifications understanding of government systems and procedures research skills (particularly data analysis skills) database skills capacity building and facilitation skills for training, mentoring and coaching people in the use of the system
Capacity building for M&E Everyone in the organisation should understand the importance of monitoring and evaluation, as they: will be the ones who will have to maintain the system will be using the results to make decisions for improvement
Continuous improvement of the system The system must be piloted & tested The indicators, means of verification and supporting system elements can be refined and improved over time Ensure quality control of the system
Challenges in developing M&E systems: People do not participate in M&E development, which may cause some resistance & misunderstanding People are reluctant to engage in an M&E system People do not understand the M&E system No data, or incorrect data, gathered by management or implementers Data fed back to management for planning is not taken seriously
Hints for a successful M&E system The organisation must have conceptualised the project properly before a monitoring and evaluation system can be developed. Keep in mind that sometimes there can be more than one M&E system per organisation, e.g. implementing organisations and multi-partner arrangements. It is important to stay focused on what information needs to be gathered for the M&E system, as opposed to what could be interesting to have.
Hints for a successful M&E system When developing an M&E system, ensure that the system is manageable and data can be collected Ensure that the organisation has the capacity to analyse the data The organisation must have mechanisms to feed results from M&E back to management
Hints for a successful M&E system The organisation must make decisions on what they want to do with the M&E reports. How do they ensure that the issues arising are taken up and used to improve programme implementation? The analysed data and results deriving from the M&E system need to be disseminated to the project implementers The design of an M&E system often reveals organisational development problems
Hints for a successful M&E system It is crucial that there is buy-in to the process. M&E tools require a lot of work and can often feel imposed. Therefore it is important that the right people participate in the design workshop Ensure that the organisation does not develop too many tools and indicators The people applying the monitoring and evaluation system must be trained in the use of the tools
Hints for a successful M&E system The organisation should budget (time, money and human resources) for monitoring and evaluation. (How much?) If you are working as a consultant, build capacity and transfer skills to someone in the organisation to carry it forward.