Building Capacity for effective government wide Monitoring and Evaluation ♦ Dr Mark Orkin and Mr Oliver Seale ♦ SAMEA Conference ♦ Friday, 30 March 2007
Presentation Structure
♦ Strategic Objectives and Outputs (Mark Orkin)
♦ Transformation to the Academy (Mark Orkin)
♦ Capacity building for Monitoring and Evaluation (Oliver Seale)

Objectives
♦ To provide an overview of SAMDI’s current focus areas and plot its future course.
♦ To explore and engage with SAMDI’s capacity building mandate for government-wide monitoring and evaluation.
Strategic Objectives and Outputs
Governance & Administration (G&A) Cluster Priorities
♦ Good Governance: Anti-corruption, Gender and Disability, Batho Pele and Public Participation programmes.
♦ Capacity of the State: Local Government Strategic Agenda, Skills Assessment and Capacity Building programmes.
♦ Macro-organisation of the State: Single Public Service, Integrated Service Delivery and E-Government Service Delivery projects.
♦ Transversal Systems: Integration of Planning and the Government-wide Monitoring and Evaluation System.
Strategic Objectives and Outputs
SAMDI 2007/08 to 2009/10
♦ Develop and administer a training framework for curricula and materials.
♦ Co-ordinate the provision of executive development programmes for the Senior Management Service.
♦ Develop and implement a quality management and monitoring system.
♦ Capacitate departments to identify their human resource development needs.
♦ Establish and maintain partnerships and links with national and international institutes and training providers.
♦ Arrange customised training programmes to support foreign policy on the African Union (AU) and the New Partnership for Africa’s Development (NEPAD).
Strategic Objectives and Outputs
Training Statistics: 1 Apr. 2006 – 28 Feb. 2007
Strategic Objectives and Outputs
PTDs* delivered in Provinces (to end-Jan. 2007)
[Charts: Distribution of Government Employees per Province; Distribution of PTDs per Province]
* PTDs = Person Training Days
Strategic Objectives and Outputs
Focus areas for 2007/08 and beyond
♦ Support JIPSA (Joint Initiative on Priority Skills Acquisition) policy formulation and training.
♦ Incubate AMDIN (African Management Development Institutes’ Network) and the DRC (Democratic Republic of Congo).
♦ Contribute to ASGI-SA (Accelerated and Shared Growth Initiative for South Africa) through concentrated public sector human resource development activities and operations.
♦ Transformation to the Academy – see detail slides.
♦ Capacity building for Monitoring and Evaluation – see detail slides.
Transformation to the Academy
SAMDI: Need for a paradigm shift
♦ R1 billion p.a. is spent in departments, yet 43% of staff in provincial departments reported no training in 2006.
♦ International benchmarks suggest at least 5 days of training per annum:
  v For approx. 250,000 middle and junior managers, this requires 1.25 million PTDs p.a.;
  v Allowing for the 60% of training already occurring in departments still requires 0.5 million PTDs p.a.
♦ For induction, staff turnover is 120,000 people p.a., requiring another 0.2 million PTDs.
♦ Thus, the total demand-driven requirement is 0.7 million PTDs: nearly 10 times SAMDI’s present output!
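The demand arithmetic on this slide can be checked with a short calculation. This is only an illustrative sketch using the slide’s own figures (250,000 managers, a 5-day benchmark, a 60% discount for training already occurring, and roughly 0.2 million induction PTDs for 120,000 new entrants):

```python
# Demand-driven PTD (Person Training Day) estimate, using the slide's figures.
managers = 250_000           # middle and junior managers
days_per_year = 5            # benchmark training days per person p.a.
benchmark_ptds = managers * days_per_year            # 1.25 million PTDs p.a.

already_provided_pct = 60    # share of training already occurring in departments
residual_ptds = benchmark_ptds * (100 - already_provided_pct) // 100  # 0.5 million

induction_ptds = 200_000     # ~0.2 million PTDs for 120,000 new entrants p.a.
total_ptds = residual_ptds + induction_ptds          # 0.7 million PTDs p.a.

print(f"Total demand: {total_ptds:,} PTDs p.a. (~{total_ptds / 1e6:.2f} million)")
# → Total demand: 700,000 PTDs p.a. (~0.70 million)
```

At roughly ten times SAMDI’s current output, this is the gap that motivates the shift from direct provision to facilitation and massification described on the next slide.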
Transformation to the Academy
Vision and activities
♦ Three “mantras”:
  v Provision to facilitation.
  v Competition to collaboration.
  v Selective coverage to massification.
♦ First main stream of activity:
  v Executive development programmes for the SMS.
  v Entrant, lower and upper SMS: programmes, courses and events.
  v In collaboration with universities and counterparts.
♦ Second main stream of activity:
  v “Massified” management training for junior and middle managers.
  v Training frameworks of curricula and materials, in conjunction with provincial academies and the DPLG;
  v Monitoring and evaluation to regulate providers;
  v The induction programme for new entrants at all levels.
Transformation to the Academy
2006/07 SAMDI outputs: basis of new approach
MDT - Management Development Training; ELD - Executive Leadership Development; HRDT - Human Resource Development & Training; SCM - Supply Chain Management; HRMT - Human Resource Management Training; FPMT - Finance & Project Management Training; SDT - Service Delivery Training; ID - Institutional Development
Transformation to the Academy
ENE training spend in national departments
Transformation to the Academy Induction Other Pensions Immigration Other Information Supply Chain Human Res. Finance Other Information Culture Projects People Performance levels Finance Learning framework: tentative harmonised modules Senior Middle Junior Supervisor Generic competencies Functional competencies Sectoral competencies 12
Transformation to the Academy
Projects for Internal Task Teams
1. Audits of junior and middle management courses
2. Planning and implementing the training framework
3. Enhancing the monitoring and evaluation system
4. Streamlining accreditation processes
5. Planning for the massified induction programme
6. Preparing for service provider mobilisation
7. Planning executive programmes at SMS level
8. Recasting delegations and policies
9. Scoping an operational system for outsourced training
10. Conceiving a knowledge management system
Transformation to the Academy
Recap and way forward
♦ Executive development programmes.
♦ Learning framework for massified middle and junior management learning.
♦ Curricula and materials development, quality assurance and accreditation.
♦ Provider and user relations; M&E of large-scale provision.
♦ Provincial infrastructure.
♦ Research capacity and networking.
♦ Continental support for Management Development Institutes; international relations.
♦ Impending restructuring process.
Capacity building for Monitoring & Evaluation
Background
♦ Aims and objectives
  v The aim of the system is to contribute to improved governance and enhanced effectiveness of public sector institutions.
  v The system aims to collect, collate, analyse and disseminate information on the progress and impact of programmes.
♦ Result areas
  v Accurate information on progress in the implementation of public sector programmes is updated on an ongoing basis;
  v Information on the outcomes and impact achieved by government is periodically collected and presented;
  v The quality of monitoring and evaluation (M&E) practices in government and public bodies is continuously improved.
Capacity building for Monitoring & Evaluation
Provincial needs analysis: example of feedback (A)
What are the implications for training?
1. Little coherent or articulated strategy in provinces, despite expenditure on expensive systems to collate M&E data.
  i. What would a coherent strategy need to contain?
  ii. What is an articulated strategy? What type of links are we looking for?
  iii. What systems are there? What do we mean by a system?
  iv. What data are there? How were they obtained? What is their quality?
  v. What collation is taking place? How? Can a system collate data?
Capacity building for Monitoring & Evaluation
Provincial needs analysis: example of feedback (B)
What are the implications for training?
2. Monitoring programmes are just about collecting data; very little analysis and feedback is given.
  i. What data are being collected, why, and how?
3. Alignment of plans doesn’t exist.
  i. What planning does take place, and how?
  ii. How is M&E incorporated into planning?
  iii. What do we mean by alignment, and why do we need it?
  iv. Is alignment always possible and necessary?
Capacity building for Monitoring & Evaluation
Provincial needs analysis: example of feedback (C)
What are the implications for training?
4. Planning without indicators.
  i. What type of indicators do we mean, and how should they be developed?
  ii. What indicators do exist, and how are they measured?
  iii. How are they decided on?
5. Lack of, or poor, baseline data.
  i. What baseline data do exist, and do we evaluate them?
  ii. What type of baseline data are required?
  iii. How are these presently obtained?
Capacity building for Monitoring & Evaluation
Conceptual framework for training

Planning
1. What will be done (strategy)
2. Why will it be done (policy)
3. How will it be done (operations)
4. Indicators and criteria (how to measure)
5. When (timeframes)

Existing situation / new project or programme
1. Description
2. Existing databases
3. Data collection methods
4. Baseline data

Monitoring
1. System to be used (MIS)
2. Indicators
3. Methods
4. Baseline data
5. Inputs
6. Tracking (i. processes; ii. activities)
7. Interventions and modifications
8. Outputs
9. Outcomes

Evaluation
1. System to be used (EIS)
2. Indicators
3. Methods
4. Baseline data
5. Criteria
6. Assessment
7. Process
8. Impact
9. Lessons learned
10. Feedback
Capacity building for Monitoring & Evaluation
Training principles for various levels
1. Basic, for general users of information: understanding the basic principles of M&E.
2. Basic, for project managers: applying the principles to a specific project.
3. Intermediate, for programme managers: applying the principles to a programme.
4. Advanced, for executive managers: applying the principles to overall management in departments.
5. Advanced, for CFOs and DDGs: applying the principles across departments/provinces.
6. Specialist technical training, for M&E staff: actually performing evaluations.
Capacity building for Monitoring & Evaluation
Target Audiences
1. Users
  i. Political heads and parliamentarians (incorporated into report-backs to portfolio committees)
  ii. Accounting officers (DGs)
  iii. Executive managers and managers in government departments
  iv. Users of the service or of the information outside government
2. Producers
  i. Programme managers
  ii. Project managers
  iii. Operations staff
  iv. Participants
3. M&E staff in national and provincial departments
Capacity building for Monitoring & Evaluation
Examples of current provision

Institution: University of Stellenbosch
Programme: Diploma in Monitoring and Evaluation Methods
Duration: 1 year | Level: Postgraduate
Content:
♦ General principles & paradigms
♦ Clarificatory evaluation
♦ Process evaluation & programme monitoring
♦ Data collection methods
♦ Statistical and qualitative methods
♦ Impact assessment designs

Institution: University of Cape Town
Programme: Workshop on Monitoring and Evaluation
Duration: 1 week | Level: All
Content:
♦ General principles
♦ Measuring public projects

Institution: Regenesys
Programme: Short course in Monitoring and Evaluation
Duration: 3 days | Level: NQF 4
Content:
♦ Monitoring and evaluation concepts
♦ Results-based management
♦ Concepts of outcomes
♦ Project objectives and indicators
♦ Monitoring and evaluation systems
♦ Team performance improvements
♦ Performance standards
♦ Risks and the impact thereof
♦ Success and failure factors
♦ Project evaluation reports
Capacity building for Monitoring & Evaluation
Strategy and Plan of Action
♦ Progress report
  v Terms of reference developed for the Task Team.
  v 15 workshops on M&E for programme and project management (340 officials).
  v Initial needs analysis of provincial M&E capacity.
  v Consultation with key internal and external stakeholders.
♦ Plan of action
  v Research on M&E training needs for SMS, MMS, JMS and practitioners (March ’07).
  v Determine current providers of M&E training: HEIs, private providers, NGOs etc. (March ’07).
  v Undertake a training needs analysis for M&E (May ’07).
  v Develop the M&E training programme (Sept. ’07); roll out (Nov. ’07).
Thank you
Siyabonga ♦ Rolivhuwa ♦ Dankie ♦ Nakhensa ♦ Re a leboga