- Number of slides: 80
Training Session Sponsored by the Association of Government Accountants
A Systems Approach to Implementing Performance-Based Management and Budgeting
Audio Conference, March 21, 2012 1
Today’s Presenters
• Stephen L. Morgan, President, EGAPP, Inc., and former Austin City Auditor, can be reached at egappmorgan@yahoo.com
• Sam McCall, City Auditor, Tallahassee, Florida, and former Deputy State Auditor of Florida, can be reached at Sam.McCall@talgov.com 2
A Systems Approach to Performance-Based Management and Budgeting
I. Introduction – Performance Accountability System
II. Historical Overview – Where We Have Been in Austin and Beyond
III. Performance Planning
IV. Performance Budgeting
V. Performance Measurement and Reporting
VI. Performance-Based Decision Making
VII. Conclusion – What We Have Learned and Where We Are Going 3
I.1. Introduction: Government Performance Accountability System
PLAN – Strategic & Annual Planning
DO – Performance Budgeting
CHECK – Performance Measurement & Reporting
ACT – Performance-Based Decision Making 4
I.2. Managing for Results Framework – City of Austin
Cycle: Business Planning → Performance Budgeting → Performance Measurement & Reporting → Performance-Based Decision-Making
Participants: Citizens, Council, Managers, Employees
Elements: Program/Activity Objectives; Organizational and Individual Performance Measures; Individual SSPR Evaluations; Structural Alignment; Performance Targets; Accounting System; Organizational Performance Assessment; Performance and Measurement Audits 5
I.3. Characteristics of a Successful System
• Use existing data whenever possible
• Find a balance between too few and too many measures
• Audit the data regularly
• Modify measures when necessary
• Centrally located staff to analyze data and coordinate the system elements
• Technological infrastructure to support the system 6
I.3. Characteristics of a Successful System (continued)
• Data forms should have space for explanatory information and detail
• Tie measures to budgetary allocations and the reward system
• Support of top management
• Over the long run, should affect the bottom-line performance of the organization
• Citizens will be better informed and more participative 7
II.1. Where We’ve Been (in the City of Austin)…
• 1992 – Council Resolution on Performance Measurement and Reporting
• 1994 – First Performance Measurement & Reporting System Audit
• 1996 – Second Performance Measurement and Reporting System Audit; Program Budgeting implemented
• 1998 – Third Performance Measurement and Reporting System Audit 8
II.2. Where We’ve Been…
• 1998 – Corporate Managing for Results Initiative Defined
  - Simplify our System
  - Clarify the Information We Provide
  - Develop Measures that are Meaningful to our Employees
  - Focus on Cost
• 1999 – Corporate Partnership Implements CMO Initiative
  - Developed a Standard Manual – The Resource Guide
  - Trained over 200 Managers
  - Developed a Single Accounting System
  - Identified Key Performance Measures for Executive SSPRs
  - Corporate Review Team 9
II.3. Where We’ve Been…
• 2002 – Fourth Audit of the Performance Management System
  - Ongoing Integrated System
  - Information Used for Operational Management
  - Measures Are Relevant and Reliable
  - Budgets Are More Data- and Results-Driven
• 2003-2008 – Continuous Improvement
  - Managers and Supervisors Fully Trained
  - Performance Measures Supported by More Robust Technology
  - Improvements Made to City’s Website and Stakeholder Access to Performance Information
  - Citizen and Employee Surveys Provide Data for Selected Performance Measures 10
II.3. Where We Are Now…
• 2008-Current
  - Website Robust, with Capacity to “Drill Down” and “Search” through the Performance Measures Database
  - “Managing for Results” Used as the Business Planning and Performance Monitoring Model for More than a Decade – Now Part of City Culture
  - Performance Report on the Website Tracks 115 Key Departmental Measures; of these, 21 Are Designated Citywide Key or “Dashboard” Measures
  - Performance Comparisons Presented in Graphics, with Goals/Targets and Measures Tracked Over Five Years
  - Performance Report for 2009-2010 Received a “Certificate of Excellence” from ICMA in October 2010
  - Annual Citizen Surveys Strengthened to Include Focus Groups and Presentations to City Council
  - A “Best Practice Citizen-Centric” External Performance Accountability Report Is Still Needed 11
II.4. Beyond Austin – Federal Government Performance Management Continues to Evolve
• Government Performance and Results Act of 1993
• Executive Order 13450, Improving Government Program Performance (Nov. 13, 2007)
• OMB 10-24: Performance Improvement Guidance under GPRA for 2011-2012
• GPRA Modernization Act of 2010 (signed Jan. 4, 2011)
II.4. Federal Agencies with Well-Developed Performance Management Systems
• Social Security Administration
• Department of the Interior
• Government Accountability Office
• Nuclear Regulatory Commission
• Office of Personnel Management 13
Some Local, State, & Provincial Governments Have Established Performance Management Mandates
State and Local Governments with Well-Developed Performance Management
• States of Florida, Washington, Texas, Missouri, and Oregon (some recognized for individual state departments that are mature and excel in developing and applying performance management systems)
• Local governments include Austin, King County, Phoenix, Bellevue, Charlotte, Portland, Palo Alto, and Tallahassee
• Auditors have played key roles in many performance measurement and management initiatives
III. Performance Planning
III.1. Establishing programs, activities, and potential performance expectations
III.2. Developing annual business/performance plans with performance expectations and measures
III.3. Reviewing business/performance plans to support improvement and accountability 16
III.1. Service Delivery System (Program Model)
Input → Process → Output → Intermediate Outcome → Long-term Outcome → Community Impact
Other Contributing Factors also influence outcomes.
III.1. Service Delivery System – Cause/Effect Relationships
Inputs, Processes, Outputs = Service Efforts
Outcomes = Service Accomplishments
Financial Inputs / Outputs = Unit Cost
Physical Inputs / Outputs = Productivity
Inputs / Outcomes = Cost-Benefit and Cost-Effectiveness 18
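The ratios on this slide are simple divisions, so they can be sketched in a few lines of code. The figures below are hypothetical, invented purely to illustrate the arithmetic; only the ratio definitions come from the slide.

```python
# Hypothetical figures for one activity; the ratio definitions follow the slide.
financial_inputs = 250_000.0    # dollars spent (financial input)
physical_inputs = 5.0           # FTE inspectors (physical input)
outputs = 1_000                 # inspections completed (output)
outcomes = 40                   # confirmed hazards corrected (outcome)

unit_cost = financial_inputs / outputs            # financial inputs / outputs
productivity = outputs / physical_inputs          # outputs / physical inputs
cost_effectiveness = financial_inputs / outcomes  # inputs / outcomes

print(f"Unit cost: ${unit_cost:.2f} per inspection")
print(f"Productivity: {productivity:.0f} inspections per inspector")
print(f"Cost-effectiveness: ${cost_effectiveness:.2f} per hazard corrected")
```

Note that the same service efforts divided by outputs versus outcomes give very different numbers, which is why the deck treats unit cost and cost-effectiveness as distinct measures.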
Government Performance Expectations
MISSION → PERFORMANCE OBJECTIVES/GOALS
• INPUT – Economy & Sufficiency: Financial (amount, timing); Physical (quantity, quality, timing, price); Capacity vs. demand
• PROCESS – Efficiency: Productivity; Unit costs; Operating ratios
• OUTPUT – Effectiveness: Quantity; Quality (products, delivery); Timeliness; Price or cost
• OUTCOME – Effectiveness: Mission & Outcome Goal Achievement; Financial Viability; Cost-Benefit; Cost-Effectiveness 19
III.1. Service Delivery System: Auditing Program
Audit Program or Activity
• Inputs: Staff; Funding; Equipment; Facilities/Rent
• Processes: Audit process (survey, fieldwork, and reporting)
• Outputs: Reports; Briefings; Presentations
• Outcomes: Qualitative – policy/system/management improvements; Quantitative – cost savings/revenue enhancement; Preventive – deterrence/detection 20
III.1. Program/Activity Mapping Template
Inputs | Process | Outputs (Services Delivered) | Outcomes (Results) 21
III.1. Austin’s Definition of Programs
• Activity = Input → Process → Output → Outcome
• Program = group of activities with a common purpose
Example: the Audit Program consists of four activities:
• Performance Audits
• Investigations
• Consulting and Assistance
• Quick Response 22
III.2. Overview of the Development of Business Plans
• Services with a common purpose are grouped into an Activity (Activity A, B, C…), each with an Activity Objective.
• Each Activity carries a family of performance measures for its Key Result: Result, Output, Efficiency, and Demand measures.
• Activities with a common purpose are grouped into a Program; the Program Objective is the accomplishment of its Key Results.
• Programs roll up to the organization’s Goals and Mission.
• Environmental scans and change dynamics inform the plan at every level. 23
III.2. Business Plan Alignment Worksheet with Definitions
ALIGNMENT WORKSHEET BY ACTIVITY – Business Plan Element / Results
City of Austin Vision: We want Austin to be the most livable community in the country.
Vision (optional): Describes the desired future state or set of circumstances.
Mission: Comprehensive statement of the Department’s purpose. Identifies the Department’s primary customers and the products or services that are provided.
Goals: A broad statement describing the desired outcome for an organization or its programs. Defines the significant results to be achieved over the next 2-5 years.
Program: Two or more activities grouped together by a common purpose.
Program Objective: Clear statement of the objective of the program.
Program Results Measure(s): Key results this program is expected to achieve.
Activity: A set of services with a common purpose that produce outputs and results for customers.
Activity Objective: Clear statement of the purpose of the activity.
Services that comprise the Activity: A service is a set of actions that produce a product, output, or result directly with or for customers.
Activity Performance Measures:
- Results: The impact that an activity has on customers/citizens
- Efficiency: Unit cost of an output
- Demand: The amount of services requested or expected by customers of the activity
- Output: Units of services provided, products provided, or people served through the activity
Responsible Employee: Department Executive/Manager responsible for the Activity 24
III.2. Sample Business Plan Alignment Worksheet
City of Austin Vision: We want Austin to be the most livable community in the country.
Vision (optional): Our community will be the healthiest in the nation.
Mission: The purpose of the Austin/Travis County HHSD is to work in partnership with the community to promote health, safety, and well-being.
Goals: The overall goal of the Austin/Travis County HHSD is to promote a healthy community which reflects social equity. This overall goal will be achieved through: 1) minimizing the public’s exposure to health and environmental hazards.
Program: Environmental Health Services
Program Objective: The purpose of Environmental Health Services is to provide protection and enforcement services to the public in order to minimize environmental health hazards.
Program Results Measure(s):
• Average response time to complaints/requests
• Confirmed cases of food-borne illness
• Percent of customers satisfied with complaint/request processing
Activity: Health and Safety Code Compliance
Activity Objective: The purpose of Health and Safety Code Compliance is to provide inspections, investigations, consultations, and training for the public in order to minimize public exposure to food-borne illness and other environmental health hazards.
Services that comprise the Activity:
• Inspection services
• Investigation services
• Training for food operation employees
Activity Performance Measures:
• Results: Confirmed cases of food-borne illness
• Efficiency: Cost per food establishment permit; Average inspections/investigations per inspector; Cost per food manager trained
• Output: Number of complaints/requests completed; Number of food establishment and mobile food vendor inspections; Number of temporary food inspections
Responsible Employee: Donald Smith 25
III.3. Reviews: Corporate Improvement and Accountability
• Review Team: Budget Office, Organizational Development, City Management
• Structure: Does it provide for alignment of results? Does it permit illumination of results and cost information in a manner useful to decision makers?
• Results: Do objectives and measures match? Was the template used for best impact?
• Measurability: Are goals measurable? Are program and activity measures useful? 26
III.3. Plans: Consistent Process & Product
• Program and Activity Objectives: MFR Template
  “The purpose of ________ is to provide ________ to ________ so they can ________.”
• Performance Measures: A Family of Measures
  - Result measure, then:
  - Outputs: How many?
  - Efficiency: At what cost?
  - Anticipated Demand 27
III.3. Activity Objective Statement (example)
The purpose of Combat Operations (program) is to provide/produce emergency incident response (service or product) to anyone in the service area (customer) in order to save lives and minimize property damage (planned benefit). 28
III.3. Performance Measures (example)
• Result: Number of fire deaths per capita; Percent of fires confined to the room or area of origin after arrival of AFD (per census tract)
• Efficiency: Average cost per call
• Output: Number of calls (call volume)
• Demand: Number of fire alarms (calls) expected 29
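A deck like this treats the four measure types as a required "family" for every activity, which makes the structure easy to check mechanically. The sketch below encodes the Combat Operations example as a plain dictionary and flags any missing measure type; the dictionary layout is my own, not an Austin data format.

```python
# Hypothetical encoding of the "family of measures" for one activity.
combat_operations = {
    "objective": "save lives and minimize property damage",
    "measures": {
        "result": ["Number of fire deaths per capita",
                   "Percent of fires confined to the room or area of origin"],
        "efficiency": ["Average cost per call"],
        "output": ["Number of calls (call volume)"],
        "demand": ["Number of fire alarms (calls) expected"],
    },
}

def missing_measure_types(activity):
    """Return the measure types the activity's family lacks, sorted."""
    required = {"result", "efficiency", "output", "demand"}
    present = {t for t, measures in activity["measures"].items() if measures}
    return sorted(required - present)

print(missing_measure_types(combat_operations))  # → []
```

A business-plan review (slide III.3 above) could run such a check across every activity to find families that stop at outputs and never state a result measure.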
IV. Performance Budgeting
IV.1. Link annual plans and budgets
IV.2. Establish targets
IV.3. Collect cost accounting information 30
IV.1. Link Annual Performance Plans and Budgets
• Ensure clear linkage between the plan’s programs and the budget’s programs
• Ensure congruence between the plan’s goals, objectives, and targets and the budget’s goals, objectives, and targets 31
IV.1. The Budget – Linking Results, $$$, and People
• In the Budget Document
  - Business/Performance Plan
  - Activity and Program Pages
  - Performance Measures: definitions, etc.
• Using the Performance Budget to “Tell Your Story”
  - Changing the Conversation
  - This Result… At This Cost 32
IV.2. Establish Targets
• Targets for each program and activity measure
• Sources of criteria for setting targets:
  - Historical trends and baselines
  - Program requirements or intent
  - Customer expectations or demands
  - Industry or sector standards
  - Benchmarking within the organization
  - Benchmarking outside the organization 33
Sources of Performance Expectations
- The process for identifying expectations and setting targets should be rigorous.
- All sources have pros and cons, so all should be considered when setting targets.
www.AuditorRoles.org 34
IV.2. Examples of Performance Targets and Measures
• Input – Economy
  Target: In FY 2011, decrease the purchasing office’s personnel allocation by five positions.
  Measure: Number of purchasing positions deleted in FY 2011.
• Process – Efficiency
  Target: In FY 2011, provide vehicle preventive maintenance services at a unit cost of $500 or less per vehicle serviced.
  Measure: Average vehicle maintenance unit costs in FY 2011.
• Output – Quality (accuracy)
  Target: In FY 2011, reduce the restaurant critical inspection error rate by 10 percent.
  Measure: Percentage reduction in the restaurant critical inspection error rate in FY 2011.
• Output – Quantity
  Target: In FY 2011, expand “green energy” electrical services to 1,000 additional homes and businesses.
  Measure: Number of additional homes and businesses receiving “green energy” in FY 2011.
• Output – Timeliness
  Target: In FY 2011, all Level 1 emergency calls will be responded to with a unit on site within six minutes.
  Measure: Response times (range) to Level 1 emergency calls in FY 2011.
• Outcome – Effectiveness/customer satisfaction
  Target: In FY 2011, increase the convention center customer satisfaction rate from 4.5 to 4.7 on a 5.0 scale.
  Measure: Change in convention center customer satisfaction rate during FY 2011. 35
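Every target-measure pair above reduces to the same comparison: an actual value against a target, where "better" is sometimes higher (satisfaction) and sometimes lower (unit cost, response time). A minimal sketch of that comparison, using invented FY 2011 actuals for two of the targets:

```python
def assess(measure, target, actual, higher_is_better=True):
    """Compare an actual measure value against its target."""
    met = actual >= target if higher_is_better else actual <= target
    return {"measure": measure, "target": target, "actual": actual,
            "met": met, "variance": actual - target}

# Hypothetical FY 2011 results for two of the targets on this slide.
rows = [
    assess("Vehicle maintenance unit cost ($)", 500, 512, higher_is_better=False),
    assess("Level 1 response time (minutes)", 6.0, 5.4, higher_is_better=False),
]
for r in rows:
    status = "met" if r["met"] else "missed"
    print(f'{r["measure"]}: {status} (variance {r["variance"]:+.1f})')
```

Making the direction of "better" explicit per measure avoids the classic reporting error of treating a falling number as an improvement when the measure is one where higher is better.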
IV.3. Base program budgets on unit costs that support desired program outputs and outcomes as reflected in targets
• Activity-Based Costing (ABC)
• Identify Direct and Indirect Costs 36
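The slide's two bullets combine in practice: direct costs attach to each activity, indirect costs get allocated, and the total divided by outputs gives the unit cost fed into targets. The sketch below uses a direct-cost allocation basis and invented figures; real ABC implementations choose cost drivers per activity.

```python
# Hypothetical activity-based costing: spread an indirect cost pool across
# activities in proportion to direct cost, then compute unit cost per output.
activities = {
    "inspections":    {"direct": 400_000.0, "outputs": 8_000},
    "investigations": {"direct": 100_000.0, "outputs": 500},
}
indirect_pool = 125_000.0  # rent, overhead, support services (invented)

total_direct = sum(a["direct"] for a in activities.values())
for name, a in activities.items():
    allocated = indirect_pool * a["direct"] / total_direct
    a["full_cost"] = a["direct"] + allocated
    a["unit_cost"] = a["full_cost"] / a["outputs"]
    print(f"{name}: unit cost ${a['unit_cost']:.2f}")
```

Including the indirect share matters: budgeting on direct cost alone would understate the inspections unit cost by a fifth in this example.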
IV.3. Performance Budgeting
• Long-sought “ideal” of budgeting experts: performance-driven budgeting.
• Best-case reality: performance-informed budgeting.
www.AuditorRoles.org 37
V. Performance Measurement and Reporting
V.1. Individual Performance Appraisal
V.2. Organizational Performance Assessment and Reporting
V.3. Performance and Measurement Certification Audits 38
V.1. Establishing Accountability – Key Points of Business Plan Alignment/SSPR Integration
• Every employee in the organization contributes to the City Vision.
• Every employee in the department contributes to the Mission of the department.
• Every employee in the department contributes to at least one Business Plan Goal.
• The Alignment Worksheets show employees how the Services they provide support specific Activities, Programs, and Goals in the Business Plan.
• Performance Measures show citizens, the City Council, and employees how well we are doing.
• Every Business Plan Measure must be written into at least one employee’s SSPR.
• Every employee, including department executives, will have at least one Business Plan Measure in their SSPR. 39
V.1. Individual Performance Appraisal Alignment Worksheet
Business Plan side: Mission → Goals → Program (Program Objective) → Activity (Activity Objective) → Services that comprise the Activity → Activity Performance Measures (Results, Efficiency, Demand, Output).
Employee SSPR side: Description of Services, plus an Individual Performance Measure that is:
• the same as the Activity Performance Measure,
• part of the Activity Performance Measure, or
• a contributor to the Activity Performance Measure. 40
V.2. What Is a Performance Monitoring System?
• Management component: performance goals; programs and levels; performance indicators; intended uses
• Data component: data collection; data processing
• Analysis component: measurement of current performance levels; comparison of current performance with criteria (performance goals)
• Action component: decisions concerning goals, programs, and levels; decisions concerning monitoring and evaluation 41
V.2. Ensure Performance Measure Definitions/Formulas Are Established
Design the monitoring system to track and analyze the selected measures (efficiency, output, and outcome measures are essential). 42
V. 2. Ensure the Results of Performance Measures are Available for Analysis and Decision Making Design a reporting system that is easy to use, accessible to all interested parties, and enables management decisions. 43
V.2. Establish Performance Reporting “Best Practices”
Design reporting formats and decide frequency of reporting. Austin reports include:
• Quarterly Performance Reports
• Annual Performance Reports
• Community Scorecard 44
V.2. Use Performance Reports to Improve Performance
• Use performance reports to identify and direct analysis of program performance
• Use analysis to identify the causes of inadequate program performance and focus improvements on those causes
• Use performance reports to identify high-performance programs 45
V.2.1. City of Austin Performance Report
• Departmental Performance Measures
  - Total of 115 Measures Grouped into Public Safety, Community Services, Infrastructure, and Utilities/Enterprise Departments
  - Each performance graphic includes: Measure Description, Calculation Method, Results, Assessment of Results, Next Steps, and Contact for More Information 46
V.2.1. City of Austin Performance Report
• Decisions influenced by:
  - Stakeholder/citizen priority or demand
  - Stakeholder/citizen satisfaction
  - Results shown
  - City Council and Management priorities 47
V.2.2. Why the City of Tallahassee Supports Citizen-Centric Reporting
• Government officials have a responsibility to be good stewards, to spend monies provided wisely, and to report financial and performance information back to citizens on accomplishments and challenges.
• Citizen reporting has its history in Efficient Citizenship Theory:
  - Citizens are not the customers of government; they are the owners of the government.
  - Democratic government is best shaped by the choices of well-informed citizens. 48
V.2.2. Why Is the City of Tallahassee Issuing a Citizen-Centric Report?
• We have a responsibility to inform our citizens about:
  - What we are responsible for doing
  - Where the money comes from that runs the City, and where it goes
  - What we have accomplished with monies received and expended, and
  - What challenges face the City moving forward
• We believe informed citizens make for better government 49
V.2.2. Issuance of Citizen-Centric Reports and Media Coverage
• Four Citizen-Centric Reports have been issued
• Received front-page newspaper coverage
• Received television coverage
• Report page 3 data verified by the City Auditor
• Citizen groups have received reports and have been asked to provide audit topic suggestions
• Reports are available in hard copy and online 54
V.2.2. Distribution of Citizen-Centric Reports
• City Neighborhood Associations
• City Advisory Committees
• Tallahassee Regional Airport
• Community and Senior Centers
• Other governments, including the County, FSU, FAMU, and TCC
• League of Women Voters
• Leon County District Schools (Middle School Civics Program) 55
V.2.2. Purpose of the Citizen-Centric Report
• To Demonstrate: Transparency; Accountability
• To Promote: Dialog; Two-way communication
• To Build Trust: Over time; One citizen at a time 56
V.2.2. Building on the Citizen-Centric Report
• Performance Measurement Report
  - Issued August 2011; 83 pages
  - Includes all major departments
  - Report contents for each department: Mission statement, organization, and services provided; Goals and objectives; Key performance measures; Accomplishments and challenges; Expenditures by program and line item
  - Report includes tables, graphs, charts, and photos 57
V.2.2. Website and Contact Information
• Citizen-Centric Report: http://www.talgov.com/auditing/pdf/citizenreport2010.pdf
• Performance Measurement Report: http://www.talgov.com/auditing/
• sam.mccall@talgov.com, (850) 891-8397 59
V.3. Conduct Performance and Measurement Audits
• Audit departmental and program performance
• Audit relevance and reliability of performance measures 60
V.3.1. Auditing Government Performance
Measure or assess performance during an audit or other study, based on authoritative auditing standards. (See Austin, Florida OPPAGA, Kansas City, and Phoenix on www.AuditorRoles.org.)
• Identify the program’s inputs, processes, outputs, and outcomes
• Develop and implement an “ad hoc” performance measurement system
• Using performance expectations as “criteria” and measures as “condition,” analyze program performance
• Identify causes of variances and develop audit recommendations
www.AuditorRoles.org 61
Available at: www.theiia.org/bookstor
V.3.2. Self-Assess or Audit Performance Measures Using Asserted Criteria
• Relevance – Measures should be aligned, complete, and useful
• Reliability – Each measure and its data should be accurate, valid, and consistent
www.AuditorRoles.org 63
V.3.2. Test Relevance or Reliability – Assessing the Relevance of Performance Measures: Sample Criteria
Measures should be:
• Aligned – Linked to mission, goals, objectives
• Complete – Includes essential aspects of performance
• Useful – Timely; Understandable; Comparable; Responsive to change; Meets broad needs of users
www.AuditorRoles.org 64
V.3.2. Test Relevance or Reliability – Assessing the Reliability of Performance Measures & Data: Sample Criteria
Each measure and its data should be:
• Accurate – Computed correctly; Neither overstated nor understated; Appropriately precise
• Valid – Corresponds to the phenomena reported; Correctly defined; Data and calculation comply with definition; Unbiased
• Consistent – Consistent with previous periods; Controlled by adequate systems
www.AuditorRoles.org 65
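Some of these reliability criteria can be tested by script rather than by hand: accuracy by recomputing a reported rate from its source counts, validity by range-checking, and consistency by comparing against the prior period. The function below is a sketch of that idea; the tolerance and the 25% period-change threshold are invented, not audit standards.

```python
# Hypothetical reliability checks an auditor might script against reported data:
# accuracy (recompute), validity (range), consistency (vs. prior period).
def reliability_flags(reported_rate, numerator, denominator, prior_rate,
                      max_period_change=0.25):
    flags = []
    recomputed = numerator / denominator
    if abs(recomputed - reported_rate) > 1e-9:
        flags.append("accuracy: reported value does not match recomputation")
    if not (0.0 <= reported_rate <= 1.0):
        flags.append("validity: rate outside the possible range")
    if prior_rate and abs(reported_rate - prior_rate) / prior_rate > max_period_change:
        flags.append("consistency: large unexplained change from prior period")
    return flags

print(reliability_flags(0.82, 410, 500, prior_rate=0.80))  # → []
```

A flagged consistency check is not proof of error, only a prompt to seek the explanatory information the reporting slides call for.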
VI. Performance-Based Decision Making – Includes Stakeholders, Elected Officials, Managers, and Employees
VI.1. Using performance information to support decision making
VI.2. Examples of decision making 66
VI.1. Performance Information Used for Different Decisions
• Budgetary Decision Making
• Managing & Improving Operations
• Accountability Reporting 67
VI.1. Performance Information Supports Decisions to:
• Assess and adjust program performance, service levels, and resources
• Improve existing programs and services
• Improve internal management systems
• Revise performance plans and reports
• Initiate new programs and services
• Bottom line: use performance information to support continuous improvement and public accountability 68
VI.1. Strategic Performance Budget Decision Model
Strategic importance (high/low) crossed with performance results (poor/good):
• High importance, poor results: Target for Increased Funding
• High importance, good results: Strategic Success in Achieving Community or Program Outcomes
• Low importance, good results: Useful Contributor to Government Success
• Low importance, poor results: Target for Funding Cuts
Use of Strategic Goals and Performance Results in Prince William County Budget Decisions
www.AuditorRoles.org 69
VI.1. Strategic Performance Budget Decision Model – In Reality, Varies with the Fiscal Environment
The same quadrants (Target for Increased Funding; Strategic Success in Achieving Community or Program Outcomes; Useful Contributor to Government Success; Target for Funding Cuts) apply, but the boundary between funding increases and funding cuts shifts with how much funding is available.
Use of Strategic Goals and Performance Results in Prince William County Budget Decisions
www.AuditorRoles.org 70
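The quadrant logic of the Prince William County-style model is simple enough to state as code. This sketch is my reading of the slide's 2x2, with strategic importance and results reduced to booleans; a real budget process would score both dimensions on scales and let the fiscal environment move the thresholds.

```python
# A sketch of the 2x2 budget decision model: strategic importance
# crossed with performance results. The quadrant labels follow the slide.
def budget_quadrant(strategic_importance_high, results_good):
    if strategic_importance_high and results_good:
        return "Strategic success: achieving community or program outcomes"
    if strategic_importance_high and not results_good:
        return "Target for increased funding"
    if not strategic_importance_high and results_good:
        return "Useful contributor to government success"
    return "Target for funding cuts"

# A strategically important program with poor results is a candidate
# for investment, not automatic cuts.
print(budget_quadrant(True, False))
```

The counterintuitive cell is high importance with poor results: under this model, poor performance on a strategic priority argues for more resources to fix it, not less.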
VI.2. Output Effectiveness (Quantity) – Recreation Centers
[Chart: average number of participants per hour, 9 AM-10 PM, for the Northwest, Rosewood, Givens, and South Austin recreation centers] 71
VI.2. Health Clinics (Efficiency)
[Chart: adjusted total cost comparison among clinics for billable and non-billable encounters combined, against the FY 09 average and FY 10 target]
External costs: City/County corporate overhead (payroll, human resources, treasury, controller’s office, etc.), along with capital improvement projects, rent, depreciation, etc.
Internal costs: Physicians, nurses, medical supplies, pharmacy, etc. 72
VI.2. Examples of Performance-Based Decision Making (Inputs & Mission)
[Chart: index crime per 1,000 population and officers per 1,000 population in selected cities – Tucson, Portland, Phoenix, Nashville, Seattle, Oklahoma City, Sacramento, Charlotte, Dallas, Fort Worth, Austin, San Antonio, Houston, Las Vegas, El Paso, San Diego] 73
VII. Conclusion: Creating a Managing for Results Culture
• Supporting our Vision…
• Creating a Results Orientation: Services, Activities, and Programs
• Creating Accountability: Measures and Indicators
• Creating Integration: Making It Happen at the Operational Level 74
VII. Lessons Learned
• A “bottom-up” approach neglected broad performance areas and alignment (department key indicators)
• The accounting structure is a major hurdle
• The definition of “services” was not clear
• A results orientation is difficult when the template is not used effectively
• Poor use of the template = poor measures
• The “not something I control” syndrome 75
VII. Where We Are Going…
• Assessing and Improving the Reliability of Reported Measures
  - Data Collection Infrastructure
  - Certification Program
• Providing Further Training
  - Using Information in Management
  - Using Information in Operations
• Reinforcing the Cultural Shift at the Operational Level
• Passing and Implementing a “Best Practice” Performance Accountability Ordinance 76
VII. Performance Measurement and Accountability: Best Practices Checklist
• Obtain active participation by top-level managers and decision makers
• Create a clear vision of why and how performance measures will be used internally and externally
• Understand the limits of performance measures: what they can and cannot do
• Sustain organizational commitment over a long period, despite barriers and the potential for bad news
• Integrate the performance measurement and reporting system with organizational planning, service delivery, and decision-making systems
• Through planning, align mission, goals, objectives/targets, and measures
• Design goals and objectives/targets that specify a single aspect of performance
• Design aggressive yet realistic goals and objectives/targets that encourage progress beyond past performance levels
• Involve employees, customers, and stakeholders in developing goals, objectives/targets, and measures 77
VII. Performance Measurement and Accountability: Best Practices Checklist (Continued)
• Identify all programs which will be measured and define them through an input → process/activity → output → outcome model
• Design a “family of measures” for each program which provides key information to support decisions
• Periodically evaluate current performance measures; change them when needed, but try for comparability over time
• Define each measure and identify data sources and data collection procedures
• Produce performance information (including explanatory information) which is clear and useful to management, employees, customers, and stakeholders
• Educate, encourage, and reward managers for using performance information to make decisions which improve program management and service delivery 78
Training/Assistance to Get There
• EGAPP, Inc. provides training in all aspects of performance management and auditing. (Brochure available.)
• The Auditor Roles Project provides training in assessing/auditing performance management systems and measures. Assistance can also be arranged.
Email: egappmorgan@yahoo.com 79
Thank You
• More questions?
• More comments?
• Thank you, again. 80