

  • Number of slides: 46

The Requirement to Report Accurate Data: ETA’s Data Validation Policy for Employment and Training Programs

Overview
• Background
• ETA’s Data Validation Policy
• The Need for Quality Control
• Creating a Workable Data Management Strategy
• Best Practices
• WIA Validation Results
• Questions
• Web Sources for Additional Information

Background: Policy alone does not ensure quality data

Policy alone does not ensure quality data
• Monitoring
• Active program management
• Policies and Information Systems that support data integrity

Background
• GPRA Section 1115(a)(6) requires Federal agency performance plans to “describe the means to be used to verify and validate measured values”
• “WIA Performance Outcomes Reporting Oversight,” OIG Report No. 06-02-006-03390, September 2002

Background
• “Data Validation Policy for Employment and Training Programs,” TEGL No. 3-03, Change 3, July 2005
• GAO Report to Congress, “Labor and States Have Taken Actions to Improve Data Quality, but Additional Steps Are Needed,” GAO-06-82, November 2005

ETA’s Data Validation Policy

Validation: “Why Should We Do It?”
• Data validation is required by OIG and is now being reviewed by GAO
• Data validation is a key component in the overall performance strategy
• Program funding is being directly tied to reliable performance outcomes (performance budget integration)
• Data validation is integrated into reporting

1. Performance Objectives. We begin with policy regarding performance objectives. The starting point is generally federal legislation. What do we want to know? What information do we need?
2. Specifications for Reports and Data Elements. The objectives must be translated into detailed specifications outlining the operational parameters for measurement. “Data element specifications” define the fields and values required to calculate the reports, including key dates, services received, and wages earned after exit. “Report specifications” define how the fields and values are used to calculate outcomes, including which participant records are included in each numerator and denominator.
3. Standardized Software. Once we have specifications, we have to create standardized software that applies edits to the participant files and performs all of the required calculations.
4. Validation of Calculations and Data. We have to ensure grantees are not just performing calculations in accordance with required specifications (report validation), but that the data elements used in the calculations are accurate as well (data element validation). If grantees use the standardized software to calculate their report, they do not need to perform report validation. Data element validation requires checking data against source documentation.
5. Validation Reports. Summary and analytical reports from grantees indicate the level of error in the performance outcomes and data elements. This is necessary both as feedback to states and to help interpret the outcomes. (Note that error levels have not yet been established.)
6. Performance Outcomes. Performance outcomes conveying program “success,” based on increasingly more valid data, are reported to stakeholders at various levels.
The Data Reporting and Validation System (DRVS) is central to the reporting process and not just an 'after the fact' validation of reports.

ETA’s Data Validation Policy
• Grantees are required to annually validate data submitted to ETA to ensure accuracy
• Two types of validation are required:
  – Report validation (required before submitting an annual report)
  – Data element validation (required within 120 days after submitting individual records)
• Failure to validate reported data is deemed “failure to report”

ETA’s Data Validation Policy
• Policy covers six ETA employment and training programs
• Policy was rolled out in phases:
  – 1st phase/2nd phase (1st & 2nd years): detecting and resolving issues with data and reporting systems, and compiling error rates analyzed to set accuracy standards for the 3rd year (finished)
  – 3rd phase (3rd year): accuracy standards are applied to data reported by grantees (being completed now)
• Data not meeting accuracy standards will be considered unacceptable for measuring performance

ETA’s Data Validation Policy
• Grantees will be held accountable for meeting accuracy standards
  – With technical assistance from ETA, grantees should have developed and implemented corrective action plans
• Significant or unresolved deviation from accuracy standards may be deemed “failure to report”
• Grantees are encouraged to use ETA-developed handbooks, software, and guides to validate reported data
• ETA will monitor State validation efforts

ETA’s Support of the Effort
• Validation tools are evolving to meet state needs
• Policies are changing and becoming clearer based on state responses
• Technical support to State staff:
  – Regional office
  – National office
  – Mathematica staff

State and Local Roles and Responsibilities
• States had varying experiences in identifying validation assignments
  – Some states had no problem
  – Some states took time to sort through the roles of different units
  – Some states still have not clarified assignments (particularly for TAA)
• Organization of case files at local areas was often not standardized or adequate

States’ Experiences with Data Validation
• States had to determine which staff would be responsible for data validation
• States had to communicate expectations and requirements to local areas

States’ Experiences with Data Validation
• Mode of data element validation: onsite, centralized, or both
  – Onsite validation is the recommended mode
  – Only large states can ask Regional staff for an exemption to perform centralized validation activities; the request must be made in writing to the regional office before the process is started

Various Methods for Data Element Validation
• Onsite validation is essential to preserve the integrity of the process
• It is ideal for state staff to perform validation onsite
  – Promotes communication and mutual understanding with locals

Various Methods for Data Element Validation
• In some cases, onsite validation is impractical:
  – Distances are too great
  – Small number of records
• With the approval of the regional office, states can therefore pursue a combination of onsite and remote validation if necessary

Detail for Reduction in Elements

Program           | # of Elements, PY 02 | # of Elements, PY 03
Adult             | 41                   | 26
Dislocated Worker | 44                   | 30
Older Youth       | 48                   | 35
Younger Youth     | 100                  | 76
Trade             | 35                   | 23
NFJP              | 17                   | 14
Total             | 285                  | 157

Schedule for Reporting of Validation Results
• WIA report validation was due October 1, 2005, prior to submitting the annual report
• WIA and TAA data element validation will be due February 1, 2006
• LX report validation was due August 15, 2005, with the report

The Need for Quality Control
"OH, OH—THE QUALITY-CONTROL INSPECTORS ARE HERE."

The Need for Quality Control
• ETA’s policy seeks only to assess the usability of reported data for federal purposes
• Grantees, and not ETA, determine the activities to be undertaken to ensure valid data are reported to ETA
  – ETA to offer best practices/TA
• Originally, few, if any, grantees had well-articulated data management strategies
  – Grantees should have started to formulate strategies to improve data quality

The Need for Quality Control
• Data management strategies for high-performing organizations typically address four factors:
  – Completeness: ensuring critical data elements contain needed information
  – Timeliness: ensuring data are collected and reported in a timely fashion
  – Validity: ensuring the reported information can be substantiated or confirmed
  – Reliability: ensuring the reported data are trustworthy for decision-making purposes
• Understanding the data flow is key to creating a workable data management strategy
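The first two factors above lend themselves to simple automated checks. A minimal sketch: the field names and the 30-day entry window are illustrative assumptions, not ETA requirements, since grantees define their own business rules.

```python
from datetime import date, timedelta

# Field names and the 30-day window are assumptions for illustration only;
# ETA policy leaves these business rules to the grantee.
CRITICAL_FIELDS = ["date_of_birth", "registration_date", "exit_date"]
ENTRY_WINDOW = timedelta(days=30)

def completeness_check(record: dict) -> list:
    """Completeness: list critical data elements that are missing information."""
    return [f for f in CRITICAL_FIELDS if not record.get(f)]

def timeliness_check(collected: date, entered: date) -> bool:
    """Timeliness: was the record entered within the allowed window after collection?"""
    return collected <= entered <= collected + ENTRY_WINDOW
```

For example, `completeness_check({"registration_date": date(2005, 7, 1)})` returns `["date_of_birth", "exit_date"]`, flagging the record for follow-up before it enters the reporting stream.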

The Need for Quality Control
Typical Data Flow: Source → Collect Data → Data Entry → Quality Checks → Report (ETA-required report validation) and Individual Records (ETA-required data element validation)

The Need for Quality Control
Reviewing Data Quality at the LWIA: State Policies and Procedures, MIS Manuals, Systems, Monitoring, Training

The Need for Quality Control
Cycle of Quality Management: Federal issuance of policy and guidance → State review of policy and issuance of guidance → Training of local-level staff → Initial data entry → Local review and monitoring → State review and monitoring of data → Federal monitoring → (cycle repeats)

Creating a Workable Data Management Strategy
• Data collection and data entry…
  – Grantees should develop guidance for staff and sub-grantees involved in the collection of data:
    § Definitions of data elements
    § Sources of information
    § Participant record and documentation requirements
    § Procedures for collecting, entering, and reporting data, and associated “business rules” that cover timeliness and completeness
    § Procedures for entering data into an automated database
    § Procedures for correcting data

Creating a Workable Data Management Strategy
• Data collection and data entry…
  – Grantees should provide routine training on the data management guidance
  – Grantees should require that all persons involved in the collection or entry of data be trained in the procedures
  – The data entry process should include steps for verifying entered data against original sources, either on a sample basis or for the entire population of records

Creating a Workable Data Management Strategy
• Grantees should conduct periodic quality checks…
  – Sometimes data are correctly transcribed from forms, but are still incorrect:
    • Recording a date of February 29, 2003, when there are only 28 days in the month
    • Recording a date of exit that occurs before the date of registration for the customer
  – Range check: reviewing recorded data to ensure legitimate values are captured
  – Logic check: reviewing recorded data to see if it makes sense
  – Edits are typically programmed into data entry screen applications
  – Logic checks typically involve ad hoc queries and manual file reviews
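Both kinds of edit check described above are straightforward to program into a data entry application. A minimal sketch (not ETA’s standardized software), using the two examples from the slide:

```python
from datetime import date

def range_check_date(year: int, month: int, day: int):
    """Range check: reject values that cannot form a legitimate date,
    such as February 29, 2003 in a 28-day month. Returns None on failure."""
    try:
        return date(year, month, day)
    except ValueError:
        return None

def logic_check_exit(registration: date, exit_date: date) -> bool:
    """Logic check: both dates may be individually valid, but an exit
    that occurs before registration does not make sense."""
    return exit_date >= registration

# Correctly transcribed data can still be wrong:
assert range_check_date(2003, 2, 29) is None                      # invalid date
assert not logic_check_exit(date(2003, 5, 1), date(2003, 4, 1))   # exit before registration
```

Range checks like this run at entry time; the logic check is the kind of rule that would otherwise surface only through ad hoc queries or manual file review.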

Creating a Workable Data Management Strategy
• Grantees should conduct periodic quality checks…
  – Grantees should evaluate data collection efforts by randomly observing interviews and reviewing other data collection methods to ensure procedures and instructions are followed properly
  – Grantees should assess the accuracy of data by verifying entered data against original sources on a sample basis or for the entire population of records
  – Grantees should link the data quality review process to management actions for continuous improvement

Best Practices

Best Practices
• Creating a unified record system for your local area
  – Indexing records
  – Training
  – Using State MIS/Policy Manuals
• Using labels/tables to reflect data elements
• Ensuring secondary quality checks are in place

Best Practices
• NC File System
  – Cohorts:
    • A = Adult
    • DW = Dislocated Workers
    • OY = Older Youth
    • YY = Younger Youth
  – Categories:
    • E = Eligibility/Intake/Application
    • A = Employment Activities
    • XP = Exit and Post-Program Activities

Best Practices
• An adult’s date of birth is labeled AE2
  – A = adult
  – E = eligibility/intake/application
  – 2 = the U.S. DOL’s reference number for date of birth
• A copy of a birth certificate in the file is labeled AE2
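The labeling scheme can be expressed as a small helper. The lookup tables come from the cohort and category codes in the NC file system as described on the earlier slide; the function itself is an illustrative sketch, not actual NC software.

```python
# Cohort and category codes from the NC file system; the helper
# function below is an illustrative sketch, not part of the NC system.
COHORTS = {"A": "Adult", "DW": "Dislocated Workers",
           "OY": "Older Youth", "YY": "Younger Youth"}
CATEGORIES = {"E": "Eligibility/Intake/Application",
              "A": "Employment Activities",
              "XP": "Exit and Post-Program Activities"}

def file_label(cohort: str, category: str, dol_ref: int) -> str:
    """Build a document label: cohort code + category code + DOL reference number."""
    if cohort not in COHORTS or category not in CATEGORIES:
        raise ValueError("unknown cohort or category code")
    return f"{cohort}{category}{dol_ref}"

# A birth certificate (DOL reference number 2) in an adult's eligibility file:
assert file_label("A", "E", 2) == "AE2"
```

Because every document in a case file carries such a label, a validator can locate source documentation for any data element without hunting through an unindexed folder.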

Best Practices

Field Ref # | Num | Data Element | Federal and State Sources | Match/Support¹
AE2 | 102 | Date of Birth | Baptismal Record; Birth Certificate; DD-214 | Match
AE3 | 104 | Individual With A Disability (1 = Yes; 2 = Yes, and disability results in a substantial impediment to employment; 3 = No) | Case/Activity Notes; Physician’s Statement; Psychiatrist’s Statement | Support

¹ Matches or supports the data element being validated.

Best Practices
• Integrated MIS systems
  – Support validation and reporting
  – TN and KY are using integrated, common measures-based systems

Best Practices
• A KY local area compares the EKOS entry screen to the actual file to look for errors.

Best Practices
• SCVOS has popup windows showing the types of documentation allowed for each field where verification is required

Summary of Best Practices
• Data Quality and Validation
  – A continuous process
  – Increases performance
  – Ensures compliance

Validation Results

WIA Validation
• Some States had poor results compared to the rest of the Nation
  – This is GOOD news!
• What is the good news?
  – States followed the data validation instructions and did not skew their results with false passes!
  – Validation can actually be used to improve quality

Key Areas to Focus on for QC
• WIA Registration Date
• Date of Exit
• Documentation of Activity Dates
• Wages
  – Screenshots to archive
• Vets
  – DD-214

Questions
Open Discussion

Review
• Background
• ETA’s Data Validation Policy
• The Need for Quality Control
• Creating a Workable Data Management Strategy
• Best Practices
• WIA Validation Results
• Web Sources for Additional Information

For More Information…
ETA’s Performance and Results Website
http://www.doleta.gov/Performance/
ETA’s Data Validation Handbooks and Software
http://www.doleta.gov/Performance/reporting/tools_datavalidation.cfm

Contact Information
Traci DiMartini
[email protected]
202.693.3698
U.S. Department of Labor
Employment and Training Administration
Office of Performance and Technology
200 Constitution Ave. NW
Washington, DC 20210