- Number of slides: 64
Benchmarking CDI: A Study of 37 Academic Medical Centers
Kathy Vermoch, MPH, Project Manager, Quality Operations, University HealthSystem Consortium, vermoch@uhc.edu
Donald A. Butler, RN, BSN, Manager, Clinical Documentation Improvement, Pitt County Memorial Hospital, Greenville, NC, dbutler@pcmh.com
Session Objectives
• Receive highlights from the UHC Clinical Documentation Improvement (CDI) Benchmarking Project
• Learn about successful approaches used by academic medical centers (AMCs) to promote the accuracy of clinical documentation and coded data
• Discover how one AMC used project data to identify and address documentation and coding improvement opportunities
What Is UHC?
• The University HealthSystem Consortium (UHC), Oak Brook, Illinois (founded 1984), is an alliance of:
  – 112 academic medical centers
  – 256 affiliated hospitals
  – 80 faculty practice groups
• Mission: To advance knowledge, foster collaboration, and promote change to help members succeed in their respective markets
• Vision: To be a catalyst for change, accelerating the achievement of clinical and operational excellence
UHC Members in Florida
• Cleveland Clinic Hospital (Weston)
• Jackson Health System (Jackson Memorial Hospital)
  – Jackson North Medical Center
  – Jackson South Community Hospital
• Mayo Clinic Jacksonville
• Shands HealthCare (Shands at the University of Florida)
  – Shands Jacksonville
  – Shands Lake Shore
  – Shands Live Oak
  – Shands Starke
  – University of Florida College of Medicine, Faculty Group Practice
• Tampa General Hospital
• University of Miami Medical Group
• University of South Florida Physicians Group
Boldface: AMC members – Plain text: Associate members – Italics: Faculty practice groups
CDI Project Goals • The steering committee set the following project goals: – Evaluate clinical documentation practices – Identify critical success factors for documentation quality – Learn about CDI performance measures used by AMCs – Communicate successful CDI strategies
Project Participation and Data • In 2009, 37 AMCs completed a CDI survey • UHC Clinical Data Base (CDB) data were also used to: – Evaluate reporting of comorbidities and complications (CCs and major CCs) – Assess documentation quality – Evaluate data quality • Present on admission (POA) reporting • 41 innovative strategy reports submitted • 4 better-performers interviewed • In 2010, the UHC Clinical Documentation Improvement Collaborative was conducted
Benchmarking Project Participants
• Denver Health
• Hennepin County Medical Center
• IU Health (Clarian)
• MCG Health, Inc.
• Medical Univ. of South Carolina
• Mount Sinai Medical Center
• New York-Presbyterian Hospital
• North Carolina Baptist Hospital (Wake Forest)
• Northwestern Memorial Hospital
• NYU Langone Medical Center
• Olive View-UCLA Medical Center
• Oregon Health & Science University Medical Center
• Penn State M. S. Hershey Medical Center
• Shands at the University of FL
• Stanford Hospital and Clinics
• Stony Brook Univ. Medical Center
• SUNY Downstate Medical Center/University Hospital
• Tampa General Hospital
• The Methodist Hospital System
• The Ohio State University Med. Ctr.
• The University of Michigan Hospitals and Health Centers
• UC Davis Medical Center
• UC Irvine Medical Center
• University Health Systems of Eastern Carolina (PCMH)
• University Hospital of the SUNY Upstate Medical University
• University Hospitals Case Med. Ctr.
• University Medical Center
• University of Arkansas for Medical Sciences (UAMS) Medical Center
• University of Kentucky Hospital
• University of Maryland Medical Center
• University of Missouri Health Care (University Hospital)
• University of North Carolina Hospitals
• University of Pennsylvania Health System (Hosp. of the Univ. of Penn.)
• University of Toledo Medical Center
• Vanderbilt University Medical Center
• Virginia Commonwealth University Health System (VCUHS)
• Wishard Health Services
Project Highlights
CDI Program Profile • 44% of participants are part of multihospital systems – Of these, 50% have implemented standardized clinical documentation assurance practices across the system • 32 of 37 (86%) have a formal CDI program; of these: – 59% in place > 3 years, 31% > 5 years – 59% of the programs focus on Medicare patients
Oversight of CDI Programs Department Responsible for the CDI Program
CDI Staffing
• Clinical documentation specialists are nurses (69%), coders (25%), and mixed (6%)
• Dedicated nonsupervisory CDS FTEs = 5.4 mean (range: 1.0–11.5)
  – Mean annual inpatient discharges per CDS FTE = 7,914 (range: 2,103–21,709)
• Dedicated supervisory CDS FTEs = 0.9 mean (range: 0.0–2.0)
• 28% have paid, dedicated CDI physician FTEs, with a mean of 0.5 MD FTEs (range: 0.3–1.0)
• All/most CDI staff are hospital employees for 100% of respondents
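The staffing benchmark above is a simple workload ratio. As a minimal sketch (the function name and the 39,000-discharge example are illustrative, not survey data; only the 7,914 mean and the 2,103–21,709 range come from the slide), a program could compare its own discharges per CDS FTE against the reported range:

```python
# Hypothetical helper: compare a CDI program's workload per CDS FTE
# against the benchmarks reported in the UHC survey.
SURVEY_MEAN = 7_914             # mean annual discharges per CDS FTE (survey)
SURVEY_RANGE = (2_103, 21_709)  # reported range across participants

def discharges_per_cds_fte(annual_discharges: int, cds_ftes: float) -> float:
    """Annual inpatient discharges handled per dedicated CDS FTE."""
    if cds_ftes <= 0:
        raise ValueError("cds_ftes must be positive")
    return annual_discharges / cds_ftes

# Illustrative example: ~39,000 discharges covered by 5.4 CDS FTEs
ratio = discharges_per_cds_fte(39_000, 5.4)
within_range = SURVEY_RANGE[0] <= ratio <= SURVEY_RANGE[1]
print(f"{ratio:,.0f} discharges per CDS FTE; within survey range: {within_range}")
```

A ratio near the top of the range may signal understaffing relative to peers; the survey mean is only a rough reference point, since case mix and review scope vary widely.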
CC and MCC Capture Rate Report Benchmarks CC and MCC capture by service line across UHC’s AMC members and suggests possible revenue opportunities
Millions in Revenue May Be Missed
• Median $3.4 million Medicare revenue opportunity if reporting CCs and MCCs at the top quartile of UHC's AMC members:
  – Range: $462,000 to $21.1 million
  – Average: $4.3 million
• Medicare revenue opportunity doubled after MS-DRGs were implemented
Source: UHC Clinical and Financial Data Base for discharges from 89 AMC members
Data Quality
• CDB data analysis shows 15% of project participants have POA data quality issues:
  – POA indicator uncertain for > 10% of records
  – Nonexempt POA specified as exempt for > 5% of records
  – POA indicator not reported
• Yet 97% responded on the survey that POA is coded for all payers and patients
Coding Profile Report CDB report that benchmarks key data elements for possible documentation and coding opportunities
Data Analysis
• Medical CMI was < 1.0 for 7.4% of AMC hospitals
• MS-DRG 313 (chest pain) was in the top 20 MS-DRGs based on case volume for 55% of hospitals
• More than 5% of cases in MS-DRGs w/o CC/MCC had LOS >= 95th percentile in 86% of hospitals
• Admit source was missing/unknown for > 2% of cases in 1% of hospitals (range: 0%–20%)
• Admit status was missing/unknown for > 1% of cases in 5% of hospitals (range: 0%–8.5%)
Source: UHC CDB data, CY 2009, 3 million discharges from 95 AMCs
Data Analysis
• Race was missing/unknown for > 2% of cases in 2% of hospitals (range: 0%–88%)
• More than 0.5% of AMI cases were coded 410.x0 in 41% of hospitals (range: 0%–5%)
  – More than 1% of AMI cases were coded 410.x0 in 26% of hospitals
• More than 0.5% of cases with mechanical ventilation were coded 96.70 in 9.5% of hospitals (range: 0%–2%)
• Percentage of cases with any AHRQ comorbidity reported ranged from 45% to 89%
Source: UHC CDB data, CY 2009, 3 million discharges from 95 AMCs
CDI Bundled Score
• A formal clinical documentation improvement program has been implemented
• Documentation clarification query response rates are tracked
• Strong likelihood of querying providers for documentation clarification for:
  – Patients with an extended LOS and an SOI value of < 3
  – The unexpected death of patients with a ROM value of < 3 or 4
  – Inadequate documentation of the patient's history, demographics, or admission, discharge, and transfer status
  – Inadequate documentation of medical necessity
Self-Assessment Score
Self-assessment statements (% agree):
• Comorbidities and complications are thoroughly documented: 18%
• Providers have comprehensive understanding of documentation issues: 24%
• Clinical documentation accurately demonstrates medical necessity: 27%
• Morbidity and mortality data accurately reflect acuity and complexity: 30%
• Providers are held accountable for documentation quality and timeliness: 43%
• Organization does a good job using technology to improve documentation: 46%
• Documentation supports accurate coding of HACs: 46%
• Documentation supports accurate POA reporting: 49%
• Admission, transfer, and discharge information are well documented: 49%
Maximum achievable score = 1.00; Mean = 0.29; Range = -0.41 to 0.94
Key Performance Measures (Target / All mean / Better-performer mean / Range):
• # service lines reporting CCs or MCCs at the top quartile: Target 6; All 4; BP 8; Range 0 to 9
• # service lines reporting MCCs as a percentage of cases with CCs and MCCs at the top quartile: Target 6; All 4; BP 8; Range 0 to 11
• CDB data quality (%): Target pass; All 86%; BP 100%; Range pass or fail
• Coding profile score: Target 3.0; All 1.4; BP 3.0; Range 1.0 to 3.0
• Effective clinical documentation practices bundled score: Target 6 practices; All 3; BP 4; Range 1 to 6
• Clinical documentation practices self-assessment score: Target 1.00; All 0.29; BP 0.69; Range -0.41 to 0.94
Source: UHC Clinical Data Base analysis and survey results
Better-Performing Organizations
• IU Health (IN)
• The Methodist Hospital System (TX)
• University Medical Center (AZ)
• Virginia Commonwealth University Health System (MCV Hospital)
Clinical Documentation Critical Success Factors
• Provide strong medical and administrative leadership support
• Communicate and collaborate
• Perform robust data tracking and mining
• Offer ongoing education
• Implement an effective clinical documentation improvement program
Strong Medical and Administrative Leadership Support • Resources: Invest in CDI FTEs, training, hardware and software, tools and databases, consultants, auditors, etc. • Physician leaders: Gain buy-in, provide peer-to-peer education, and offer documentation clarification • Accountability: Hold providers responsible for compliance • Informed decision-making: Use high-quality data for planning and performance improvement
Communicate and Collaborate • Coders and CDS staff meet often for education, data analysis, and clarification • CDS have open, consistent communications with doctors • Physicians take part in designing and implementing documentation improvement initiatives • CDI communicates and collaborates with all areas that use coded data
Perform Robust Data Tracking and Mining • Track query response rates and share data with medical leaders to hold providers accountable for compliance • Analyze query data for educational opportunities • Analyze coded, clinical data to benchmark performance and surface improvement opportunities
Offer Ongoing Education • Share timely, service-specific examples of the impact of documentation and coding • Provide easy, convenient point-of-care tools • Do not over-emphasize reimbursement revenue issues with doctors • Use clear, specific language and examples
Implement an Effective CDI Program • Hire/train expert CDI staff to conduct concurrent chart reviews • Educate doctors about the impact on quality data • Implement practices that support documentation improvement • Hold providers accountable for compliance with requirements • Benchmark performance and communicate data
Performance Opportunity Summary Report card of each organization’s performance on key performance measures
Self-Assessment Scorecard Questions derived from project findings and better-performer interviews
Clinical Documentation Improvement Collaborative
• Benchmarking project findings and resources were communicated
• 34 organizations completed a gap analysis
• Participants selected their teams
• Collaborative workgroups formed
• Initiatives launched locally
• Monthly networking conference calls for 6 months
• Web conferences conducted
Collaborative Participants
• Fletcher Allen Health Care
• Hennepin County Medical Center
• IU Health
• New York-Presbyterian Health System
• Northwestern Memorial Hospital
• NYU Langone Medical Center
• Oregon Health & Science University
• Parkland Health & Hospital System
• Penn State M. S. Hershey Medical Center
• Riverside County Regional Medical Center
• Shands at the University of Florida
• Shands Jacksonville
• Stanford Hospital & Clinics
• Stony Brook University Medical Center
• SUNY Downstate Medical Center/University Hospital
• The Ohio State University Medical Center
• Thomas Jefferson University Hospital
• UC Davis Medical Center
• University Health Systems of Eastern Carolina (Pitt County Memorial Hospital)
• University Hospitals Case Medical Center
• University of Arkansas for Medical Sciences (UAMS) Medical Center
• University of Kentucky Hospital
• University of Michigan Hospitals & Health Centers
• University of Mississippi Health Care
• University of Missouri Health Care (University Hospital)
• University of Pennsylvania Health System (Hospital of the University of Pennsylvania)
• University of Rochester Medical Center
• University of Texas Medical Branch, Galveston
• University of Washington Medical Center
• UNM Hospitals
• Upstate University Hospital
• Virginia Commonwealth University Health System (VCUHS)
• Wishard Health Services
Examples of Collaborative Achievements
• UC Davis: Improved concurrent query response rate from 46% to 76%
• UH Case: Improved time for initial CDS chart review (goal: working DRG within 24 hours)
• Stony Brook: Implemented a mortality review process for deaths with "less than extreme ROM"
• TJUH: Enhanced retrospective query procedures; improved response rates from 82% to 100%
• OHSU: Revised query policies and templates; developed a training program for new CDI staff
• Stanford: Expanded mortality reviews and achieved a 7% increase in extreme-ROM cases
• Shands UF: Developed a CDI mission statement and selected CDI performance measures
Using Data to Create Change
• The success of benchmarking comes from implementation, not the data alone
• Next, Don Butler from University Health Systems of Eastern Carolina (Pitt County Memorial Hospital) demonstrates how his CDI team acted on project data to validate and address documentation and coding improvement opportunities
Drilling for Answers: CT Surgery Documentation Improvement Project
Overview • Present a detailed overview of all phases of this project as an example of data-driven performance improvement – Trigger report – Team formation & analysis – Findings & planning – Actions – Results – Next?
The Trigger: UHC Report Sparks Interest
• What about these?
The Trigger: UHC Report Sparks Interest
• Several areas of potential interest
  – CT surgery & vascular
• Cardiology & general surgery don't seem to have the same likely initial opportunity
The Trigger: UHC Report Sparks Interest
Defining the Opportunities: Select a Team & Conversations
• Members of the cardiovascular measurement team and the clinical documentation improvement team were tasked to explore the UHC report
• Initial conversations focused on:
  – Possible statistical reasons
    • Ratio of valves vs. CABG
    • Ratio with vs. without cath
  – Impression: the question will be how to quantify the improvement opportunity (not validate the need for it)
• Reviewed & acknowledged improvements obtained with the CT surgery consult/H&P form
• Perform further analysis:
  – Data analysis (UHC)
  – Focused chart review (based on the STS database)
  – CDI experiences
Defining the Opportunities: First Cut: Procedure Grouping Focus
• Largest volume: CABG & valves
  – 77% of all cases
  – Lessons learned ought to translate across to other procedural groups
Defining the Opportunities: Second Cut: 2-Procedure Focus
• Red dot indicates ECHI performance during the UHC study; graphed are data from Oct. to Feb. '08
• Note real improvements in capture of CC/MCC with valve cases
Defining the Opportunities: Third Cut: CABG & Valves vs. NC & U.S.
• Significant difference between ECHI and both the U.S. and NC
• However, U.S. and NC are close to each other
Defining the Opportunities: Validate: APR DRG SOI
• Similar correlations when looking at SOI measures (for 2 NC centers, A & B)
Mortality Index – NC Comparison (centers A, B, and C)
Defining the Opportunities: Prevalence of CABG MCCs Across Institutions
• Selected 2 NC peers and drilled down to the ICD-9 level (CABGs)
• 4 most prevalent MCC diagnoses: real difference in frequency of capture
• Caveat: NSTEMI is likely a mix of PDX & ODX
Defining the Opportunities: Clinical Review
• Case selection:
  – All reviewed cases were from the provided STS database for cardiothoracic surgical cases
  – A total of 129 cases were reviewed from a 5-month period identical to the analysis above
  – Selection was based on the presence of STS data elements* AND cases without a CC or MCC
• There was not as strong a correlation as might have been expected between the STS data and the documentation and clinical indicators seen in the chart, specifically for HF and renal failure
• Cannot extrapolate results to all CABG & valve cases due to the selection criteria
*STS data elements screened: AMI 1–7 days; HF; renal failure; prolonged vent; asthma/COPD; confusion; ptx/hemothorax; op/reop bleed/tamponade; 8 additional elements with infrequent occurrence
Defining the Opportunities: Clinical Review Findings
• Clinical indicators were identified to support an MD query:
  – 12% of cases for a CC/MCC
  – >7% for an additional severity diagnosis
  – Again, cannot extrapolate
• Possible additional revenue, at varied confidence levels for obtaining the diagnosis by MD query:
  – 50%: $126,940.20
  – 75%: $190,410.30
  – 100%: $253,880.40
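The tiered figures above are simply the full-capture estimate scaled by an assumed query-success rate. A minimal sketch (the function name is hypothetical; only the $253,880.40 full-capture figure comes from the review findings):

```python
# Sketch: scale the full-capture revenue estimate by an assumed
# probability that an MD query yields the documentable diagnosis.
FULL_CAPTURE_REVENUE = 253_880.40  # 100%-confidence figure from the chart review

def expected_revenue(confidence: float) -> float:
    """Expected additional revenue at a given query-success confidence."""
    if not 0.0 <= confidence <= 1.0:
        raise ValueError("confidence must be in [0, 1]")
    return round(FULL_CAPTURE_REVENUE * confidence, 2)

for c in (0.50, 0.75, 1.00):
    print(f"{c:.0%}: ${expected_revenue(c):,.2f}")
```

This reproduces the three figures on the slide; in practice the confidence level would be estimated from historical query agreement rates rather than assumed.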
Defining the Opportunities: Bonus: Physician Rates vs. UHC Mean
Defining the Opportunities: MD (& PA/NP) CDI Behaviors
• MD response with CDI activities (FY 08)
• CDI concurrent reviews, FY 08:
  – 1,335 CT surgery service line cases
  – 84% reviewed
  – 8% of reviews achieved confirmed improvements
• Physician advisor reviews (new program):
  – Reviewed 103 cases (screened selection of cases)
  – 19% of cases (20) identified for a recommended query
Additional Analysis Considered
• Physician outliers:
  – Patient population variance
  – Practice/model patterns
• Complication outliers:
  – Explore clinical validity from the physician perspective
  – Review post-procedure documentation patterns for completeness
• Consider additional physician expert review (consulting services)
• After discussion & consideration, additional formal analysis was not initiated
Actions Taken
• Informal conversations with physician & executive leadership (both clinical and administrative):
  – Current documentation patterns and the benefits of enhanced specific diagnosis identification
• Formal presentations:
  – Attending presentation
  – NP & PA presentation
  – Reemphasize program process, rationale & benefits with physicians and providers
  – Focus on and discuss specific clinical conditions & phrases
• Review the CT surgery consult/H&P form
Actions Taken
• CDI program prep & planning:
  – Sharpen strategies for effective concurrent reviews
    • Education, process, prioritization
  – Specific physician (and NP/PA) partnerships with individual CDS
  – Routine communication expectations
  – Preferred, customized avenues & styles of communication
• 100% consultant physician advisor review:
  – Incorporate learning & examples into CDS & coding teams
  – Ongoing final "check"
Actions Taken
• CDI program execution:
  – Clinical subject internal education & learning
  – Include ALL CT surgery patients
  – Increase CDS staffing
    • From 2 to 3 covering CT, cardiology & vascular patients
    • Adjusted coverage plan, collaboration & flows
  – Daily 100% chart review
  – Ensure at least biweekly direct conversations
  – Plan a 6–12 month focused effort
    • Tools & process
    • Close monitoring & support
Actions Considered
• Develop standard, agreed-upon definitions or descriptions of clinical conditions that meet criteria for certain ICD-9 diagnoses
• Focus on "smart" tools to ensure capture of post-procedural events & diagnoses (similar to the CT consult/H&P form)
Results
• Mortality index
• MCC capture
• ICD-9-specific frequencies
• UHC: a moving target
Mortality Index (CABG & Valves)
MCC Capture (CABG & Valves)
ICD-9 Frequencies (CABG & Valves)
Combined valve & CABG MCC rates (Baseline → FY 2010):
• Acute HF: 15% → 23%
• ARF/ATN: 20% → 19%
• 429.5 chordae tendineae rupture: 8% → 15%
• 286.6 defibrination syndrome: 0% → 13%
• 518.5 post-traumatic pulmonary insufficiency: 7% → 13%
• 585.6 ESRD: 11% → 12%
• AMI: 10% → 5%
• 427.41 ventricular fibrillation: 11% → 5%
• 518.81 acute respiratory failure: 5% → 5%
• 785.51 cardiogenic shock: 1% → 3%
• 486 pneumonia, organism NOS: 6% → 3%
• 421.0 acute/subacute bacterial endocarditis: 2% → 2%
• 518.4 acute lung edema NOS: 0% → 2%
• Total: 105% → 126%
Notes:
• ARF definitions moved toward RIFLE criteria, which presented somewhat fewer opportunities
• Improved reception & adoption of coding-required vocabulary
• Consumptive (pump-related) coagulopathy
(Key from the original slide: yellow = improved; turquoise = clinical care improvement?)
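The baseline-to-FY-2010 comparison is a per-diagnosis delta. A minimal sketch (the dictionaries hold only a subset of the table's rates, for illustration):

```python
# Sketch: flag which MCC diagnosis capture rates rose from baseline
# to FY 2010, using a subset of the rates from the table above.
baseline = {"Acute HF": 0.15, "ARF/ATN": 0.20, "AMI": 0.10, "Cardiogenic shock": 0.01}
fy2010   = {"Acute HF": 0.23, "ARF/ATN": 0.19, "AMI": 0.05, "Cardiogenic shock": 0.03}

improved = sorted(dx for dx in baseline if fy2010[dx] > baseline[dx])
declined = sorted(dx for dx in baseline if fy2010[dx] < baseline[dx])
print("improved:", improved)  # Acute HF and cardiogenic shock rose
print("declined:", declined)  # ARF/ATN and AMI fell
```

A rising capture rate may reflect better documentation, while a falling one (as the slide's key hints for the turquoise rows) may instead reflect genuine clinical improvement, so the deltas need clinical interpretation, not just counting.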
UHC Data – A Moving Target
• Comparative performance is also improving, so "success" targets must be continuously adjusted and evaluated
• Note: the more recent report covers only the first half of FY 2010
New Efforts & Opportunities • Data diving into the UHC mortality model (for example) – Stroke care; orthopedics; hospitalist/general medicine; medical cardiology; surgery; trauma; cardiology • Repeat focused efforts with another identified service line (along with mortality work) • Starting to hear physician concern & interest with healthcare changes • Physician executive with CDI focus
Pitt County Memorial Hospital
• 860-bed tertiary care hospital
  – Part of University Health Systems, with PCMH and 7 additional community hospitals (6–180 beds)
  – Approximately 39,000 discharges per year at PCMH
• Serving the 29 counties of rural eastern North Carolina (essentially all of NC 20 miles east of Raleigh)
  – Other community hospitals of up to approximately 200 beds are found in the region
• Only Level 1 trauma center in the region
• Major cardiovascular center
• Rehab, oncology, NICU/L&D, bariatrics, Gamma Knife, robotic surgery, CVA, renal, orthopedics, pediatric specialty …
• Brody School of Medicine, East Carolina University
Pitt County Memorial Hospital: CDI Program Profile
• 9 dedicated CDS FTEs (plus manager); RNs, but open to coding professionals
• Report to finance through HIMS
  – Very close collaboration with coding
• Started March 2006 with a consultant
• Adding a 0.5 FTE medical director, in addition to developing physician advisor roles
• Somewhat a hybrid medical record (expect >95% EMR by end of year)
• Introduced electronic queries
• FY 2010: reviews focus on acute inpatient Medicare; 145 discharges reviewed per CDS, with 2–3 re-reviews per case
• Primary tools:
  – Consultant software (transitioning)
  – Software for work flow, coordination, prioritization & daily activity management
  – 3M stand-alone encoder
• Unit/service line focus for each CDS
• Core measures: 94% review; 24% query; 87% physician response; 81% agreement; impact achieved with 7.4% of reviewed cases
• Additional roles: RAC support; physician education; collaborative work with various departments or projects …
Thank You
• Questions?