
- Number of slides: 46
Evaluating Health Information Technology: A Primer
- Eric Poon, MD MPH, Clinical Informatics Research and Development, Partners Information Systems
- Davis Bu, MD MA, Center for Information Technology Leadership, Partners Information Systems
- AHRQ National Resource Center for Health Information Technology
Pre-Conference Logistics
- To access slides:
  - Go to http://extranet.ahrq.gov/rc
  - Login with username and password
  - Follow the links to download slides
  - Problems? Email ResourceCenter@norc.org
- Q&A session at the end:
  - Dial *1 to ask a question
  - Please pick up the handset (not speakerphone)
  - Note that this teleconference is being recorded
Outline
- Why evaluate?
- General approach to evaluation
- Deciding what to measure
- Study design types
- Analytical issues in HIT evaluations
- Some practical advice on specific evaluation techniques
Why Measure Impact of HIT?
- Impact of HIT is often hard to predict
  - Many "slam dunks" go awry
  - Understand what works and what doesn't
- Understand how to clear barriers to effective implementation
- Justify enormous investments
  - Return on investment
  - Allow other institutions to make tradeoffs intelligently
- Use results to win over late adopters
- You can't manage/improve what isn't measured
- Good publicity for the organization
General Approach to Evaluating HIT
- Understand your intervention
- Select meaningful measures
- Pick the study design
- Validate data collection methods
- Data analysis
Getting Started: Get to Know Your Intervention
- Clarify the question: what problem does it address?
- Think about intermediate processes
- Identify potential barriers to successful implementation
- Identify potential managerial and behavioral processes to overcome implementation barriers
Array of Measures
- Quality and safety: clinical outcomes; clinical processes
- Resource utilization: costs and charges; length of stay (LOS); employee time/workflow
- Knowledge: patient knowledge; provider knowledge
- Satisfaction: patient satisfaction; provider satisfaction
Introducing the Evaluation Toolkit
- Rough guides on general approach, costs, and potential pitfalls
- Major domains: clinical outcomes; clinical process; provider adoption & attitudes; patient knowledge & attitudes; workflow impact; financial impact
- Measure characteristics: IOM domain; data source; relative cost; potential pitfalls; general notes
- We would love to hear your feedback
Selecting Evaluation Measures for HIT: Three Examples
Computerized Provider Order Entry (CPOE) Example
- Clarify the primary question: does CPOE improve quality of care?
- Competing questions:
  - Does CPOE save money?
  - What are the barriers to physician acceptance?
  - Does CPOE introduce new errors?
CPOE: How Can It Affect Quality?
- Think about intermediate processes:
  - Patient data is presented to the ordering physician
  - ADE alerts may be triggered and presented at the point of care (which alerts?)
  - Guideline reminders may be triggered and presented at the point of care (which guidelines?)
  - Medication order is entered
  - Medication order is executed by pharmacy
  - Medication order is executed by nursing staff
Does CPOE Improve Quality of Care?
- Identify measures (process: measure):
  - Data presentation: redundant test ordering
  - ADE alert: alert frequency, ADE frequency
  - Guideline reminder: guideline compliance, clinical outcome
  - Order entry: ordering errors
  - Pharmacy: time to process order
  - Administration: time to administration
Evaluating CPOE's Impact on Quality
- Select appropriate methodology:
  - Does existing data exist that can be leveraged (e.g., ongoing QA activities)?
  - Does a concurrent control exist?
  - How will the data be analyzed?
Electronic Medical Records (EMR) Example
- Clarify the primary question: what are the barriers and facilitators to effective EMR implementation?
- Competing questions:
  - Do EMRs save money?
  - Do EMRs improve quality of care?
  - Do EMRs introduce new errors?
EMR: Dissecting the EMR Implementation Process
- Identify stakeholders: users of the system, clinical leaders, administrative leaders
- Catalogue stakeholder interests and values: e.g., workflow efficiency
- Clarify impact of implementation on clinical processes: providers, et al.
- Clarify stakeholder role in implementation: user interface optimization, workflow re-engineering
- Define implementation success criteria: provider buy-in, provider use and acceptance
EMR: Understanding the Barriers and Facilitators to Implementation
- Identify measures (process: measure):
  - Stakeholder attitudes: attitude/satisfaction surveys; readiness assessment; staff turnover
  - Workflow: efficiency metrics; staffing levels; patient flow
  - Process improvements: practice productivity; implementation participation from staff
  - Success criteria: usage data; training attendance; measures listed above
EMR: Understanding the Barriers and Facilitators to Implementation
- Select appropriate methodology: a combination of quantitative and qualitative studies
- Example: efficiency measures
  - Time-motion studies: how did the system affect provider efficiency?
  - Attitude surveys: how did the system affect provider perception of efficiency?
  - Semi-structured interviews: how did the implementation affect stakeholder workflow? Did that effect change over time, and why?
Local Health Information Infrastructure (Laboratory)
- Clarify the primary question: can LHIIs for labs generate a positive ROI?
- Competing questions:
  - Can LHIIs for labs improve quality of care?
  - Which architecture is best suited for LHIIs for labs?
  - How do LHIIs for labs affect provider and patient perception of the health care system?
LHII (Laboratory): Defining the ROI
- Specify intermediate processes:
  - Data is pulled from local laboratories
  - Previous labs pulled
  - Lab order entered
  - Lab order transmitted
  - Administrative handling
  - Lab results reported
  - Lab results recorded
  - Data is pulled from primary provider
  - Authorization and payment is coordinated with payer
  - Implementation of LHIO
LHII (Laboratory): Defining the ROI
- Identify associated measures (process: measure):
  - Provider requests data: volume of requests
  - Data is pulled: chart pulls, time per chart pull, administrative costs
  - Provider interprets data: amount of missing information
  - Provider orders test: volume of redundant tests
LHII (Laboratory): Evaluating the ROI
- Select appropriate methodology:
  - Does a concurrent control exist?
  - Are there ongoing trends over time?
  - How will the data be analyzed?
Selecting Outcome Measures: General Comments
- Generally want to pick 1-3 outcomes of primary interest
  - If you choose more, you need to correct for multiple comparisons (e.g., Bonferroni)
- Outcome must be sufficiently frequent to be detectable
  - Rare events, such as adverse events due to errors, are particularly challenging
- Important enough to provoke interest
  - Whether the study is positive or negative
  - How would the results change policy (local or national)?
- Process vs. outcome
  - Legitimate to measure process: outcome often takes too long
  - In many situations the link between process and outcome is clear
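The Bonferroni correction mentioned above is simple arithmetic: divide the overall significance level by the number of primary outcomes. A minimal sketch (the p-values below are invented for illustration):

```python
def bonferroni_threshold(alpha: float, n_tests: int) -> float:
    """Split the overall significance level evenly across the tests."""
    return alpha / n_tests

def significant(p_values: list[float], alpha: float = 0.05) -> list[bool]:
    """Flag each p-value against the Bonferroni-adjusted threshold."""
    threshold = bonferroni_threshold(alpha, len(p_values))
    return [p <= threshold for p in p_values]

# With 3 primary outcomes, each test must meet p <= 0.05 / 3 ~= 0.0167
print(significant([0.010, 0.030, 0.049]))  # [True, False, False]
```

Note that 0.030 and 0.049 would have passed an unadjusted 0.05 cutoff, which is exactly the multiple-comparisons problem the correction guards against.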
Study Types
- Commonly used study types:
  - Before-and-after time series
  - Trials: randomized controlled trials; factorial design
- Study design is often influenced by the implementation plan
Time Series vs. Randomized Controlled Trials
- Before-and-after trials are common in informatics
  - Concurrent randomization is hard
  - Don't lose the opportunity to collect baseline data!
- Off-on-off trial design is possible
  - But it may not be politically/ethically acceptable to turn off a highly used feature
- RCT preferable if feasible
  - Eliminates the issue of secular trend
  - Balances baseline confounding
Randomization Considerations
- Justifiable to have a control arm (usual care) as long as benefit has not already been demonstrated
- Want to choose a truly random variable
  - Not day of the week
  - Legitimate to stratify on baseline variables (e.g., education for patients, computer experience for providers)
- Minimal number of arms: more arms, less power
- Strongest possible intervention
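A stratified allocation like the one described (e.g., stratifying providers on computer experience) can be sketched in a few lines. This is a hypothetical illustration; the provider names, stratum labels, and seed are all made up:

```python
import random

def stratified_assign(units, stratum_of, seed=2005):
    """Shuffle units within each stratum, then split each stratum
    as evenly as possible between intervention and control."""
    rng = random.Random(seed)  # fixed seed makes the allocation reproducible/auditable
    strata = {}
    for unit in units:
        strata.setdefault(stratum_of[unit], []).append(unit)
    assignment = {}
    for members in strata.values():
        rng.shuffle(members)
        half = len(members) // 2
        for unit in members[:half]:
            assignment[unit] = "intervention"
        for unit in members[half:]:
            assignment[unit] = "control"
    return assignment

# Hypothetical providers stratified by prior computer experience
providers = ["md1", "md2", "md3", "md4", "md5", "md6"]
experience = {"md1": "high", "md2": "high", "md3": "high",
              "md4": "low", "md5": "low", "md6": "low"}
print(stratified_assign(providers, experience))
```

Because the shuffle happens within each stratum, both arms end up balanced on the stratifying variable, which is the point of stratified randomization.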
Unit of Randomization
- Patients
- Physicians
- Practices/wards
Randomization Unit: How to Decide?
- Small units (patients) vs. large units (practices/wards)
  - Contamination across randomization units
  - If the risk of contamination is significant, consider larger units
- Effect of contamination: can underestimate impact
  - However, if you see a difference, the impact is present
- Randomization by patient is generally undesirable
  - Contamination
  - Ethical concern
Randomization Schemes: Simple RCT
- XX clinics randomized to an intervention arm and a control arm
- Baseline period: baseline data collection in both arms
- Intervention arm: intervention deployed, followed by a 3-month burn-in period, then data collection for the RCT
- Control arm: no intervention during RCT data collection
- Post-intervention period: the control arm gets the intervention
- Burn-in period:
  - Gives the target population time to get used to the new intervention
  - Data not used in the final analysis
Randomization Schemes: Factorial Design
- May be used to concurrently evaluate more than one intervention
- Assess interventions independently and in combination: A; B; A+B; control (no interventions)
- Loss of statistical power
- Usually not practical for more than 2 interventions
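The four cells of a 2x2 factorial design can be enumerated mechanically; the intervention names here are placeholders:

```python
from itertools import product

def factorial_arms(*interventions):
    """Enumerate every on/off combination of the given interventions."""
    arms = []
    for flags in product([False, True], repeat=len(interventions)):
        active = [name for name, on in zip(interventions, flags) if on]
        arms.append(" + ".join(active) if active else "control")
    return arms

print(factorial_arms("A", "B"))  # ['control', 'B', 'A', 'A + B']
```

With three interventions the design already has 8 cells, which illustrates why the slide warns that factorial designs are usually impractical beyond 2 interventions.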
Randomization Schemes: Staggered Deployment
- Advantages:
  - Easier for user education and training
  - Can fix IT problems up front
- Need to account for secular trend
  - Include a time variable in the regression analysis
Randomization Schemes: Multiple Interventions
- 4 interventions involving patients' use of shared online medical records: medication tracking; diabetes care; preventive care reminders; family history
- 12 clinics randomized, 18-month study:
  - Arm 1 (6 clinics): medication tracking; diabetes care (serves as control for Arm 2)
  - Arm 2 (6 clinics): preventive care reminders; family history (serves as control for Arm 1)
- Time-efficient design
- Every clinic gets something (keeps clinics and the IRB happy)
- Watch out for cross-arm intervention contamination
Inherent Limitations of RCTs in Informatics
- Blinding is seldom possible
- Effect on documentation vs. clinical action
- People always question generalizability
  - Success is highly implementation dependent
  - Efficacy-effectiveness gap: 'Invented here' effect
Data Collection
- Electronic data abstraction: convenient and time-saving, but…
- Some chart review (selected) to get information not available electronically
- Get ready for nasty surprises
- Pilot your data collection protocol early
  - And then pilot some more…
Data Collection Issue: Baseline Differences
- Randomization schemes often lead to imbalance between intervention and control arms
- Need to collect baseline data and adjust for baseline differences
- In a regression analysis, the interaction term (time * allocation arm) gives the effect of the intervention
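In the simplest two-period case, the interaction-term adjustment reduces to a difference-in-differences: the pre-to-post change in the intervention arm minus the change in the control arm. A toy calculation (the compliance rates are fabricated):

```python
def difference_in_differences(pre_intervention, post_intervention,
                              pre_control, post_control):
    """Intervention effect adjusted for baseline imbalance and secular trend:
    (change in intervention arm) - (change in control arm)."""
    return (post_intervention - pre_intervention) - (post_control - pre_control)

# Hypothetical guideline-compliance rates (%) before and after deployment
effect = difference_in_differences(pre_intervention=40, post_intervention=65,
                                   pre_control=45, post_control=50)
print(effect)  # 20: a 25-point gain in the intervention arm vs. a 5-point secular trend
```

Subtracting the control arm's change is what removes both the baseline difference (40 vs. 45) and any trend affecting everyone.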
Data Collection Issue: Completeness of Follow-up
- The higher the better: over 90%; 80-90%; less than 80%
- Intention-to-treat analysis: in an RCT, outcomes should be analyzed according to the original randomization assignment
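Intention-to-treat analysis can be sketched as follows: outcomes are tallied by the arm each subject was originally randomized to, even if the subject never used the intervention or crossed over. The records below are fabricated for illustration:

```python
def itt_outcome_rates(records):
    """Outcome rate per arm, keyed by the ORIGINAL randomization assignment.
    The 'used_system' field is deliberately ignored - that is the ITT principle."""
    totals, events = {}, {}
    for r in records:
        arm = r["assigned"]
        totals[arm] = totals.get(arm, 0) + 1
        events[arm] = events.get(arm, 0) + (1 if r["outcome"] else 0)
    return {arm: events[arm] / totals[arm] for arm in totals}

records = [
    {"assigned": "intervention", "used_system": True,  "outcome": True},
    {"assigned": "intervention", "used_system": False, "outcome": False},  # non-user still counted here
    {"assigned": "control",      "used_system": False, "outcome": False},
    {"assigned": "control",      "used_system": True,  "outcome": True},   # crossover still counted as control
]
print(itt_outcome_rates(records))  # {'intervention': 0.5, 'control': 0.5}
```

Analyzing "as treated" instead would shuffle non-users and crossovers between arms, reintroducing exactly the selection bias that randomization was meant to remove.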
A Common Analytical Issue: The Clustering Effect
- Occurs when your observations are not independent
- Example: each physician treats multiple patients; outcomes are assessed at the patient level, but patients cluster within physicians in both the intervention and control groups
Options for Dealing with the Clustering Effect
- Analyze at the level of the clinician
  - Example: analyze the % of each MD's patients in compliance with the guideline, making the MD the unit of analysis
  - Huge drop in statistical power; not recommended
- Generalized estimating equations (GEE)
  - PROC GENMOD in SAS, or PROC RLOGIST in SUDAAN
  - Allows you to randomize at one level (e.g., physician) and analyze at another (e.g., patient)
  - Accounts for correlation of behaviors within a single physician (i.e., adjusts for the fact that observations across patients are NOT independent)
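The first option (clinician as the unit of analysis) amounts to collapsing each physician's patients into a single compliance proportion before comparing arms. A minimal sketch with invented observations; for GEE itself you would reach for a statistics package (such as the SAS/SUDAAN procedures named above) rather than hand-rolled code:

```python
from statistics import mean

def physician_level_compliance(observations):
    """Collapse patient-level (physician_id, complied) pairs into one
    compliance proportion per physician - the MD becomes the unit of analysis."""
    by_md = {}
    for md, complied in observations:
        by_md.setdefault(md, []).append(1 if complied else 0)
    return {md: sum(vals) / len(vals) for md, vals in by_md.items()}

# Invented data: md1 treats 3 patients, md2 treats 2
obs = [("md1", True), ("md1", True), ("md1", False),
       ("md2", False), ("md2", True)]
rates = physician_level_compliance(obs)
print(rates)                 # md1 ~ 0.67, md2 = 0.5
print(mean(rates.values()))  # arm estimate: averaged over physicians, not patients
```

Averaging over physicians rather than patients respects the clustering, but as the slide notes, shrinking 5 observations to 2 is why this approach costs so much statistical power.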
A Word About Surveys
- Surveys of user beliefs, attitudes, and behaviors
- Response rate and responder bias: aim for a response rate above 50-60%
- Keep the survey concise
- Pilot the survey for readability and clarity
- Formal validation is needed if you plan to develop a scale
Looking at Usage Data
- Great way to tell how well the intervention is going
  - Target your troubleshooting efforts
- In terms of evaluating HIT:
  - Correlate usage to implementation/training strategy
  - Correlate usage to stakeholder characteristics
  - Correlate usage to improved outcomes
Studies on Workflow and Usability
- How to make observations?
  - Direct observations
  - Stimulated observations
    - Random paging method
    - Subjects must be motivated and cooperative
  - Usability lab
- What to look for?
  - Time to accomplish specific tasks
  - Workflow analysis
    - Need to pre-classify activities
    - Handheld/Tablet PC tools may be very helpful
  - Asking users to 'think aloud'
  - Unintended consequences of HIT
Qualitative Methodologies
- Major techniques: direct observations; semi-structured interviews; focus groups
- Adds richness to the evaluation
  - Explains successes and failures; generates lessons learned
  - Captures the unexpected
  - Great for forming hypotheses
  - People love to hear stories
- Data analysis
  - Goal is to make sense of your observations
  - Iterative & interactive
Cost Benefit Analysis
- Cost data
  - Generally available
  - Caveat: allocation of indirect costs
- Financial benefit data
  - Revenue enhancement
  - Cost avoidance
- Benefit allocation
  - Benefits may accrue to multiple parties
  - Are benefits realizable (e.g., labor savings)?
  - Calculation of benefits to external parties may be of interest, even if it does not impact the ROI
Cost Benefit Analysis: Activity-Based Costing Example
- Simply put, a method for assigning costs to particular activities
- An alternate method of assigning indirect costs to the project
- May also serve as a framework for capturing cost savings
- Steps (with example):
  - Identify activities: paper chart maintenance
  - Determine cost for each activity: cost data for medical records
  - Determine cost drivers: number of chart pulls
  - Obtain activity data: how many charts were pulled
  - Calculate total cost: savings from decreased pulls
- Steps adapted from http://www.pitt.edu/~roztocki/abctutor/
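The final step of the activity-based costing sequence above is plain arithmetic: unit cost of the driver times the change in its volume. A toy calculation; the $5 cost per chart pull and the pull counts are assumptions for illustration, not figures from the talk:

```python
def activity_savings(cost_per_unit, baseline_volume, post_volume):
    """Activity-based costing savings: unit cost of the cost driver
    times the reduction in driver volume after the intervention."""
    return cost_per_unit * (baseline_volume - post_volume)

# Assumed figures: $5 per chart pull, pulls drop from 10,000 to 4,000 per year
savings = activity_savings(cost_per_unit=5.00,
                           baseline_volume=10_000,
                           post_volume=4_000)
print(f"Annual savings from decreased pulls: ${savings:,.2f}")  # $30,000.00
```

The same function works for any activity once you have identified its cost driver, which is why ABC doubles as a framework for capturing cost savings across the project.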
Concluding Remarks
- Don't bite off more than you can chew
  - Pick a few study outcomes and study them well
- It's a practical world
  - Balancing operational and research needs is always a challenge
- Life (data collection) is like a box of chocolates…
  - You don't know what you're going to get until you look, so look early!
Thank You
- Eric Poon, MD MPH: epoon@partners.org
- Davis Bu, MD MA: dbu@partners.org
Give Us Feedback!
- We are eager to hear your feedback!
  - Go to http://extranet.ahrq.gov/rc
  - Login with username and password
  - Follow the links to provide feedback. Thanks!
- Want to hear this teleconference again? Dial 1-800-486-4195 to replay until 5/4/05