The evaluation of an open access self-help web-site
Delyth Lloyd¹ & Chris Clarke²
¹ Australian Centre for Posttraumatic Mental Health
² Defence Links, Department of Veterans’ Affairs
A centre of excellence supported by the Australian Government
© Copyright 2011
Acknowledgements
Andrea Phelps
John O’Connor
Funded by the Department of Veterans’ Affairs
SMS Management & Technology
Outline
• Background
• The web-site & audience
• Steps in the development of the evaluation
• Factors shaping the evaluation
  • Data sources and IT capacity
  • Policy cycle
• Challenges, lessons learned
• Your thoughts?
Context
Department of Veterans’ Affairs ‘lifecycle’ program
• Aim: provide an on-line wellbeing resource tailored for veterans, former serving members and their families
• Target the hard-to-reach / hard-to-engage
• Adding to the range of mental health and support services available
www.wellbeingtoolbox.net.au
Wellbeing Toolbox
Skills for psychosocial recovery
• Generic distress-reduction skills
• Principles based on risk & resilience research
• Broad, accessible, tried and tested
Target population
• Sensitive to requests for information
• “Younger” (<50 y)
• Difficulties with government involvement in health
Open Access
• Anyone can visit
• Personalised journey – increases relevance
• Ease of access – find what you want quickly
• Log-in (optional)
• Questionnaire (optional)
• Modular (with ‘guide me’ function)
• Self-management plan (optional)
> Accessibility is the priority
> Low-impact, non-intrusive evaluation
Planning
1. Literature review – what can we learn from others?
• Need is present in the target group
• On-line self-help “courses” can work
• Disorder-specific evidence
• Organisational evidence
• Youth-orientated products successful
• Genuine open-access self-help evaluations with an accessibility focus:
  • Not done?
  • Not reported?
Planning (cont’d)
Utilisation discussions
• Who is going to use the evaluation, why, and how?
• What do they need to know?
• What would they like to know?
• When?
Program logic
• Not a clinical theory of change (TOC) from use to better wellbeing
• Path of use
• Identify and agree on evaluation questions
Program logic model (reconstructed from the slide diagram)

Actions
• Usability testing & stakeholder consultation ensures the site is appealing and usable to the target group
• Web-site created and promoted: must be clinically sound; must be targeted at veterans and also relevant for family members; must contain the key features named in the contract

Processes (path of use)
• Users from the target group become aware of the website: via KIT, via direct marketing of the self-care site, independently from a web search, or from other sites (e.g. At Ease)
• Users visit the self-care web-site: one-time visitors vs. re-visits
• Sign in (optional) → use → re-use
• Use the self-management plan
• Use web-site components to a greater or lesser extent depending on individual factors
• Users self-refer to other services and resources as appropriate

Outcomes / evaluation questions
• How do users reach the site? How effective were promotion and marketing activities?
• Who are the “users”?
• What do first-time users think? (many will be ‘one-time’ users)
• Who / how many users choose to sign in?
• Who / how many users re-visit the site?
• Who / how many users re-visit the self-management plan?
• What sections are visited by the most people? (indicates interest)
• What sections do people revisit most often? (indicates which are most helpful) Note: module selections may be prompted to some extent if the ‘guide me’ function is used
• Do users perceive the site as useful? Positive feedback
• Have site users seen anyone about their problems since using the site, or do they intend to do so?
• Have users followed links to other recommended resources?
• General feedback: what do users like best about the site & what could be improved?
Factors shaping the evaluation 1: Data available
• What can / could the site do? (e.g. reminders, return visits)
• What evaluation tools can be built in?
• What can Google Analytics do?
• Ethics: what is it okay to know?
• What we don’t / won’t / can’t know
  • e.g. questionnaire responses and usage, but not both together
  • Logged in versus not logged in (see the sketch below)
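Aside: a minimal sketch (not from the slides) of how the “logged in versus not” distinction could be captured with the 2011-era Google Analytics ga.js command queue, using a session-scoped custom variable rather than anything identifying. The web property ID ‘UA-XXXXXXX-X’ and the ‘wt_session’ cookie name are placeholder assumptions, not details of the Toolbox itself.

```ts
// Assumed setup: the standard async ga.js snippet is already on the page
// and has defined the global _gaq command queue.
declare var _gaq: Array<unknown[]>;

// Hypothetical cookie check; the real site's session mechanism may differ.
const isLoggedIn = document.cookie.indexOf('wt_session=') !== -1;

_gaq.push(['_setAccount', 'UA-XXXXXXX-X']); // placeholder property ID
// Custom variable: slot 1, session scope (2). Records only a yes/no flag,
// never an identity, keeping the evaluation low-impact and non-intrusive.
_gaq.push(['_setCustomVar', 1, 'visitorType',
           isLoggedIn ? 'loggedIn' : 'anonymous', 2]);
_gaq.push(['_trackPageview']);
```

Reports could then segment visits, page views and time on site by visitorType without collecting any identifying data.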
(Program logic model shown again – see the reconstructed diagram above.)
Data – sources and strategies
• Google Analytics: visitors / unique visitors / page views / time on site
• Rating scales within the topics: “How helpful was this module to you?” (optional; see the sketch after this slide)
• User survey “pop-up”: demographics, brief ratings, comment
• Feedback (ad hoc): users and non-users
• Evaluation register (structured interviews): users and stakeholders
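As an illustration only, here is one way the optional in-topic rating could be wired up. The ModuleRating shape and the '/api/module-rating' endpoint are hypothetical, not the Toolbox’s actual implementation.

```ts
// Hypothetical payload for the optional "How helpful was this module to
// you?" rating; deliberately carries no user identifier.
interface ModuleRating {
  moduleId: string;  // e.g. 'sleep' or 'anger' (illustrative module names)
  score: number;     // 1 (not helpful) .. 5 (very helpful)
  loggedIn: boolean; // yes/no flag only
}

// POSTs the rating to an assumed endpoint; fires only when the user
// actively chooses to rate, so data collection stays opt-in.
function submitRating(rating: ModuleRating): Promise<void> {
  return fetch('/api/module-rating', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(rating),
  }).then(() => undefined);
}

submitRating({ moduleId: 'sleep', score: 4, loggedIn: false });
```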
Benchmarking
• What will it all mean? “2685 people visited module X”… so what?
• User numbers (proportion of a sister site’s traffic)
• Time on site (one open-access depression site: ~20 mins)
• Extent of use (proportion reaching start, middle, end)
• Log-in rates (?1.6% / 35% / 25–90% across comparators)
• Revise and refine benchmarks over time
• Link findings to recommendations
> Early indicators / likely trajectories
> If this, then what? (see the sketch below)
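One way to make “if this, then what?” concrete: a toy helper (an assumption, not from the slides) that reads an observed figure against a benchmark band. The band values below are illustrative placeholders, not published comparators.

```ts
// Illustrative benchmark band; low/high bounds are placeholders.
interface Benchmark { label: string; low: number; high: number }

// Turns a raw number into an interpretation a report can act on.
function interpret(observed: number, b: Benchmark): string {
  if (observed < b.low)  return `${b.label}: below the benchmark band (early warning)`;
  if (observed > b.high) return `${b.label}: above the benchmark band (strong uptake)`;
  return `${b.label}: within the expected band`;
}

// e.g. an observed 4% log-in rate against an assumed 1.6-35% band
console.log(interpret(4, { label: 'log-in rate (%)', low: 1.6, high: 35 }));
```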
Factors shaping the evaluation 2: Policy processes & budget cycle
What info at what time point:
• Reassurance – now > mid-trial report
• Decision making – soon
• Accountability & bigger picture – later
Challenges of providing this as well as an evaluation that is:
• Rigorous for a new concept
• Useful
• Ethical
• Practically possible & not overly demanding on users
Progress
Key questions: reach, acceptability, benefit
What next?
• Monitoring component (usage patterns over time)
• Benchmarks (meaning)
• Feedback (ad hoc complaints and compliments)
• Interviews (user stories)
> A decision-making and meaning-oriented report
Conclusions & our thoughts so far…
1. Evaluation needs to be incorporated in the design of web products. Privacy and ethics are grey areas.
2. Lessons / tactics for a meaningful but low-impact evaluation design. There is a trade-off between the purpose of the site and ease of evaluation.
3. Orienting reporting to stakeholder needs, timing and broader implications; what is reasonable to conclude mid-way?
Questions, comments?