
Agenda
- How Important Are Response Rates?
- What Is Happening With Response Rates?
- Measuring Response Rates
- Does Any Of This Really Matter?

What is the Issue Regarding Response Rates?
- Telephone survey response rates have been declining over the past few decades from a high of 60% in the early years.
- A range of factors is seen to contribute to declining rates:
  - Answering machines, voice mail, call blocking, caller ID, etc.
  - Refusals: time constraints, general cynicism, inconvenience, privacy and confidentiality concerns, etc.
  - Cell-only households are now becoming an issue – up to almost 10% in some areas of the U.S.
- Result of declining response rates?
  - High non-response = risk of lower-quality data
  - Increased cost and time to reach target response rates
  - For some, "response rate" is seen as the only measure of survey "quality"

Response Rate Not the Only Factor in Determining Survey Quality
- Apart from response rates, there are many other factors affecting survey quality:
  - Sampling errors
    - Universe definition
    - Sample design
    - Sample source
  - Non-sampling errors
    - Data collection methods
    - Interviewers, coders, data processing
    - Respondent boredom
    - Analysis

How Important is "Response Rate"?
- Higher response rates are always desirable
- But response rates should be only one consideration when research design and budgetary issues are weighed:
  - Avoid the effects of other sources of error
  - Looking at research objectives, allocate resources where maximum benefit is achieved
  - In many commercial surveys, response rate is not even an issue (primarily quota samples)
- Low response rates need not always be cause for concern
  - Key issue: how survey respondents differ from non-respondents
  - Bias from non-response will only be an issue when responders differ from non-responders

What is Happening to Response Rates?
- The PMRS Response Rate Committee measured refusal rates in 1995, 1999, 2002 and again in 2005. Up until 2002, refusal rates increased and response rates fell.
- When analyzed year by year, the 2002 survey suggested that for one-time studies the rate of refusals was accelerating.

One-time Telephone Studies, Incidence 50% Plus, February 1 – June 30

                   1995    1999    2002    2005
  Refusal Rate      66%     68%     78%      ?
  Response Rate     16%     17%     12%      ?

(Refusal Rate = Refusals / Total Asked; Response Rate = Cooperative Contacts / Total Eligible Numbers)

Average annual increase in refusal rate: 1995–1999, 0.5% per year; 1999–2002, 3.3% per year.

- Data for 2005 are not yet available, so it is not clear whether this trend has continued, although results presented in a few minutes suggest average response rates may be in the 10%–12% range in 2005/2006.
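The average annual increases quoted above follow directly from the refusal-rate row; a minimal arithmetic check in Python, using only the figures from the table:

```python
# Refusal rates (percent) for one-time telephone studies, from the table above.
refusal = {1995: 66, 1999: 68, 2002: 78}

# Average annual increase between measurement years, in percentage points per year.
for start, end in [(1995, 1999), (1999, 2002)]:
    per_year = (refusal[end] - refusal[start]) / (end - start)
    print(f"{start}-{end}: {per_year:.1f} points per year")
# 1995-1999: 0.5 points per year
# 1999-2002: 3.3 points per year
```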

What is Happening to Response Rates? … cont'd
- The longer the interview, the higher the refusal rate. The 2002 data showed this impact very clearly.

Aggregate Refusal Rate (%) by Interview Length (Minutes)

           <10    10–19    20+
  1995      50      59      68
  1999      45      62      63
  2002      65      74      80

Standardized Response Rate Calculation

Why a Standard Method of Measuring Response Rates?
- MRIA has recently adopted a "Standard Method of Measuring Response Rates" as a result of a request from the Federal Government.
- Literature reviews across a range of sources unearthed a myriad of "acceptable" definitions of response rate. The American Association for Public Opinion Research (AAPOR) alone publishes at least six different calculation methods that it deems acceptable under varying circumstances.
- The goal for the Response Rate Committee became one of developing a response rate definition that would let research buyers compare levels of fieldwork effort and productivity across research suppliers. With this goal clearly in mind, the Committee endorsed the response rate calculation method it considered most appropriate for reporting call outcomes at the data collection stage of a telephone survey.
- En route, the Committee consulted with Statistics Canada and with members of AIRMS Quebec. Both groups endorsed the concept.

How Do We Measure Response Rates – MRIA Approved Definition

Empirical Method of Response Rate Calculation (example: every household qualifies)

  Total Numbers Attempted                                4000
  Invalid (NIS, fax/modem, business/non-residential)     1000
  Unresolved (U)                                          900
    - Busy, no answer, answering machine                  900
  In-scope, non-responding (IS)                          1050
    - Language problem                                    100
    - Illness, incapable                                   50
    - Selected respondent not available                   100
    - Household refusal                                   500
    - Respondent refusal                                  250
    - Qualified respondent break-off                       50
  In-scope, responding units (R)                         1050
    - Language disqualify, no one 18+, other disqualify (none in this example, since every household qualifies)
    - Completed interviews                               1050

Response Rate = R / (U + IS + R) = 1050 / (900 + 1050 + 1050) = 35%
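The worked example translates directly into code. A minimal sketch, using only the figures and category labels from the slide:

```python
# MRIA empirical response rate for the worked example above.
total_attempted = 4000
invalid = 1000                            # NIS, fax/modem, business/non-residential

U = 900                                   # unresolved: busy, no answer, answering machine
IS = 100 + 50 + 100 + 500 + 250 + 50      # in-scope, non-responding = 1050
R = 1050                                  # in-scope, responding units (all completes here)

# The valid, in-scope numbers must account for everything except invalid numbers.
assert U + IS + R == total_attempted - invalid   # 3000

response_rate = R / (U + IS + R)
print(f"Response rate = {response_rate:.0%}")    # 35%
```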

High or Low Response Rates – Does it really matter?
Presented to the MRIA Annual Conference, June 2006, by Gary Halpenny and Don Ambrose on behalf of the MRIA Response Rate Committee

High or Low Response Rate – Does It Really Matter?
- Telephone surveys have been under attack recently on the grounds that "results are no longer accurate nor representative"
  - Low response rates are cited as the reason
- However, a growing body of research begs to differ
  - A number of investigative projects in the U.S. have shown that, for most commercial and public opinion applications, a 30% response rate produces essentially the same results as a 50% response rate

High or Low Response Rate – Does It Really Matter? … cont'd
- Some of the research literature:
  - In 1997, two identical surveys, one with a 61% response rate and the other with 36%, produced no meaningful differences
  - This project was replicated in 2003 with 51% and 27% response rates, with similar results
  - Researchers concluded that "carefully conducted polls with relatively low response rates still yield representative samples and accurate data" (Keeter et al., Pew Research)

High or Low Response Rate – Does It Really Matter? … cont'd
- The reality today is that few commercial telephone surveys even approach the 30% level
  - The demand for faster turnaround means most telephone response rates are now in the 10% to 20% range
  - Quick 1- or 2-day polls can yield even lower rates

The Critical Issue!
- Can response rates at these levels still produce accurate and meaningful data?
- Clearly, more research was needed

MRIA's Research Project

The Plan
- In 2005, the MRIA Response Rate Committee sponsored research to investigate whether response rates as low as 10% can still produce reliable and useful data.
- Five Canadian research companies that regularly conduct national omnibus surveys volunteered to combine efforts.

Stage 1
- Using an identical 5-minute question set, each company completed approximately 250 interviews on a single wave of its omnibus in January 2006.
  - 1,238 completed interviews in total
  - 4 days in field
  - 9% aggregate response rate

Stage 2
- Using the same 5-minute question set, each company completed a second sample of approximately 250 interviews over January/February 2006.
  - 1,273 completed interviews in total
  - 4 to 5 weeks in field
  - First refusals recontacted
  - 31% aggregate response rate

Both Samples Were:
- National RDD, age 18+
- Weighted to Census for:
  - Age
  - Gender
  - Province
  - Community size
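The deck does not say how the Census weighting was carried out. One common approach for matching several population margins at once is raking (iterative proportional fitting); the sketch below illustrates that idea under that assumption, with made-up Census targets and only two of the four dimensions (age and gender) for brevity.

```python
import numpy as np
import pandas as pd

# Hypothetical respondent-level data; in practice each completed interview
# would carry age, gender, province and community size.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "age": rng.choice(["18-34", "35-54", "55+"], size=500, p=[0.25, 0.35, 0.40]),
    "gender": rng.choice(["M", "F"], size=500, p=[0.45, 0.55]),
})
df["weight"] = 1.0

# Illustrative Census target proportions (not real figures).
targets = {
    "age": {"18-34": 0.30, "35-54": 0.37, "55+": 0.33},
    "gender": {"M": 0.49, "F": 0.51},
}

# Rake: repeatedly adjust weights so each weighted margin matches its target.
for _ in range(25):
    for var, goal in targets.items():
        current = df.groupby(var)["weight"].sum() / df["weight"].sum()
        factors = {cat: goal[cat] / current[cat] for cat in goal}
        df["weight"] *= df[var].map(factors)

# The weighted margins now match the targets to numerical precision.
print(df.groupby("age")["weight"].sum() / df["weight"].sum())
print(df.groupby("gender")["weight"].sum() / df["weight"].sum())
```

In the actual study the same loop would simply run over all four weighting dimensions listed on the slide.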

Fieldwork Undertaken By:
- Ipsos-Reid
- Maritz
- Opinion Search
- Synovate
- TNS-Canadian Facts

Record of Call Comparison
- The table on the next slide indicates that additional call attempts yield three main benefits:
  - Higher contact ratio (lower proportion of busy/no answer)
  - Completion/refusal ratio increases from .26 to .77
  - Fewer good telephone numbers are required to yield the same number of interviews

Disposition of Last Attempt

                                         9% RR      31% RR
  Valid numbers attempted (100%)        14,832       4,348
    Busy/No Answer (U)                   5,843         780
    Refused (IS)                         4,826       1,647
    Other Non-Responding (IS)            2,820         569
    Cooperative Respondents (R)          1,343       1,352
  Response Rate = R / (U + IS + R)        9.1%       31.1%
    Disqualified                           105          79
    Completed Interviews                 1,238       1,273
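A short sketch that reproduces the two response rates and the completion/refusal ratios cited on the previous slide, using only the counts in the table above:

```python
# Call dispositions for the two samples, copied from the table above.
samples = {
    "9% RR":  {"U": 5843, "refused": 4826, "other_nr": 2820, "R": 1343, "completes": 1238},
    "31% RR": {"U": 780,  "refused": 1647, "other_nr": 569,  "R": 1352, "completes": 1273},
}

for name, s in samples.items():
    IS = s["refused"] + s["other_nr"]          # in-scope, non-responding
    rr = s["R"] / (s["U"] + IS + s["R"])       # MRIA response rate
    ratio = s["completes"] / s["refused"]      # completion/refusal ratio
    print(f"{name}: response rate {rr:.1%}, completion/refusal ratio {ratio:.2f}")
# 9% RR: response rate 9.1%, completion/refusal ratio 0.26
# 31% RR: response rate 31.1%, completion/refusal ratio 0.77
```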

Key Findings

Both Studies Yield Identical Results for:
- Incidence of food items used in past 6 months
- List of items bought in last 12 months
- Appliances in household
- Print media readership – not title specific
- Incidence of travel outside Canada
- Personal access to the internet
- Cell phone ownership and carrier used

Food Items Used in Past 6 Months
- Results identical

                             9% RR   31% RR   Sig. Diff.*
  Eggs                          97       96       N
  Cold Cereals                  86       86       N
  Cheese (not processed)        69       71       N
  Honey                         67       66       N
  Frozen Pizza                  55       55       N

* At the 90% level of confidence
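The slides flag significance at the 90% confidence level but do not say which test was applied; a two-proportion z-test on the two samples (1,238 and 1,273 completes, per the Stage 1 and Stage 2 slides) is one plausible reading. A sketch under that assumption, using SciPy; since the reported percentages are weighted, this unweighted check is only approximate:

```python
from math import sqrt
from scipy.stats import norm

def two_prop_z_pvalue(p1, p2, n1=1238, n2=1273):
    """Two-sided p-value for a difference between two independent proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    if se == 0:
        return 1.0                      # identical proportions, no evidence of a difference
    z = (p1 - p2) / se
    return 2 * norm.sf(abs(z))

# Cheese (not processed): 69% vs. 71% -> not significant at the 90% level.
print(two_prop_z_pvalue(0.69, 0.71) < 0.10)   # False, matching the "N" flag

# Frozen pizza: 55% in both samples -> clearly not significant.
print(two_prop_z_pvalue(0.55, 0.55) < 0.10)   # False
```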

Items Bought in the Last 12 Months
- Same result regardless of whether category incidence is high, medium or low

                                 9% RR   31% RR   Sig. Diff.
  Men's or Women's Clothing         93       93       N
  Sunscreen / Suntan Lotion         54       54       N
  Paint or Stain                    52       51       N
  Camping Equipment                 23       23       N
  Car Polish / Wax                  21       20       N
  Traveler's Cheques                 8        8       N

Appliances in Household
- Similar findings for both commonplace and more esoteric items

                                 9% RR   31% RR   Sig. Diff.
  Microwave oven                    95       95       N
  Automatic Dishwasher              63       61       N
  Gas BBQ                           59       57       N
  Security System                   34       37       N
  Espresso/Cappuccino Maker         14       13       N

Print Media Readership
- Similar estimates of generic print media consumption

                               9% RR   31% RR   Sig. Diff.
  Read a Daily Newspaper
    - Yesterday                   60       60       N
    - Past Week                   84       84       N
  Last Time Read a Magazine
    - Yesterday                   38       40       N
    - Past Week                   72       72       N

Traveled Outside Canada in Past 12 Months
- Parallel results for both business and personal travel behaviour

                  9% RR   31% RR   Sig. Diff.
  For Personal       32       33       N
  For Business        8        9       N

Personal Access to The Internet
- Penetration levels virtually identical

                9% RR   31% RR   Sig. Diff.
  Any Access       76       76       N
  At Home          70       70       N
  At Work          45       46       N

Cell Phones
- No differences in either ownership incidence or carrier share

                         9% RR   31% RR   Sig. Diff.
  Has a Cell Phone          58       58       N
  Cellular Provider*
    Bell                    29       28       N
    Telus                   25       27       N
    Rogers                  25       25       N
    Fido                     6        6       N
    Other                   11       11       N

* Base: Total Cell Phone Owners

Credit Card Ownership and Usage
- Differences are found here:
  - The higher response rate yields a higher incidence of credit card ownership
  - Among card owners, the higher RR yields a higher incidence of owning American Express and a lower incidence of MasterCard
    - Posit that the higher RR captures a more upscale, harder-to-find group of people, but this is not proven in the demographics
    - Equally likely to be a statistical anomaly
  - No differences in card used most often

Credit Cards Owned

                                 9% RR   31% RR   Sig. Diff.
  Has any Credit Cards              78       82      +5
  Specific Cards Owned*
    Visa                            67       70       N
    MasterCard                      52       48      -4
    American Express                13       18      +5
    Diners                           1        1       N
    Any Department Store            45       46       N
    Any Gasoline Company            15       14       N
  Average # of Cards Owned*        2.4      2.5       N

* Base: Total Credit Card Owners

Credit Cards Used Most Often
- Claimed usage level unaffected by higher response rate

Base = Owners of Credit Cards

                                 9% RR   31% RR   Sig. Diff.
  Visa                              49       52       N
  MasterCard                        30       29       N
  American Express                   3        4       N
  Any Department Store Card          3        3       N
  Any Gasoline Company Card          1        1       N

12 Attitudinal Statements Measured
- Mean scores were the same on 11 of the 12 attributes
- The difference on the statement related to shopping was statistically significant but would not have changed the interpretation

Attitudinal Statements
- Mean scores (9% RR / 31% RR; a single figure appears where only one value was reported)

                                                              Mean(s)       Sig. Diff.
  I like to try new and different products                    6.0              N
  I am willing to pay extra to save time                      5.4              N
  I lead a fairly busy social life                            5.9 / 6.0        N
  A person's career should be their 1st priority              4.9              N
  TV is a primary source of entertainment                     5.7              N
  I have more self-confidence than most people my age         6.9              N
  I keep up-to-date with changes in style                     5.4 / 5.3        N
  I am careful of what I eat                                  7.2              N
  I go out with friends a great deal of the time              5.1 / 5.2        N
  To me shopping is a chore rather than a pleasure            6.1 / 5.9      -0.2
  I prefer to postpone a purchase rather than buy on credit   6.6 / 6.7        N
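For the mean scores, a two-sample t-test is one plausible way the 90% significance flag could have been computed; the slides report no standard deviations, so the SD below is purely an assumption made to show the mechanics, not the study's actual figure.

```python
from scipy.stats import ttest_ind_from_stats

# Illustrative only: the shopping statement shows means of 6.1 vs. 5.9; an SD of
# 2.5 is assumed for both samples, with the Stage 1 and Stage 2 sample sizes.
result = ttest_ind_from_stats(mean1=6.1, std1=2.5, nobs1=1238,
                              mean2=5.9, std2=2.5, nobs2=1273,
                              equal_var=False)
print(result.pvalue < 0.10)   # True under the assumed SD, i.e. significant at 90%
```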

Conclusions
- Previous findings are corroborated: "carefully conducted polls with relatively low response rates still yield representative samples and accurate data"
- It is important that all other aspects of good survey design are also present:
  - The set of telephone numbers is a randomly drawn, representative sample of the universe
  - Respondent selection at the household level is as random as possible
  - The data are weighted appropriately

Conclusions … cont'd
- High response rates are still achievable for studies where this is an important design criterion
  - Fast field turnaround and high response rates are incompatible
  - Available time to complete the fieldwork is the main factor
  - More focus on the sample management process is required, e.g. call scheduling, elapsed time between attempts, etc.

Where Next?
- Will repeat this test in January 2007
  - Can the overall findings be replicated?
  - Are the few data differences found real or merely random anomalies?
- Modify the question set somewhat
  - Replace the attitudinal questions with questions related to public policy

Online Research

Online Surveys
- Fastest growing methodology in North America
- Primarily opt-in panels, but also client lists and pop-ups
- Is "Response Rate" a valid term within this environment?
  - None of the standard criteria for true random sampling hold (unless we are doing a random sample of internet panel members)
  - What, then, do we use as measures of field effort and data quality?

Online Surveys … cont'd
- Lots of activity around online standards and response rates
  - ISO standards in the process of development
  - MRIA standards developed
  - The Response Rate Committee is working with internet providers on data quality and measures of "success rate" for online surveys:
    - A. Total invitations (broadcast or pop-ups)
    - B. Undeliverables (nil for pop-ups)
    - C. Net usable invitations (C = A – B)
    - D. Total completes
    - E. Qualified break-offs
    - F. Disqualified
    - G. Not responded
    - H. Quota filled
  - Contact Rate = (D + E + F + H) / C
  - Success Rate = (D + F + H) / C
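A minimal sketch of the two rates defined above, using the slide's letter codes; the counts are hypothetical and only the formulas come from the slide:

```python
# Hypothetical invitation counts for an online panel survey.
a = 10000            # A: total invitations (broadcast or pop-ups)
b = 400              # B: undeliverables (nil for pop-ups)
c = a - b            # C: net usable invitations

d = 1500             # D: total completes
e = 120              # E: qualified break-offs
f = 300              # F: disqualified
h = 180              # H: quota filled
g = c - (d + e + f + h)   # G: not responded (the remainder)

contact_rate = (d + e + f + h) / c
success_rate = (d + f + h) / c
print(f"Contact rate: {contact_rate:.1%}")   # 21.9%
print(f"Success rate: {success_rate:.1%}")   # 20.6%
```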

Conclusions
- Response rates continue to be of concern, and efforts to at least maintain current levels of respondent cooperation are needed
- However, a well-designed and well-managed survey with a lower response rate is unlikely to result in a different management decision than would have been made if the response rate had been higher
  - Cost, time and overall research objectives must all be part of the decision process