Complementing Random-Digit-Dial Telephone Surveys with Other Approaches to Collecting Sensitive Data

Joint Program in Survey Methodology, University of Maryland, College Park, Maryland 20742, USA.
American Journal of Preventive Medicine (Impact Factor: 4.28). 12/2006; 31(5):437-43. DOI: 10.1016/j.amepre.2006.07.023
Source: PubMed

ABSTRACT Surveys of sensitive topics, such as the Injury Control and Risk Survey (ICARIS) or the Behavioral Risk Factor Surveillance System (BRFSS), are often conducted by telephone using random-digit-dial (RDD) sampling methods. Although this method of data collection is relatively quick and inexpensive, it suffers from growing coverage problems and falling response rates. In this paper, several alternative methods of data collection are reviewed, including audio computer-assisted interviews as part of personal visit surveys, mail surveys, web surveys, and interactive voice response surveys. Their strengths and weaknesses regarding coverage, nonresponse, and measurement are presented and compared with those of RDD telephone surveys. The feasibility of several mixed-mode designs is also discussed; none stands out as clearly the right choice for surveys on sensitive issues, which implies an increased need for methodologic research.

Available from: Roger Tourangeau, Aug 13, 2014
    • "As stated earlier, IVR approaches do not utilize live interviewers, who provide barriers to dropping out through psychological factors such as authority (e.g., people usually find it rude to hang up on an interviewer once engaged) and reciprocity (e.g., interviewers can provide additional encouragement and/or feedback to respondents to keep them engaged in the process) (Groves et al., 1992). Respondent fatigue in longer IVR surveys may also increase dropout rates due to the lack of interviewer barriers and/or respondent boredom with an automated system (Galesic et al., 2006). Survey completion patterns observed with IVR typically show an initial dropout during the transition to the automated system followed by continued dropout throughout the survey (compared with only an initial dropout observed with CATI), suggesting respondent fatigue and/or reactions to the IVR modality may result when interviewer barriers are removed (Galesic et al., 2006; Kreuter et al., 2008; Tourangeau et al., 2002). "
    ABSTRACT: Few methods estimate the prevalence of child maltreatment in the general population due to concerns about socially desirable responding and mandated reporting laws. Innovative methods, such as interactive voice response (IVR), may obtain better estimates that address these concerns. This study examined the utility of IVR for surveying child maltreatment behaviors by assessing differences between respondents who did and did not complete a survey using IVR technology. A mixed-mode telephone survey was conducted in English and Spanish in 50 cities in California during 2009. Caregivers (n=3,023) self-reported abusive and neglectful parenting behaviors for a focal child under the age of 13 using computer-assisted telephone interviewing and IVR. We used hierarchical generalized linear models to compare survey completion by caregivers nested within cities for the full sample and for age-specific ranges. Among demographic characteristics, caregivers born in the United States were more likely to complete the survey when controlling for covariates. Parenting stress, provision of physical needs, and provision of supervisory needs were not associated with survey completion in the full multivariate model. For caregivers of children 0-4 years (n=838), those reporting they could often or always hear their child from another room had a higher likelihood of survey completion. The findings suggest IVR could prove useful for future surveys that aim to estimate abusive and/or neglectful parenting behaviors, given the limited bias observed for demographic characteristics and problematic parenting behaviors. Further research should build on this utility to improve prevalence estimation.
    Child Abuse & Neglect (Impact Factor: 2.34). 05/2014; 38(10). DOI: 10.1016/j.chiabu.2014.04.001
    • "But this does not quite answer the question about the actual accuracy of what is provided. Galesic et al. (2006) suggested that the absence of interviewers in web surveys may reduce social desirability bias, but again there is no guarantee that respondents will report accurately, and there is even less of a chance for survey administrators to perform validation and verification. "
    ABSTRACT: Surveys have long been a standard data collection tool. In more recent times, web surveys have become particularly popular given their cost-effectiveness on one hand and the ubiquitous nature of the internet and the World Wide Web on the other. Regardless of the type of survey, a number of issues still challenge survey researchers and practitioners. The accuracy of data collected for socio-demographic and other factual-type research questions is of utmost importance if the researcher is to make any claim about the data collected. Accuracy, as a characteristic of data quality, is perhaps the most important issue of all. This paper reviews the body of literature on data quality and identifies and analyzes studies on the accuracy and reliability of data. It critically examines the most significant published research that addresses the issue of accuracy and reliability of survey data. Specifically, this review addresses and critiques existing research methods, identifies and discusses important limitations, and concludes with a discourse on key questions to be answered and suggestions for future research.
    • "Impersonal surveys, like online or postal modes of questionnaire administration, mitigate these effects. A number of studies, including Galesic et al. (2006), Parks et al. (2006), and Bowling (2005), point out that respondents are more likely to report sensitive information in impersonal settings like self-administered postal or online surveys. Bowling (2005) points out that "self-administration methods increase perceived impersonality and may encourage reporting of some sensitive information (e.g., in interview situations there may be fear of embarrassment with the exposure of weakness, failure or deviancy in the presence of a stranger)." "
    [Show abstract] [Hide abstract]
    ABSTRACT: Online data collection is becoming a popular alternative to traditional survey data collection methods like face-to-face and telephone interviews. Although online surveys have several advantages over other data collection methods, such as lower cost, the use of online survey data collection methods still meets considerable resistance among researchers. To date, online surveys have not been used much to collect data on gambling behavior. We compare the results of a random-digit-dial and an online survey of gambling prevalence that were administered in Alberta in 2008. Estimates of gambling participation rates and attitudes about gambling differ systematically across the two samples; the reported gambling participation rates are uniformly higher, and the attitudes about gambling uniformly more favorable, in the online sample compared to the RDD sample. Correcting for differences in the observable demographic and economic characteristics of the online sample with weights based on propensity score matching had no effect on the difference in estimated gambling participation rates across the two samples, indicating that sampling bias does not explain the differences.
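The propensity-weighting correction described in this abstract can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: with a single binary covariate, propensity-score weights reduce to inverse-odds cell-count ratios, and all data, variable names, and rates below are invented.

```python
from collections import Counter

def propensity_weights(online_x, rdd_x):
    """Inverse-odds weights that reweight the online sample so its
    covariate distribution matches the RDD sample's.  With a single
    discrete covariate x, the weight for an online respondent with
    value x is P(rdd | x) / P(online | x), estimated by cell counts."""
    on = Counter(online_x)
    rd = Counter(rdd_x)
    return [rd[x] / on[x] for x in online_x]

def weighted_mean(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Invented data: x = 1 marks a group over-represented online,
# y = 1 marks reported gambling participation.
online_x = [1] * 60 + [0] * 40
online_y = [1] * 50 + [0] * 10 + [1] * 20 + [0] * 20  # aligned with online_x
rdd_x    = [1] * 30 + [0] * 70

w = propensity_weights(online_x, rdd_x)

# After weighting, the online sample's share of x = 1 matches the RDD share.
weighted_share_x1 = weighted_mean(online_x, w)
print(weighted_share_x1)           # 0.3, same as 30/100 in the RDD sample
print(weighted_mean(online_y, w))  # 0.6, down from the raw online rate of 0.7
```

In the study's finding, this kind of demographic reweighting moved the participation estimate little; in this toy data the weighted rate (0.6) still differs from what an RDD sample with lower per-group rates would yield, mirroring the conclusion that covariate balance alone need not close a mode gap.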