ABSTRACT: In this study, the authors sought to determine the effects of length and clarity on response rates and data quality for two food frequency questionnaires (FFQs): the newly developed 36-page Diet History Questionnaire (DHQ), designed to be cognitively easier for respondents, and a 16-page FFQ developed earlier for the Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial. The PLCO Trial is a 23-year randomized controlled clinical trial begun in 1992. The sample for this substudy, which was conducted from January to April of 1998, consisted of 900 control and 450 screened PLCO participants aged 55-74 years. Controls received either the DHQ or the PLCO FFQ by mail. Screenees, who had previously completed the PLCO FFQ at baseline, were administered the DHQ. Among controls, the response rate for both FFQs was 82%. The average times needed by controls to complete the DHQ and the PLCO FFQ were 68 minutes and 39 minutes, respectively. Percentages of missing or uninterpretable responses were similar between instruments for questions on frequency of intake but were approximately 3 and 9 percentage points lower (p ≤ 0.001) in the DHQ for questions on portion size and use of vitamin/mineral supplements, respectively. Among screenees, response rates for the DHQ and the PLCO FFQ were 84% and 89%, respectively, and analyses of questions on portion size and supplement use showed few differences. These data indicated that the shorter FFQ was not better from the perspective of response rate and data quality, and that clarity and ease of administration may compensate for questionnaire length.
American Journal of Epidemiology 02/2001; 153(4):404-9. · 4.78 Impact Factor
ABSTRACT: In large clinical trials where outcome assessment is possible using questionnaires, it may be more cost-effective to mail them to patients than to conduct interviews in person. However, nonresponse to mailed questionnaires reduces the effective sample size and can introduce bias. We conducted a systematic review and meta-analysis of randomized controlled trials evaluating the effect of questionnaire length on response rates. We searched 14 electronic bibliographic databases and the reference lists of relevant trials, and we contacted the authors of eligible trials to ask about unpublished data. For each trial identified, we used logistic regression to estimate the odds ratio for response per one-page increase in the number of pages included in the questionnaire. We pooled the regression coefficients in a random-effects meta-analysis. Heterogeneity among the coefficients was assessed using a chi-square test at the 5% significance level. We specified a priori that the reduction in the odds of response per one-page increase would be greatest among trials comparing relatively short questionnaires. We used meta-regression to examine the relationships between the regression coefficients, the length of the questionnaires used in each trial, and other study characteristics. A total of 38 randomized controlled trials were identified in which participants were allocated to questionnaires of differing lengths and the number of pages used was known. There was significant heterogeneity between the regression coefficients estimated from each trial. In meta-regression, most of the heterogeneity was explained by variation in the length of the questionnaires used in each trial. Among trials in which the shortest questionnaire was a postcard, the odds of response were more than halved for each additional page used (odds ratio 0.39; 95% CI 0.34 to 0.45). In the remaining trials, pooled effect sizes were much smaller.
In trials of one page compared with either two or three pages, the odds ratio for response per one-page increase was 1.01 (95% CI 0.82 to 1.24). For one page compared with four or more pages, and for two or more pages compared with longer alternatives, the odds ratios per one-page increase were 0.90 (95% CI 0.83 to 0.98) and 0.98 (95% CI 0.96 to 0.99), respectively. There were no statistically significant associations between trial results and other study characteristics. It appears that response can be increased by using a shorter questionnaire. Moderate changes to the length of shorter questionnaires will be more effective than moderate changes to the length of longer questionnaires. If a choice of follow-up questionnaire exists for a clinical trial, the shorter one should be used. If a new follow-up questionnaire is to be designed, it should be made as short as possible without compromising the data collection requirements of the trial.
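The estimation strategy described above (a per-page log odds ratio from each two-arm trial, pooled across trials with a random-effects model) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function names and the example data are invented, and for a trial with exactly two arms the per-page log odds ratio from logistic regression reduces to the arm-to-arm log odds ratio divided by the difference in page counts, which is what is computed here. Pooling uses the standard DerSimonian-Laird random-effects estimator, one common choice; the paper does not specify which estimator was used.

```python
import math

def per_page_log_or(resp_a, n_a, pages_a, resp_b, n_b, pages_b):
    """Log odds ratio for response per one-page increase, from one
    two-arm trial comparing questionnaires of different lengths.
    For a two-arm trial this equals the logistic-regression slope
    of response on page count."""
    odds_a = resp_a / (n_a - resp_a)
    odds_b = resp_b / (n_b - resp_b)
    dp = pages_b - pages_a  # difference in number of pages
    log_or = (math.log(odds_b) - math.log(odds_a)) / dp
    # Woolf variance of the log OR, rescaled to the per-page contrast
    var = (1/resp_a + 1/(n_a - resp_a) + 1/resp_b + 1/(n_b - resp_b)) / dp**2
    return log_or, var

def pool_random_effects(estimates):
    """DerSimonian-Laird random-effects pooling of (log_or, var) pairs.
    Returns the pooled log OR and its standard error."""
    k = len(estimates)
    w = [1/v for _, v in estimates]
    fixed = sum(wi * e for wi, (e, _) in zip(w, estimates)) / sum(w)
    # Cochran's Q and the between-trial variance tau^2
    q = sum(wi * (e - fixed)**2 for wi, (e, _) in zip(w, estimates))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    w_re = [1/(v + tau2) for _, v in estimates]
    pooled = sum(wi * e for wi, (e, _) in zip(w_re, estimates)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, se

# Hypothetical trials: (responders, randomized, pages) per arm
trials = [((410, 500, 1), (380, 500, 4)),
          ((240, 300, 2), (225, 300, 6))]
estimates = [per_page_log_or(*a, *b) for a, b in trials]
pooled, se = pool_random_effects(estimates)
print(f"pooled OR per page: {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96*se):.2f} "
      f"to {math.exp(pooled + 1.96*se):.2f})")
```

Exponentiating the pooled coefficient gives the odds ratio per one-page increase, the summary reported in the abstract.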
ABSTRACT: Web questionnaires are an important tool for future epidemiological research because they allow rapid and cost-efficient collection of self-reported information on risk factors and health outcomes. However, to achieve high response rates it is essential to address factors that cause drop-out and so ensure the validity of future studies. We aim to study how socio-demographic variables, as well as design issues such as the ordering and level of difficulty (easy-to-hard vs. hard-to-easy) of questions in a web questionnaire, affect the probability of drop-out and non-response.
In 2003 we invited 47,859 women participating in an ongoing prospective study to a follow-up using a web-based mode. Two versions of the questionnaire existed, varying in level of difficulty (easy-to-hard vs. hard-to-easy). We report drop-out (the proportion of non-completers) between the groups defined by level of difficulty and estimate adjusted risk differences.
Drop-out differs significantly depending on the order of the questions in the web questionnaire. The socio-demographic pattern among lurkers (participants who enter, start responding to, but do not complete a web questionnaire) differs from that among completers of web questionnaires.
An additional 6 percentage points of completers (persons who initiate and complete the questionnaire) can be gained by considering the ordering of questions. Lurkers, a group uniquely identifiable in web surveys, are potentially easier to persuade to complete an already-started web questionnaire than non-responders are. Lurkers thus constitute a unique opportunity to decrease the drop-out rate and therefore merit future research.
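The comparison underlying the 6-percentage-point figure is a risk difference in drop-out between the two question orderings. The abstract reports adjusted risk differences; as an illustration only, a crude (unadjusted) risk difference with a Wald confidence interval can be computed as below. The function name and the example counts are hypothetical.

```python
import math

def dropout_risk_difference(drop_a, n_a, drop_b, n_b, z=1.96):
    """Crude drop-out risk difference (arm B minus arm A) with a
    Wald 95% confidence interval. drop_* = number of non-completers,
    n_* = number who started the questionnaire in each arm."""
    pa, pb = drop_a / n_a, drop_b / n_b
    rd = pb - pa
    se = math.sqrt(pa * (1 - pa) / n_a + pb * (1 - pb) / n_b)
    return rd, (rd - z * se, rd + z * se)

# Hypothetical counts: easy-to-hard vs. hard-to-easy ordering
rd, (ci_lo, ci_hi) = dropout_risk_difference(100, 1000, 160, 1000)
print(f"risk difference: {rd:.3f} (95% CI {ci_lo:.3f} to {ci_hi:.3f})")
```

An adjusted analysis, as in the paper, would additionally condition on socio-demographic covariates (e.g. via a binomial regression model) rather than compare the raw proportions.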
European Journal of Epidemiology 02/2007; 22(5):293-300. · 5.12 Impact Factor