
Abstract

Relevance & Research Question: Among other factors, general attitudes towards surveys are part of respondents’ motivation for survey participation. These attitudes predict participants’ willingness to perform supportively during (online) surveys (de Leeuw et al., 2017; Jungermann et al., 2019). Therefore, different attempts have been made to measure these general attitudes adequately. Most recently, the Survey Attitude Scale (SAS) proposed by de Leeuw and colleagues (2019) comprises three dimensions: survey enjoyment, survey value, and survey burden. However, other research proposes additional dimensions such as survey reliability, survey privacy, and survey intentions (Gengler et al., 2021; Loosveldt and Storms, 2008). Aiming to validate the SAS, we compare the convergent and discriminant validity of different dimensions of attitudes towards surveys.

Methods & Data: We included different items from the literature in a survey for higher education graduates in 2019 (gross sample n=3,345). Randomly assigned, some respondents answered the items proposed by de Leeuw and colleagues (2019) (SAS); the others answered items capturing the dimensions proposed by Loosveldt and Storms (2008) and Gengler et al. (2021). We apply confirmatory factor analyses (CFA) to replicate the original scales and compare their fit indices and Cronbach’s Alpha for reliability. We then examine convergent and discriminant validity by correlating the different dimensions. Finally, we run exploratory factor analyses (EFA) on the entire item set to assess the potential for a more convincing scale.

Results: First, CFA results indicate replicability of the proposed scales with our sample. However, the SAS model performs less well than the alternative model, yet the SAS dimensions demonstrate higher reliability scores. Secondly, as expected, we find high correlations between items intended to measure similar dimensions, such as survey value and survey reliability or survey burden and survey privacy. Finally, EFA results do not support alternative dimensions in our data that are superior to the existing scales.

Added Value: The recommended scales are promising instruments for measuring attitudes towards surveys. Despite some weaknesses, the SAS instrument is valid and efficient. As our validation study is based on a sample of highly qualified graduates, generalizing our results to the entire population should be done with caution.
Measuring Attitudes towards Surveys:
A Validation Study
Relevance & Motivation
General attitudes towards surveys
• are part of respondents’ motivation for survey participation,
• predict participants’ willingness to perform supportively during (online) surveys (e.g. de Leeuw et al., 2017; Stocké, 2006).
Ulrike Schwabe (schwabe@dzhw.eu) | Thorsten Euler (euler@dzhw.eu) |
Isabelle Fiedler (fiedler@dzhw.eu) | Niklas Jungermann (niklas.jungermann@uni-kassel.de)
References:
De Leeuw, E., Hox, J., Lugtig, P., Scherpenzeel, C. V., Göritz, A., & Bartsch, S. (2010). Measuring and comparing survey attitude among new and repeat respondents cross-culturally. Paper presented at the 63rd Annual Conference of the World Association for Public Opinion Research (WAPOR), Chicago.
De Leeuw, E., Hox, J., & Rosche, B. (2017). Survey attitude, nonresponse and attrition in a probability-based online panel. Paper presented at the International Workshop on Household Survey Nonresponse.
De Leeuw, E., Hox, J., Silber, H., Struminskaya, B., & Vis, C. (2019). Development of an international survey attitude scale: Measurement equivalence, reliability, and predictive validity. Measurement Instruments for the Social Sciences, 1(1), 1-10.
Gengler, J., Tessler, M., Lucas, R., & Forney, J. (2021). ‘Why Do You Ask?’ The Nature and Impacts of Attitudes towards Public Opinion Surveys in the Arab World. British Journal of Political Science, 51(1), 115-136.
Loosveldt, G., & Storms, V. (2008). Measuring public opinions about surveys. International Journal of Public Opinion Research, 20(1), 74-89.
Stocké, V. (2006). Attitudes toward surveys, attitude accessibility and the effect on respondents’ susceptibility to nonresponse. Quality & Quantity, 40(2), 259–288.
Stocké, V., & Langfeldt, B. (2003). Umfrageeinstellung und Umfrageerfahrung: die relative Bedeutung unterschiedlicher Aspekte der Interviewerfahrung für die generalisierte Umfrageeinstellung [Survey attitudes and survey experience: the relative importance of different aspects of interview experience for the generalized attitude towards surveys]. ZUMA Nachrichten, 27(52), 55-90.
Struminskaya, B., Bosnjak, M., de Leeuw, E., Lugtig, P., & Toepoel, V. (2015). GESIS Panel Core Study Module - Panel Survey Participation Evaluation & Mode Preferences. In GESIS (Ed.), GESIS Panel Study Descriptions. Related to ZA5664 and ZA5665. Version 10.0.0, 127–129. Mannheim.
Summary of Results
(1) CFA results indicate replicability of the proposed scales with our sample. However,
the SAS model performs less accurately than the alternative model (ATS), yet two out
of three SAS dimensions demonstrate higher reliability scores (see Tables 1 and 2).
(2) As expected, we find high correlations between items intended to measure similar
dimensions, such as survey value and survey reliability or survey burden and survey
privacy (see Table 3).
(3) EFA results do not support alternative dimensions in our data that are superior to
existing ones (results not reported, but available on request).
Added Value
The recommended scales are promising instruments for measuring attitudes towards
surveys. Despite some weaknesses, the SAS instrument is valid and efficient. As our
validation study is based on a sample of highly qualified graduates, generalizing our
results to the entire population should be done with caution.
Suggested Citation:
Schwabe, U., Euler, T., Fiedler, I., & Jungermann, N. (2023). Measuring Attitudes towards Surveys: A Validation Study. Poster presented at the General Online Research (GOR) Conference, Kassel.
Survey Attitude Scale (SAS): Nine-Item Model
(de Leeuw et al. 2010, 2019; own illustration; unstandardized factor loadings)
The Survey Attitude Scale (SAS)
differentiates between three dimensions
(de Leeuw et al., 2010, 2019; Struminskaya et al., 2015):
i. survey enjoyment (se)
ii. survey value (sv)
iii. survey burden (sb)
Other research proposes additional
dimensions (Gengler et al., 2021;
Loosveldt & Storms, 2008):
i. survey reliability (sr)
ii. survey privacy (sp)
iii. survey intentions (si)
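To keep the two measurement models straight in later analyses, the dimension-to-item structure can be written down explicitly. A minimal Python sketch; the item codes follow the abbreviations above, while the variable names themselves are hypothetical:

```python
# Dimension -> indicator items, following the item codes used on the poster
# (se/sv/sb for the SAS, sr/sp/si for the alternative ATS dimensions).
SAS_DIMENSIONS = {
    "survey_enjoyment": ["se1", "se2", "se3"],
    "survey_value":     ["sv1", "sv2", "sv3"],
    "survey_burden":    ["sb1", "sb2", "sb3"],
}

ATS_DIMENSIONS = {
    "survey_reliability": ["sr1", "sr2", "sr3"],
    "survey_privacy":     ["sp1", "sp2", "sp3"],
    "survey_intentions":  ["si1", "si2", "si3"],
}
```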
Table 1: Survey Attitude Scale (de Leeuw et al., 2010, 2019): item wording and reliability scores (Cronbach's α / McDonald's ω) per dimension. Own calculations.

Survey Enjoyment (α = 0.81, ω = 0.82)
  se1  I enjoy answering questionnaires sent to me by mail or web.
  se2  I enjoy being interviewed for surveys.
  se3  I find surveys generally interesting.

Survey Value (α = 0.85, ω = 0.86)
  sv1  In my opinion, surveys are important for society.
  sv2  I think useful information can be obtained from surveys.
  sv3  In my opinion, participation in surveys is a waste of time.

Survey Burden (α = 0.55, ω = 0.57)
  sb1  I am asked way too often to participate in a survey.
  sb2  I perceive opinion surveys as an intrusion in my privacy.
  sb3  I find it annoying to answer many questions in an interview.

Survey Attitudes (overall scale): α = 0.80, ω = 0.79

[Path diagram: the three latent dimensions Survey Enjoyment (se1-se3), Survey Value (sv1-sv3) and Survey Burden (sb1-sb3), each measured by three items with residual terms ε1-ε9.]
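To illustrate how the replication step can be implemented, here is a minimal CFA sketch using the semopy package. It assumes the nine SAS item responses are available in a DataFrame with hypothetical columns se1-se3, sv1-sv3 and sb1-sb3; it is an illustration under these assumptions, not the authors' original code.

```python
import pandas as pd
from semopy import Model, calc_stats

# Three-factor SAS measurement model in lavaan-style syntax;
# the first loading of each factor is fixed to 1 (unstandardized solution).
SAS_DESC = """
enjoyment =~ se1 + se2 + se3
value =~ sv1 + sv2 + sv3
burden =~ sb1 + sb2 + sb3
"""

df_sas = pd.read_csv("sas_items.csv")   # hypothetical file with the item responses

model = Model(SAS_DESC)
model.fit(df_sas)

print(model.inspect())       # parameter estimates: loadings and (co)variances
print(calc_stats(model).T)   # global fit indices such as chi2, CFI and RMSEA
```

The ATS model below is specified analogously with the sr, sp and si items.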
Attitudes Towards Surveys (ATS): Alternative Dimensions
(Gengler et al., 2021; Loosveldt & Storms, 2008; own illustration; unstandardized factor loadings)

[Path diagram: the three latent dimensions Survey Reliability (sr1-sr3), Survey Privacy (sp1-sp3) and Survey Intentions (si1-si3), each measured by three items with residual terms ε1-ε9.]
Table 2: Attitudes Towards Surveys (ATS): item wording and reliability scores (Cronbach's α / McDonald's ω) per dimension. Own calculations.

Survey Reliability (α = 0.58, ω = 0.58)
  sr1  Participants in surveys do their best to give truthful answers to the questions they are asked. (Gengler et al., 2021)
  sr2  Most institutions that conduct public opinion surveys work hard to make their surveys as accurate as possible. (Gengler et al., 2021)
  sr3  If a survey is well done, it will give very accurate information about the views of the people surveyed. (Gengler et al., 2021)

Survey Privacy (α = 0.79, ω = 0.82)
  sp1  I sometimes hesitate about taking part in a survey because I do not know what will happen with my replies. (Loosveldt & Storms, 2008)
  sp2  Surveys tend to include questions that are too personal. (Loosveldt & Storms, 2008)
  sp3  Surveys often ask something that is no one's business. (Stocké & Langfeldt, 2003; own translation)

Survey Intentions (α = 0.69, ω = 0.70)
  si1  The results of surveys are usually heavily influenced by the personal interests or political preferences of the people conducting the research. (Gengler et al., 2021)
  si2  Surveys are frequently used to manipulate and mislead people. (Gengler et al., 2021)
  si3  I have the impression that surveys are always designed to produce the desired results. (own development, own translation)

Attitudes Towards Surveys (overall scale): α = 0.78, ω = 0.76
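For reference, the reliability coefficients reported in Tables 1 and 2 are the standard ones. For a dimension with k items, item variances $\sigma_i^2$, sum-score variance $\sigma_X^2$, and loadings $\lambda_i$ and unique variances $\theta_i$ from a single-factor model of those items:

```latex
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^2}{\sigma_X^2}\right),
\qquad
\omega = \frac{\left(\sum_{i=1}^{k} \lambda_i\right)^{2}}
              {\left(\sum_{i=1}^{k} \lambda_i\right)^{2} + \sum_{i=1}^{k} \theta_i}
```

Omega relies on the factor model rather than on equally weighted items, which is why the two coefficients can diverge slightly, as for Survey Privacy (0.79 vs. 0.82).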
[Unstandardized factor loadings printed in the two path diagrams (first loading per factor fixed to 1): 1, 0.93, 0.50, 0.43, 1, 0.97, -0.50, 0.57, 1, 1.17, 0.93 and 1, 1.48, 0.87, 1, 1.14, 1.13, 1, 1.13, 1.15.]
Model fit (CFA):
           SAS      ATS
RMSEA      0.072    0.054
SRMR       0.046    0.033
CFI        0.967    0.971
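These global fit indices follow their conventional definitions (standard textbook formulas, not specific to this poster): with model chi-square $\chi^2_M$ and degrees of freedom $df_M$, baseline-model values $\chi^2_B$ and $df_B$, sample size $N$ and $p$ observed items (some variants use $N$ instead of $N-1$ in the RMSEA denominator),

```latex
\mathrm{RMSEA} = \sqrt{\frac{\max(\chi^2_M - df_M,\,0)}{df_M\,(N-1)}},
\qquad
\mathrm{CFI} = 1 - \frac{\max(\chi^2_M - df_M,\,0)}{\max(\chi^2_B - df_B,\; \chi^2_M - df_M,\; 0)},
\qquad
\mathrm{SRMR} = \sqrt{\frac{2}{p(p+1)} \sum_{i \le j}\left(\frac{s_{ij} - \hat{\sigma}_{ij}}{\sqrt{s_{ii}\,s_{jj}}}\right)^{2}}
```

Lower RMSEA and SRMR and higher CFI indicate better fit, which is why the ATS model (RMSEA 0.054, SRMR 0.033, CFI 0.971) is described as fitting somewhat better than the SAS model.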
Methods
Different items from the literature were included in a modularized German survey for
higher education graduates in 2019.
Respondents in the module “survey attitudes” answered
• items capturing survey enjoyment, survey value and survey burden (SAS)
  (n = 1,369; >1% item non-response),
• items capturing survey reliability, survey privacy and survey intentions (ATS)
  (n = 1,370; about 2-3% item non-response).
Validation study (an illustrative implementation is sketched below this list):
1) Replication of the original scale(s) applying confirmatory factor analyses (CFA);
   reliability assessed using Cronbach’s Alpha and McDonald’s Omega.
2) Convergent and discriminant validity examined by correlating the different dimensions.
3) Exploratory factor analysis (EFA) using the entire item set (results not reported).
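A minimal sketch of steps 1) and 3), assuming the item responses sit in a CSV with the hypothetical column names used above and using the pingouin and factor_analyzer packages; an illustration under these assumptions, not the authors' original code.

```python
import pandas as pd
import pingouin as pg
from factor_analyzer import FactorAnalyzer

df = pd.read_csv("attitude_items.csv")          # hypothetical combined item file

# Step 1: reliability per dimension, shown here for SAS survey enjoyment.
enjoyment = df[["se1", "se2", "se3"]].dropna()
alpha, _ = pg.cronbach_alpha(data=enjoyment)    # Cronbach's alpha (with 95% CI)

# McDonald's omega from a single-factor solution of the same three items.
fa1 = FactorAnalyzer(n_factors=1, rotation=None)
fa1.fit(enjoyment)
loadings = fa1.loadings_[:, 0]
uniquenesses = fa1.get_uniquenesses()
omega = loadings.sum() ** 2 / (loadings.sum() ** 2 + uniquenesses.sum())
print(f"survey enjoyment: alpha = {alpha:.2f}, omega = {omega:.2f}")

# Step 3: exploratory factor analysis on the full item set with an oblique
# rotation; the number of factors (here 6) is purely illustrative.
efa = FactorAnalyzer(n_factors=6, rotation="oblimin")
efa.fit(df.dropna())
print(pd.DataFrame(efa.loadings_, index=df.columns).round(2))
```

Step 2, the correlation of dimension scores, is sketched below Table 3.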
Data: DZHW graduate panel 2009 (3rd wave)
• panel of higher education graduates from German HEIs, established in 2009
• module “survey attitudes” measured in 2019
• gross sample n = 3,345 participants (response rate: 66.5%)
• analytical sample for SAS: n = 1,304; for ATS: n = 1,287
Table 3: Pearson correlations between the different dimensions of the SAS and the ATS (all significant at p < 0.001; correlations above 0.7 were set in bold in the original). Own calculations.

                     Enjoyment   Value   Burden   Reliability   Privacy   Intentions
Survey Enjoyment        1.00
Survey Value            0.31     1.00
Survey Burden          -0.54    -0.58     1.00
Survey Reliability      0.39     0.78    -0.58       1.00
Survey Privacy         -0.29    -0.43     0.86      -0.48        1.00
Survey Intentions      -0.29    -0.54     0.72      -0.77        0.71       1.00
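A minimal way to reproduce the structure of Table 3 from raw item responses, again with hypothetical column and file names; negatively worded items are assumed to be reverse-coded beforehand, and the poster does not state whether latent or simple scale-score correlations were used, so the sketch uses mean scores per dimension.

```python
import pandas as pd

# Dimension -> items, using the hypothetical item codes from above.
DIMENSIONS = {
    "Survey Enjoyment":   ["se1", "se2", "se3"],
    "Survey Value":       ["sv1", "sv2", "sv3"],
    "Survey Burden":      ["sb1", "sb2", "sb3"],
    "Survey Reliability": ["sr1", "sr2", "sr3"],
    "Survey Privacy":     ["sp1", "sp2", "sp3"],
    "Survey Intentions":  ["si1", "si2", "si3"],
}

df = pd.read_csv("attitude_items.csv")   # hypothetical combined item file

# Mean score per dimension, then pairwise Pearson correlations.
scores = pd.DataFrame(
    {name: df[items].mean(axis=1) for name, items in DIMENSIONS.items()}
)
print(scores.corr(method="pearson").round(2))
```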