Social Media Ads for Survey Recruitment: Performance, Costs, User Engagement
Jan Karem Höhne
German Centre for Higher Education Research and Science Studies (DZHW)
Leibniz University Hannover
Joshua Claassen
German Centre for Higher Education Research and Science Studies (DZHW)
Leibniz University Hannover
Simon Kühne
Bielefeld University
Zaza Zindel
German Centre for Integration and Migration Research (DeZIM)
Abstract
Survey data collection is shifting towards the digital world, including online surveys
successively replacing other well-established survey data collection methods. However, online
surveys often result in low response rates and face other recruitment limitations, especially
when it comes to rare populations. Thus, survey researchers and practitioners increasingly
consider social media platforms for recruiting participants. Although social media recruitment
provides access to an untapped and diverse participant pool, there are no best practices or
protocols ensuring sound participant recruitment. In this study, we contribute to the current state
of research by investigating the effects of social media ad design on ad performance (e.g.,
survey completions), recruitment costs (e.g., costs per survey completion), and platform user
engagements (e.g., number of comments). This is done in a cross-national research setting using
social media ads that were launched in Germany and the USA in 2023. Our findings show
differences across ad designs and countries. Although costs per survey completion are
substantially higher for the USA, ads with strongly topic-related images result in the lowest
costs per survey completion in Germany and the USA. At the same time, ads with strongly
topic-related images result in the highest number of comments in both countries. Finally, our
study provides novel insights into the design of social media ads for online survey recruitment
in a cross-national research setting and offers valuable information on their effectiveness,
efficiency, and monitoring and moderation demands.
Keywords: Cross-national study, Facebook, Nonprobability samples, Social media, Online
survey recruitment, Digital trace data
This document is a preprint and thus it may differ from the final version.
Introduction and research questions
With the rise of online surveys, the collection of survey data has undergone a major
transformation. Compared to other established survey methods, such as in-person and telephone
interviews, online surveys offer significant advantages in terms of cost- and time-efficiency
(Callegaro et al., 2015; Schober, 2018). However, they also present new challenges, including
low response rates (Daikeler et al., 2020; Lozar Manfreda et al., 2008) and sampling biases
(Revilla & Höhne, 2020). Participant recruitment remains especially difficult when no sampling
frames exist (Schonlau & Couper, 2017), forcing both survey researchers and practitioners to
rely on non-probability samples, such as online access panels or river sampling approaches.
While these methods are much more affordable than probability-based sampling, they still
require substantial financial and time investments prior to fieldwork and data collection. For
instance, this includes finding and contracting an online panel, coordinating questionnaire
programming and testing, setting specific thresholds for participant characteristics, and
transferring fieldwork metrics and survey data.
Social media platforms, such as Facebook, present a promising way for efficiently
recruiting online survey participants (see Zindel (2023) for a detailed review of existing
research). Social media recruitment is based on digital ads – comprising a brief text, an image
or video, and an online survey link – to invite platform users to participate in online surveys.
These ads can be launched at any time and require little lead time (Pötzschke et al., 2023).
When users click on the link in the ad, they are directed to the online survey.
Importantly, ads on these platforms can be targeted to specific participant groups, creating a
form of quota sampling. Thanks to the billions of platform users, social media recruitment offers
an easy and efficient way to access a broad and diverse group of participants, including hard-
to-reach or uncommon populations, such as individuals with specific health conditions or sexual
minorities (Kühne & Zindel, 2020; Pötzschke et al., 2023; Zindel, 2023).
Despite growing interest in social media recruitment, best practices for ad design remain
unclear, and there is a lack of empirically driven fieldwork guidelines or protocols. In particular,
little is known about whether and to what extent ads should include images that are thematically
related to the online survey topic or not (Donzowa et al., under review). This is accompanied
by a lack of studies informing about ad performance (e.g., survey completions) and recruitment
costs (e.g., costs per survey completion). Moreover, existing research has largely ignored user
engagements with social media ads (e.g., number or content of comments). This is surprising
for at least three reasons. First, user engagements are likely to affect ad performance by
encouraging or discouraging other users from clicking on the ad for online survey participation.
Second, user engagements with ads need to be monitored throughout the fieldwork (Kühne &
Zindel, 2020) to, for instance, detect (and delete) hateful comments. Third, the quantity (e.g.,
number of comments) and quality (e.g., disapproval of the survey topic) of user engagements
may serve as a control measure for the recruitment campaign and potentially signal problematic
ads or other fieldwork issues. Finally, only a few studies have examined social media recruitment
across countries (Donzowa et al., under review; Zindel et al., 2023). One example is the cross-
national study by Donzowa et al. (under review) reporting that the design of social media ads
indeed affects data quality in terms of attrition, item-nonresponse, and non-differentiation.
This study addresses the previously described gaps in the survey literature by
investigating social media recruitment in a cross-national context. Using data from two identical
online surveys that were simultaneously conducted through Facebook ads in Germany and the
USA in 2023, we examine 1) ad performance, 2) recruitment costs, and 3) user engagements
with ads across different image designs. The online surveys dealt with immigration and climate
change and were promoted through images with a strong relation, loose relation, or no relation
(neutral design) to these topics. Unlike previous studies, we also analyze social media
engagement data, such as ad-related comments. We address the following three research
questions (RQs):
How do strongly topic-related, loosely topic-related, and neutral images in social media
ads in Germany and the USA differ with respect to …
… ad performance? (RQ1)
… recruitment costs? (RQ2)
… user engagement? (RQ3)
Addressing these research questions provides practical insights into the optimization of
ad design for social media campaigns in the context of participant recruitment. Specifically, our
findings contribute to better understanding campaign effectiveness (performance, RQ1),
campaign efficiency (recruitment costs, RQ2), and fieldwork monitoring and comment
moderation (user engagements, RQ3). By offering novel insights, this study serves as a basis
for future research on social media recruitment. In addition, its cross-national scope
distinguishes it from previous studies, making a valuable contribution to comparative survey
research.
Method
Data collection and participant recruitment
We conducted two self-administered online surveys, recruiting participants through Facebook
ads displayed in users’ newsfeed. The first online survey took place in Germany from June 25,
2023, to July 2, 2023, while the second one, featuring the same questions translated into
English, took place in the USA from June 25, 2023, to July 3, 2023. Both online surveys focused
on immigration and climate change. Each ad consisted of an image, a brief text-based invitation
to participate in the online survey, and a link directing users to the online survey. On the first
online survey page, participants received information about the study procedure, topic, expected
duration of the online survey, and compliance with existing data protection laws and regulations.
Participation was voluntary, and no incentives were provided.
Ad images
We tested three types of Facebook ads, which differed only with respect to the image used:
Images that were strongly related to immigration or climate change (strong topic-relation),
images that were loosely related to immigration or climate change (loose topic-relation), and
images with no relation to immigration or climate change (neutral design). In both countries,
we launched ads featuring six strongly topic-related images, six loosely topic-related images,
and three neutral images. The same images were used in Germany and the USA, except for two
images containing text (German vs. English) and two images featuring flags (EU vs. USA) that
were adapted for each country. Figure 1 presents examples of the ad images for Germany, the
USA, and both countries. The Supplementary Online Material 1 (SOM1) includes an overview
of the remaining images used in this study.
We utilized a three-step approach to select the ad images (see Donzowa et al., under
review). First, we compiled an initial set of 73 images from AdobeStock
(https://stock.adobe.com/) and iStock (https://www.istockphoto.com/). Second, we conducted
a survey among expert researchers in our scientific network, asking them to classify each image
as strongly topic-related, loosely topic-related, or not topic-related. Third, we conducted a social
media campaign pretest using the images in Facebook ads for participant recruitment. To assess
image performance, we calculated the ratio of unique user accounts reached to link clicks for
each ad. Based on these results, we selected the top six strongly topic-related, the top six loosely
topic-related, and the top three neutral images for the final study.
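For illustration, the following Python sketch mimics this selection step: it ranks pretest images within each category by the reported reach-to-click ratio and keeps the best performers. All identifiers and values are hypothetical, and we assume that a lower ratio (fewer accounts reached per link click) indicates better performance; the paper does not spell out the ranking direction.

```python
# Hypothetical sketch of the pretest ranking described above. Field names and
# figures are illustrative placeholders, not the study's actual pretest data.
from collections import defaultdict

pretest_ads = [
    # (image_id, category, unique_accounts_reached, link_clicks)
    ("img_01", "strong", 4210, 61),
    ("img_02", "strong", 3980, 44),
    ("img_03", "loose", 4550, 38),
    ("img_04", "loose", 4105, 52),
    ("img_05", "neutral", 4320, 29),
    ("img_06", "neutral", 3890, 35),
]

SELECT_PER_CATEGORY = {"strong": 6, "loose": 6, "neutral": 3}

by_category = defaultdict(list)
for image_id, category, reached, clicks in pretest_ads:
    ratio = reached / clicks  # accounts reached per link click
    by_category[category].append((ratio, image_id))

for category, scored in by_category.items():
    scored.sort()  # ascending: fewest accounts reached per click first
    top = [image_id for _, image_id in scored[: SELECT_PER_CATEGORY[category]]]
    print(category, top)
```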
Figure 1. Examples of the images used for the Facebook ads
Note. The ads included the following text (USA version): “Your opinion is wanted! Take our short online survey
on the diversity of attitudes and opinions in the USA and be part of a nationwide study!”
Analysis
In a first step, we analyze key ad campaign and online survey metrics: ad performance in terms
of numbers of unique user accounts reached, unique link clicks, survey starts, and survey
completions as well as recruitment costs in terms of total costs, costs per unique user account
reached, unique link click, survey start, and survey completion. Moreover, we look at user
engagements with ads in terms of numbers of likes (e.g., thumbs up), reactions (e.g., emojis),
shares (e.g., with platform friends), saves (e.g., for revisiting at a later point), and comments
(e.g., on the online survey topic). Importantly, for each of these metrics, we report mean values
across neutral, loosely, and strongly topic-related ads.
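As a minimal illustration of this aggregation, the sketch below computes per-design means from hypothetical per-ad campaign records; the field names and values are placeholders, not the study's data, and the derived cost-per-completion definition is one plausible reading of the metric.

```python
# Minimal sketch of the aggregation described above: per-design means of
# performance, cost, and engagement metrics across individual ads.
# All records and values are illustrative placeholders.
from collections import defaultdict
from statistics import mean

ads = [
    {"design": "strong", "reached": 7100, "clicks": 480, "completions": 210, "costs": 60.0, "comments": 35},
    {"design": "strong", "reached": 7000, "clicks": 495, "completions": 215, "costs": 58.0, "comments": 39},
    {"design": "loose", "reached": 7050, "clicks": 420, "completions": 120, "costs": 61.0, "comments": 9},
    {"design": "neutral", "reached": 6900, "clicks": 400, "completions": 150, "costs": 59.0, "comments": 12},
]

metrics = ("reached", "clicks", "completions", "costs", "comments")
grouped = defaultdict(list)
for ad in ads:
    grouped[ad["design"]].append(ad)

for design, group in grouped.items():
    means = {m: mean(ad[m] for ad in group) for m in metrics}
    # One plausible derived metric: mean costs divided by mean completions,
    # which equals total design costs divided by total design completions.
    means["cost_per_completion"] = means["costs"] / means["completions"]
    print(design, {k: round(v, 2) for k, v in means.items()})
```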
In addition, we retrieved the comments from all Facebook ads using the Meta Graph API,
yielding 301 comments from Germany and 578 comments from the USA. To this end, we
authenticated with the API using a personalized access token to retrieve ad content. We then
sent a GET request to the appropriate endpoint to obtain comments associated with the specific
ads. In return, we retrieved structured data in the JSON format. The retrieved comments were
manually coded by a student assistant: 1) “Positive survey reactions” (e.g., “Excellent survey”),
2) “Negative survey reactions” (e.g., “This survey is just a scam”), 3) “Topic reactions” (e.g.,
“Our country is doomed”), 4) “Offensive content” (e.g., “Fuck you liberal assholes”), and 5)
“Unrelated content and nonsense” (e.g., “Uh oh!”). For comments that exclusively contained
images or graphic interchange formats (GIFs) instead of text, we used an additional code: 6)
“Images and GIFs.” Importantly, we utilized an inductive coding approach and developed the
coding scheme based on the data instead of using preconceived codes. If a comment matched
multiple categories, multiple codes were assigned. The second author re-coded a random subset
of 10% (n = 88) of the comments. In case of disagreement, the second author and the student
assistant reached a consensus through discussion, resulting in refined coding instructions. Based
on the refined instructions, the student assistant went through all comments again and assigned
the final codes.
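The retrieval procedure described at the beginning of this paragraph can be sketched as follows. This is a minimal, hypothetical example using the Meta Graph API comments edge; the access token, API version, and post IDs are placeholders, and pagination is followed via the paging links returned in the JSON response.

```python
# Sketch of the comment retrieval described above via the Meta Graph API
# comments edge. Token, API version, and post IDs are placeholders.
import requests

ACCESS_TOKEN = "YOUR_PAGE_ACCESS_TOKEN"  # placeholder personalized token
AD_POST_IDS = ["PAGEID_POSTID1", "PAGEID_POSTID2"]  # placeholder ad post IDs
GRAPH_URL = "https://graph.facebook.com/v18.0"  # version is an assumption


def fetch_comments(post_id):
    """Collect all comments on one ad post, following pagination."""
    comments = []
    url = f"{GRAPH_URL}/{post_id}/comments"
    params = {"access_token": ACCESS_TOKEN, "fields": "id,message,created_time"}
    while url:
        response = requests.get(url, params=params, timeout=30)
        response.raise_for_status()
        payload = response.json()  # structured JSON, as described above
        comments.extend(payload.get("data", []))
        url = payload.get("paging", {}).get("next")  # absolute URL or None
        params = None  # the "next" URL already carries all query parameters
    return comments


all_comments = {post_id: fetch_comments(post_id) for post_id in AD_POST_IDS}
print({post_id: len(c) for post_id, c in all_comments.items()})
```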
Given the comparative nature of the study, we report results descriptively using
frequencies, costs (Euro), and percentages.
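To make the 10% re-coding check concrete, a simple percent-agreement computation between the two coders' code sets could look like the following sketch. The labels are hypothetical; the study itself resolved disagreements through discussion rather than reporting an agreement statistic.

```python
# Illustrative percent agreement for the 10% re-coding step described above.
# Each element is the set of codes one coder assigned to a comment
# (multiple codes per comment are possible); labels are hypothetical.
coder_a = [{"3"}, {"2"}, {"4", "3"}, {"5"}]  # student assistant's codes
coder_b = [{"3"}, {"2"}, {"4"}, {"5"}]       # second author's re-codes

agree = sum(a == b for a, b in zip(coder_a, coder_b))
print(f"Percent agreement: {agree / len(coder_a):.0%}")  # 75% in this example
```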
Results
In Germany, 71,609 user accounts were reached by the online survey ads. Of these, 5,720 users
clicked on the link and visited the first online survey page. In total, 4,170 participants started
the online survey by proceeding to the second page, and 2,495 participants completed the entire
online survey. The overall costs for the Facebook ads in Germany were 890 Euros. In the USA,
in contrast, 149,183 user accounts were reached by the online survey ads. Of these, 10,832 users
clicked on the link and visited the first online survey page. In total, 5,469 participants started
the online survey by proceeding to the second page and 2,520 participants completed the entire
online survey. The overall costs for the Facebook ads in the USA were 3,457 Euros.¹ We provide
further information on the German and USA samples in the Supplementary Online Material 2
(SOM2).
As shown in Table 1, ad performance is overall best for loosely and strongly topic-related
ads. Both ad designs result in the highest number of user accounts reached, link clicks, survey
starts, and survey completions. The latter two metrics are highest for strongly topic-related ads.
These findings hold for both countries, except for the loosely topic-related metrics on survey
starts and completions in Germany. In addition, the number of user accounts reached and link
clicks is substantially higher for the USA than for Germany.
When it comes to recruitment costs, the total costs are similarly high across ad designs,
but they are substantially higher for the USA. In Germany, the costs per link click range from
13 cents (strongly topic-related ads) to 20 cents (neutral ads), whereas in the USA they range
from 26 cents (loosely topic-related ads) to 51 cents (neutral ads).
The costs per survey start and completion are highest for loosely topic-related ads in both
countries. However, in the USA, both metrics are extraordinarily high: 4.50 Euros (survey start)
and 17.37 Euros (survey completion). The reason is that two loosely topic-related ad images
only yielded very few survey starts and completions. When excluding these two ads, the costs
per survey start and completion drop to 53 cents and 1.16 Euros, respectively. Importantly, in
both countries, the costs for survey completions are lowest for strongly topic-related ads.
¹ As we conducted the ad campaigns through a German Meta business account, we paid for all campaigns in Euros, including the ones in the USA.
Table 1. Ad performance, recruitment costs, and user engagements for Germany and the USA

                                              Germany                      USA
Indicators                               Loosely     Strongly      Neutral     Loosely
Performance
  Unique user accounts reached             7,024        7,086        7,332      15,135
  Unique link clicks                         422          489          462         920
  Survey starts                              205          367          292         304
  Survey completions                         125          212          137         139
Costs
  Costs total                                 60           59          232         230
  Costs per unique user account reached     0.01         0.01         0.03        0.02
  Costs per unique link click               0.15         0.13         0.51        0.26
  Costs per survey start                    0.93         0.18         0.80        4.50
  Costs per survey completion               0.46         0.29         1.70       17.37
Engagements
  Number of likes                             18           17           35          53
  Number of reactions                          9           63            7           5
  Number of shares                             2            5            3           6
  Number of saves                              1            1            3           2
  Number of comments                           8           37           13          17
  Positive survey reactions                    0            2            1           4
  Negative survey reactions                   61           35           50          26
  Topic reactions                             21           50           36          63
  Offensive content                            5            7            4           5
  Unrelated content and nonsense               4           11            4           9
  Images and GIFs                             18           11            8           5

Note. Performance metrics are in absolute frequencies, costs are in Euros, and engagements are in absolute frequencies (first five categories) or percentages (last six categories). Importantly, we report mean metrics across neutral, loosely, and strongly topic-related ads. Column labels refer to loosely topic-related ads (Germany), strongly topic-related ads (Germany), neutral ads (USA), and loosely topic-related ads (USA).
User engagements in terms of likes do not vary much across ad designs in Germany. In the
USA, in contrast, the number of likes increases from 35 (neutral ads) to 75 (strongly topic-
related ads). The number of reactions is highest for strongly topic-related ads in both countries.
In addition, the numbers of shares and saves are rather low in both countries, with a peak of 13
shares for strongly topic-related ads in the USA. Finally, the number of comments is highest for
strongly topic-related ads in Germany and the USA. In the USA, however, the number of
comments is about twice as high as in Germany.
While there are almost no positive survey reactions across ad designs in both countries,
there is a high percentage of negative survey reactions. The percentage of topic reactions is
highest for loosely (USA) and strongly topic-related ads (Germany and USA). In contrast, in
Germany, neutral ads result in the highest percentage of unrelated content and nonsense. In both
countries, the percentage of offensive content is highest for strongly topic-related ads, while the
USA shows an overall higher percentage of offensive content. Images and GIFs are most
common for loosely topic-related ads (Germany) and neutral ads (USA), with an overall higher
percentage in Germany.
Discussion and conclusion
This study aimed to assess the performance, recruitment costs, and user engagements of social
media ads used for recruiting online survey participants. To answer our three research questions,
we analyzed data from two Facebook-recruited online surveys on immigration and climate
change that were conducted in Germany and the USA. The ads featured images with a strong
relation, loose relation, or no relation (neutral design) to the online survey topic. Our findings
highlight not only the relevance of image selection for social media ads but also cross-national
recruitment differences.
Our first research question examined ad performance in terms of user accounts reached,
link clicks, survey starts, and survey completions. Social media ads with strongly topic-related
images stand out for their high campaign effectiveness, as they result in the highest numbers of
survey starts and completions. This similarly applies to Germany and the USA. One possible
explanation is that such images are more salient and resonate more effectively with users.
However, this can be problematic if specific subgroups are more likely to resonate with the
ads than others, increasing the self-selection bias inherent to convenience or river
sampling approaches (Lehdonvirta et al., 2021). For example, ads with strongly topic-related
images on immigration may especially attract participants with preconceived and extreme
attitudes towards immigration, potentially leading to sample imbalances that limit substantive
conclusions. Since this claim lacks a solid empirical basis in our study, we recommend that
future research systematically examine sample composition variations across ad designs.
Regarding our second research question on recruitment costs, we analyzed total costs,
costs per user account reached, link click, survey start, and survey completion. Our findings
align with those on ad performance. Strongly topic-related images yielded the most cost-
effective recruitment, producing the lowest cost per survey start and completion in both
Germany and the USA. Users with strongly preconceived or extreme attitudes may be more
likely to take part in online surveys dealing with specific topics. These participants might also
be more willing to complete online surveys, ensuring that their opinions are represented. This
underscores the importance of further research on the association between participant
characteristics and ad design. In particular, this applies to survey motivation and completion
rates.
Our third research question focused on user engagements with the ads including likes,
reactions, shares, saves, and comments. Strongly topic-related images generated the highest
number of reactions and comments in both countries. However, analyzing the content of the
comments retrieved from the corresponding Facebook ads revealed a substantially higher
percentage of negative survey reactions than positive survey reactions. In Germany, this is more
common for neutral and loosely topic-related images, whereas in the USA this is more common
for neutral images. In addition, topic reactions and offensive content are more common for
strongly topic-related images, with a higher level in the USA than in Germany. From our
perspective, these findings are of high value for survey researchers and practitioners
considering online survey recruitment through social media platforms. On the one hand, survey
and topic reactions can inform online survey design, fieldwork strategies, and topic framing,
allowing for continuous improvement. On the other hand, the presence
of offensive content highlights the need for active fieldwork monitoring and comment
moderation. We therefore strongly recommend that researchers allocate sufficient (human)
resources to detect and manage offensive content throughout data collection, particularly when
running large-scale, cross-national ad campaigns. In addition, offensive comments may serve
as early indicators of self-selection biases, suggesting that certain ads disproportionately attract
participants with preconceived or extreme attitudes.
While our study provides valuable insights, it has several limitations that offer directions
for future research. First, we exclusively focused on Facebook recruitment. Other social media
platforms, such as Instagram and TikTok, also offer advanced advertising and targeting tools.
We therefore recommend investigating ad performance, recruitment costs, and user
engagements across different platforms. Second, our study examined rather polarizing topics.
It remains unclear whether and to what extent our findings hold for less polarizing topics.
Investigating topic sensitivity effects on social media recruitment outcomes would be a valuable
next step. Finally, the performance, recruitment costs, and user engagements of ads also depend
on Facebook’s internal algorithms. Although we held all aspects of our ad design
constant, except for the image, it is impossible to fully isolate the effect of the ad images from
Facebook’s internal ranking mechanisms.
By investigating ad performance, recruitment costs, and user engagements in a cross-
national research setting, this study contributes to a better understanding of online surveys
recruited through social media. Importantly, it shows that different ad designs are associated
with similar patterns of effectiveness and efficiency across Germany and the USA. In addition,
we showed that analyzing user comments is a crucial endeavor, as we found evidence of
offensive content in both countries. Even though ads with strongly topic-related images
deliver high performance and low recruitment costs, they come with higher ad
monitoring demands. We urge survey researchers and practitioners to take this trade-off into
account when designing social media ads for online survey recruitment.
References
Callegaro, M., Lozar Manfreda, K., & Vehovar, V. (2015). Web survey methodology. Sage.
https://doi.org/10.4135/9781529799651
Daikeler, J., Bosnjak, M., & Lozar Manfreda, K. (2020). Web versus other survey modes: An
updated and extended meta-analysis comparing response rates. Journal of Survey
Statistics and Methodology, 8, 513–539. https://doi.org/10.1093/jssam/smz008
Donzowa, J., Kühne, S., & Zindel, Z. (under review). From clicks to quality: Assessing
advertisement design’s impact on social media survey response quality. methods, data,
analyses.
Kühne, S., & Zindel, Z. (2020). Using Facebook and Instagram to recruit web survey
participants: A step-by-step guide and application. Survey Methods: Insights from the
Field. https://surveyinsights.org/?p=13558
Lehdonvirta, V., Oksanen, A., Räsänen, P., & Blank, G. (2021). Social media, web, and panel
surveys: Using non-probability samples in social and policy research. Policy and Internet,
13, 134–155. https://doi.org/10.1002/poi3.238
Lozar Manfreda, K., Bosnjak, M., Berzelak, J., Haas, I., & Vehovar, V. (2008). Web surveys
versus other survey modes: A meta-analysis comparing response rates. Journal of the
Market Research Society, 50, 79–104. https://doi.org/10.1177/147078530805000107
Pötzschke, S., Weiß, B., Daikeler, J., Silber, H., & Beuthner, C. (2023). A guideline on how to
recruit respondents for online surveys using Facebook and Instagram: Using hard-to-
reach health workers as an example. GESIS Survey Guidelines.
https://doi.org/10.15465/gesis-sg_en_045
Revilla, M., & Höhne, J. K. (2020). Comparing the participation of Millennials and older age
cohorts in the CROss-National Online Survey panel and the German Internet Panel.
Survey Research Methods, 14, 499–513. https://doi.org/10.18148/srm/2020.v14i5.7619
Schober, M. F. (2018). The future of face-to-face interviewing. Quality Assurance in Education,
26, 290–302. https://doi.org/10.1108/QAE-06-2017-0033
Schonlau, M., & Couper, M. P. (2017). Options for conducting web surveys. Statistical Science,
32, 279–292. https://doi.org/10.1214/16-STS597
Zindel, Z. (2023). Social media recruitment in online survey research: A systematic literature
review. methods, data, analyses, 17, 207–248. https://doi.org/10.12758/MDA.2022.15
Zindel, Z., Kühne, S., Zagheni, E., & Perrotta, D. (2023). Polarizing ads to recruit survey
respondents: A comparative study of Facebook ads and their impact on sample
composition. European Survey Research Association 2023 Conference, July 17-21,
Milan, Italy.
Supplementary Online Material (SOM) 1
Figure SOM1. All remaining images used for the Facebook ads
Note. The ads included the following text: “Your opinion is wanted! Take our short online survey on the diversity
of attitudes and opinions in the USA and be part of a nationwide study!”
Supplementary Online Material 2
Table SOM2. German and USA sample characteristics

                                       Germany      USA
Female                                      42       42
Age                                         59       73
Tertiary education                          52       48
Secondary education                         48       52
Mobile device participation                 90       79
N                                        2,495    2,520
Note. We report percentages for female, tertiary education, secondary education, and mobile device participation,
including smartphones and tablets. For age in years, in contrast, we report means. In both countries, the percentage
for primary education is below 1%.