Social Media Ads for Survey Recruitment: Performance, Costs, User Engagement
Jan Karem Höhne
German Centre for Higher Education Research and Science Studies (DZHW)
Leibniz University Hannover
Joshua Claassen
German Centre for Higher Education Research and Science Studies (DZHW)
Leibniz University Hannover
Simon Kühne
Bielefeld University
Zaza Zindel
German Centre for Integration and Migration Research (DeZIM)
Abstract
Survey data collection is shifting towards the digital world, with online surveys
successively replacing other well-established survey data collection methods. However, online
surveys often result in low response rates and face other recruitment limitations, especially
when it comes to rare populations. Thus, survey researchers and practitioners increasingly
consider social media platforms for recruiting participants. Although social media recruitment
provides access to an untapped and diverse participant pool, there are no best practices or
protocols ensuring sound participant recruitment. In this study, we contribute to the current state
of research by investigating the effects of social media ad design on ad performance (e.g.,
survey completions), recruitment costs (e.g., costs per survey completion), and platform user
engagements (e.g., number of comments). This is done in a cross-national research setting using
social media ads that were launched in Germany and the USA in 2023. Our findings show
differences across ad designs and countries. Although costs per survey completion are
substantially higher for the USA, ads with strongly topic-related images result in the lowest
costs per survey completion in Germany and the USA. At the same time, ads with strongly
topic-related images result in the highest number of comments in both countries. Finally, our
study provides novel insights into the design of social media ads for online survey recruitment
in a cross-national research setting and offers valuable information on their effectiveness,
efficiency, and monitoring and moderation demands.
Keywords: Cross-national study, Facebook, Nonprobability samples, Social media, Online
survey recruitment, Digital trace data
This document is a preprint and thus it may differ from the final version.
Introduction and research questions
With the rise of online surveys, the collection of survey data has undergone a major
transformation. Compared to other established survey methods, such as in-person and telephone
interviews, online surveys offer significant advantages in terms of cost- and time-efficiency
(Callegaro et al., 2015; Schober, 2018). However, they also present new challenges, including
low response rates (Daikeler et al., 2020; Lozar Manfreda et al., 2008) and sampling biases
(Revilla & Höhne, 2020). Participant recruitment remains especially difficult when no sampling
frames exist (Schonlau & Couper, 2017), forcing both survey researchers and practitioners to
rely on non-probability samples, such as online access panels or river sampling approaches.
While these methods are much more affordable than probability-based sampling, they still
require substantial financial and time investments prior to fieldwork and data collection. For
instance, this includes finding and hiring an online panel, coordinating questionnaire
programming and testing, setting thresholds for participant characteristics, and transferring
fieldwork metrics and survey data.
Social media platforms, such as Facebook, present a promising way for efficiently
recruiting online survey participants (see Zindel (2023) for a detailed review of existing
research). Social media recruitment relies on digital ads – comprising a brief text, an image
or video, and an online survey link – that invite platform users to participate in online surveys.
These ads can be launched at any time with little lead time (Pötzschke et al., 2023).
When clicking on the link in the ad, users are directed to the online survey for participation.
Importantly, ads on these platforms can be targeted to specific participant groups, creating a
form of quota sampling. Thanks to the billions of platform users, social media recruitment offers
an easy and efficient way to access a broad and diverse group of participants, including hard-
to-reach or uncommon populations, such as individuals with specific health conditions or sexual
minorities (Kühne & Zindel, 2020; Pötzschke et al., 2023; Zindel, 2023).
Despite growing interest in social media recruitment, best practices for ad design remain
unclear, and there is a lack of empirically driven fieldwork guidelines or protocols. In particular,
little is known about whether and to what extent ads should include images that are thematically
related to the online survey topic or not (Donzowa et al., under review). This is accompanied
by a lack of studies informing about ad performance (e.g., survey completions) and recruitment
costs (e.g., costs per survey completion). Moreover, existing research has largely ignored user
engagements with social media ads (e.g., number or content of comments). This is surprising
for at least three reasons. First, user engagements are likely to affect ad performance, as they
may encourage other users to click on the ad for online survey participation or discourage them from doing so.
Second, user engagements with ads need to be monitored throughout the fieldwork (Kühne &
Zindel, 2020) to, for instance, detect (and delete) hateful comments. Third, the quantity (e.g.,
number of comments) and quality (e.g., disapproval of the survey topic) of user engagements
may serve as a control measure for the recruitment campaign and potentially signal problematic
ads or other fieldwork issues. Finally, only a few studies have examined social media recruitment
across countries (Donzowa et al., under review; Zindel et al., 2023). One example is the cross-
national study by Donzowa et al. (under review), which reports that the design of social media ads
indeed affects data quality in terms of attrition, item nonresponse, and non-differentiation.
This study addresses the previously described gaps in the survey literature by
investigating social media recruitment in a cross-national context. Using data from two identical
online surveys that were simultaneously conducted through Facebook ads in Germany and the
USA in 2023, we examine 1) ad performance, 2) recruitment costs, and 3) user engagements
with ads across different image designs. The online surveys dealt with immigration and climate
change and were promoted through images with a strong relation, loose relation, or no relation
(neutral design) to these topics. Unlike previous studies, we also analyze social media
engagement data, such as ad-related comments. We address the following three research
questions (RQs):
How do strongly topic-related, loosely topic-related, and neutral images in social media
ads in Germany and the USA differ with respect to …
… ad performance? (RQ1)
… recruitment costs? (RQ2)
… user engagement? (RQ3)
Addressing these research questions provides practical insights into the optimization of
ad design for social media campaigns in the context of participant recruitment. Specifically, our
findings contribute to better understanding campaign effectiveness (performance, RQ1),
campaign efficiency (recruitment costs, RQ2), and fieldwork monitoring and comment
moderation (user engagements, RQ3). By offering novel insights, this study serves as a basis
for future research on social media recruitment. In addition, its cross-national scope
distinguishes it from previous studies, making a valuable contribution to comparative survey
research.
Method
Data collection and participant recruitment
We conducted two self-administered online surveys, recruiting participants through Facebook
ads displayed in users’ newsfeeds. The first online survey took place in Germany from June 25,
2023, to July 2, 2023, while the second one, featuring the same questions translated into
English, took place in the USA from June 25, 2023, to July 3, 2023. Both online surveys focused
on immigration and climate change. Each ad consisted of an image, a brief text-based invitation
to participate in the online survey, and a link directing users to the online survey. On the first
online survey page, participants received information about the study procedure, topic, expected
duration of the online survey, and adherence to existing data protection laws and regulations.
Participation was voluntary, and no incentives were provided.
Ad images
We tested three types of Facebook ads, which differed only with respect to the image used:
Images that were strongly related to immigration or climate change (strong topic-relation),
images that were loosely related to immigration or climate change (loose topic-relation), and
images with no relation to immigration or climate change (neutral design). In both countries,
we launched ads featuring six strongly topic-related images, six loosely topic-related images,
and three neutral images. The same images were used in Germany and the USA, except for two
images containing text (German vs. English) and two images featuring flags (EU vs. USA) that
were adapted for each country. Figure 1 presents examples of the ad images for Germany, the
USA, and both countries. The Supplementary Online Material 1 (SOM1) includes an overview
of the remaining images used in this study.
We utilized a three-step approach to select the ad images (see Donzowa et al., under
review). First, we compiled an initial set of 73 images from AdobeStock
(https://stock.adobe.com/) and iStock (https://www.istockphoto.com/). Second, we conducted
a survey among expert researchers in our scientific network, asking them to classify each image
as strongly topic-related, loosely topic-related, or not topic-related. Third, we conducted a social
media campaign pretest using the images in Facebook ads for participant recruitment. To assess
image performance, we calculated the ratio of unique user accounts reached to link clicks for
each ad. Based on these results, we selected the top six strongly topic-related, the top six loosely
topic-related, and the top three neutral images for the final study.
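For illustration, the reach-to-click ratio underlying this selection can be computed directly from exported ad metrics. The following Python sketch is not the original analysis code; the records, values, and field names are hypothetical stand-ins for the metrics exported from Meta Ads Manager.

# Illustrative sketch of the pretest image ranking; records and field names
# are hypothetical, not the pretest data used in this study.
pretest_ads = [
    {"image_id": "img_01", "relation": "strong", "reach": 4100, "link_clicks": 61},
    {"image_id": "img_02", "relation": "loose", "reach": 3900, "link_clicks": 44},
    {"image_id": "img_03", "relation": "neutral", "reach": 4500, "link_clicks": 38},
    # ... one record per pretested image
]

# Ratio of unique user accounts reached to link clicks; a lower value means
# fewer accounts had to be reached to generate one link click.
for ad in pretest_ads:
    ad["reach_per_click"] = ad["reach"] / ad["link_clicks"]

# Keep the best-performing images per category (6 strong, 6 loose, 3 neutral).
quotas = {"strong": 6, "loose": 6, "neutral": 3}
selected = {
    relation: sorted(
        (ad for ad in pretest_ads if ad["relation"] == relation),
        key=lambda ad: ad["reach_per_click"],
    )[:n]
    for relation, n in quotas.items()
}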
Figure 1. Examples of the images used for the Facebook ads
Note. The ads included the following text (USA version): “Your opinion is wanted! Take our short online survey
on the diversity of attitudes and opinions in the USA and be part of a nationwide study!”
Analysis
In a first step, we analyze key ad campaign and online survey metrics: ad performance in terms
of numbers of unique user accounts reached, unique link clicks, survey starts, and survey
completions, as well as recruitment costs in terms of total costs and costs per unique user account
reached, unique link click, survey start, and survey completion. Moreover, we look at user
engagements with ads in terms of the number of likes (e.g., thumbs up), reactions (e.g., emojis),
shares (e.g., with platform friends), saves (e.g., for revisiting at a later point), and comments
(e.g., on the online survey topic). Importantly, for each of these metrics, we report mean values
across neutral, loosely, and strongly topic-related ads.
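For transparency, the aggregation behind these metrics can be expressed in a few lines of code. The Python sketch below is not the original analysis code; the records and field names are illustrative, and averaging per-ad values within each design is one plausible implementation of the averaging described above.

# Illustrative sketch of the metric aggregation; one hypothetical record per ad,
# combining Meta Ads Manager exports with matched survey counts.
from collections import defaultdict
from statistics import mean

ads = [
    {"design": "strong", "spend_eur": 19.5, "reach": 2400, "link_clicks": 160,
     "survey_starts": 120, "survey_completions": 70},
    {"design": "neutral", "spend_eur": 19.7, "reach": 1900, "link_clicks": 100,
     "survey_starts": 82, "survey_completions": 53},
    # ... one record per launched ad and country
]

by_design = defaultdict(list)
for ad in ads:
    by_design[ad["design"]].append(ad)

# Mean of the per-ad values within each ad design.
for design, group in by_design.items():
    print(design,
          "mean completions:", round(mean(a["survey_completions"] for a in group), 1),
          "mean cost per completion:",
          round(mean(a["spend_eur"] / a["survey_completions"] for a in group), 2))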
In addition, we retrieved the comments from all Facebook ads using the Meta Graph API,
yielding 301 comments from Germany and 578 comments from the USA. To this end, we
authenticated with the API using a personalized access token to retrieve ad content. We then
sent a GET request to the appropriate endpoint to obtain comments associated with the specific
ads. In return, we retrieved structured data in the JSON format. The retrieved comments were
manually coded by a student assistant: 1) “Positive survey reactions” (e.g., “Excellent survey”),
2) “Negative survey reactions” (e.g., “This survey is just a scam”), 3) “Topic reactions” (e.g.,
“Our country is doomed”), 4) “Offensive content” (e.g., “Fuck you liberal assholes”), and 5)
“Unrelated content and nonsense” (e.g., “Uh oh!”). For comments that exclusively contained
images or graphic interchange formats (GIFs) instead of text, we used an additional code: 6)
“Images and GIFs.” Importantly, we utilized an inductive coding approach and developed the
coding scheme based on the data instead of using preconceived codes. If a comment matched
multiple categories, multiple codes were assigned. The second author re-coded a random subset
of 10% (n = 88) of the comments. In case of disagreement, the second author and the student
assistant reached a consensus through discussion, resulting in refined coding instructions. Based
on the refined instructions, the student assistant went through all comments again and assigned
the final codes.
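For readers who want to replicate the retrieval step described above, a minimal sketch is provided below. It assumes that the IDs of the posts underlying the ads and a valid access token are available; the IDs, token handling, and API version are placeholders rather than the values used in this study.

# Minimal sketch of the comment retrieval via the Meta Graph API; IDs, token,
# and API version are placeholders, not the values used in this study.
import requests

GRAPH_URL = "https://graph.facebook.com/v17.0"
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"                     # personalized access token
AD_POST_IDS = ["PAGEID_POSTID_1", "PAGEID_POSTID_2"]   # posts underlying the ads

def fetch_comments(post_id):
    """Return all comments on one ad post, following the JSON pagination links."""
    comments, url = [], f"{GRAPH_URL}/{post_id}/comments"
    params = {"access_token": ACCESS_TOKEN, "fields": "id,message,created_time"}
    while url:
        response = requests.get(url, params=params, timeout=30)
        response.raise_for_status()
        payload = response.json()                       # structured JSON response
        comments.extend(payload.get("data", []))
        url = payload.get("paging", {}).get("next")     # next page, if any
        params = None  # the 'next' URL already contains all query parameters
    return comments

all_comments = {post_id: fetch_comments(post_id) for post_id in AD_POST_IDS}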
Given the comparative nature of the study, we report results descriptively using
frequencies, costs (in Euros), and percentages.
Results
In Germany, 71,609 user accounts were reached by the online survey ads. Of these, 5,720 users
clicked on the link and visited the first online survey page. In total, 4,170 participants started
the online survey by proceeding to the second page, and 2,495 participants completed the entire
online survey. The overall costs for the Facebook ads in Germany were 890 Euros. In the USA,
in contrast, 149,183 user accounts were reached by the online survey ads. Of these, 10,832 users
clicked on the link and visited the first online survey page. In total, 5,469 participants started
the online survey by proceeding to the second page and 2,520 participants completed the entire
online survey. The overall costs for the Facebook ads in the USA were 3,457 Euros.¹ We provide
further information on the German and USA samples in the Supplementary Online Material 2
(SOM2).
As shown in Table 1, ad performance is overall best for loosely and strongly topic-related
ads. Both ad designs result in the highest number of user accounts reached, link clicks, survey
starts, and survey completions. The latter two metrics are highest for strongly topic-related ads.
These findings hold for both countries, except for the loosely topic-related metrics on survey
starts and completions in Germany. In addition, the numbers of user accounts reached and link
clicks are substantially higher for the USA than for Germany.
When it comes to recruitment costs, the total costs are similar across ad designs
but substantially higher for the USA. In Germany, the costs per link click range from
13 cents (strongly topic-related ads) to 20 cents (neutral ads), whereas in the USA they range
from 26 cents (loosely topic-related ads) to 51 cents (neutral ads).
The costs per survey start and completion are highest for loosely topic-related ads in both
countries. In the USA, however, both metrics are extraordinarily high: 4.50 Euros (survey start)
and 17.37 Euros (survey completion), because two loosely topic-related ad images
achieved only very few survey starts and completions. When excluding these two ads, the costs
per survey start and completion drop to 53 cents and 1.16 Euros, respectively. Importantly, in
both countries, the costs for survey completions are lowest for strongly topic-related ads.
¹ As we conducted the ad campaigns through a German Meta business account, we paid for all campaigns in
Euros, including the ones in the USA.
Table 1. Ad performance, recruitment costs, and user engagements for Germany and the USA

Indicators | Germany, neutral ads | Germany, loosely topic-related ads | Germany, strongly topic-related ads | USA, neutral ads | USA, loosely topic-related ads | USA, strongly topic-related ads

Performance
Unique user accounts reached | 5,663 | 7,024 | 7,086 | 7,332 | 15,135 | 12,881
Unique link clicks | 298 | 422 | 489 | 462 | 920 | 870
Survey starts | 246 | 205 | 367 | 292 | 304 | 462
Survey completions | 158 | 125 | 212 | 137 | 139 | 213

Costs
Costs total | 59 | 60 | 59 | 232 | 230 | 230
Costs per unique user account reached | 0.01 | 0.01 | 0.01 | 0.03 | 0.02 | 0.02
Costs per unique link click | 0.20 | 0.15 | 0.13 | 0.51 | 0.26 | 0.28
Costs per survey start | 0.24 | 0.93 | 0.18 | 0.80 | 4.50 | 0.53
Costs per survey completion | 0.38 | 0.46 | 0.29 | 1.70 | 17.37 | 1.15

Engagements
Number of likes | 14 | 18 | 17 | 35 | 53 | 75
Number of reactions | 16 | 9 | 63 | 7 | 5 | 37
Number of shares | 3 | 2 | 5 | 3 | 6 | 13
Number of saves | 0 | 1 | 1 | 3 | 2 | 2
Number of comments | 11 | 8 | 37 | 13 | 17 | 73
Positive survey reactions | 0 | 0 | 2 | 1 | 4 | 2
Negative survey reactions | 55 | 61 | 35 | 50 | 26 | 25
Topic reactions | 21 | 21 | 50 | 36 | 63 | 67
Offensive content | 3 | 5 | 7 | 4 | 5 | 19
Unrelated content and nonsense | 30 | 4 | 11 | 4 | 9 | 10
Images and GIFs | 2 | 18 | 11 | 8 | 5 | 1

Note. Performance metrics are in absolute frequencies, costs are in Euros, and engagements are in absolute frequencies (first five categories) or percentages (last six categories). Importantly, we report mean metrics across neutral, loosely, and strongly topic-related ads.
User engagements in terms of likes do not vary much across ad designs in Germany. In the
USA, in contrast, the number of likes increases from 35 (neutral ads) to 75 (strongly topic-
related ads). The number of reactions is highest for strongly topic-related ads in both countries.
In addition, the numbers of shares and saves are rather low in both countries, with a peak of 13
shares for strongly topic-related ads in the USA. Finally, the number of comments is highest for
strongly topic-related ads in Germany and the USA. In the USA, however, the number of
comments is about twice as high as in Germany.
While there are almost no positive survey reactions across ad designs in both countries,
there is a high percentage of negative survey reactions. The percentage of topic reactions is
highest for loosely (USA) and strongly topic-related ads (Germany and USA). In contrast, in
Germany, neutral ads result in the highest percentage of unrelated content and nonsense. In both
countries, the percentage of offensive content is highest for strongly topic-related ads, while the
USA shows an overall higher percentage of offensive content. Images and GIFs are most
common for loosely topic-related ads (Germany) and neutral ads (USA), with an overall higher
percentage in Germany.
Discussion and conclusion
This study aimed to assess the performance, recruitment costs, and user engagements of social
media ads used for recruiting online survey participants. To answer our three research questions,
we analyzed data from two Facebook-recruited online surveys on immigration and climate
change that were conducted in Germany and the USA. The ads featured images with a strong
relation, loose relation, or no relation (neutral design) to the online survey topic. Our findings
highlight not only the relevance of image selection for social media ads but also cross-national
recruitment differences.
Our first research question examined ad performance in terms of user accounts reached,
link clicks, survey starts, and survey completions. Social media ads with strongly topic-related
images stand out through their high campaign effectiveness, as they result in the highest number of
survey starts and completions. This applies to Germany and the USA alike. One possible
explanation is that such images are more salient and resonate more effectively with users.
However, this can be problematic if specific subgroups are more likely than others to resonate
with the ads, increasing the self-selection bias inherent in convenience or river
sampling approaches (Lehdonvirta et al., 2021). For example, ads with strongly topic-related
images on immigration may especially attract participants with preconceived and extreme
attitudes towards immigration, potentially leading to sample imbalances that limit substantive
conclusions. Since this claim lacks a solid empirical basis in our study, we recommend that future
research systematically examine variations in sample composition across ad designs.
Regarding our second research question on recruitment costs, we analyzed total costs,
costs per user account reached, link click, survey start, and survey completion. Our findings
align with those on ad performance. Strongly topic-related images yielded the most cost-
effective recruitment, producing the lowest cost per survey start and completion in both
Germany and the USA. Users with strongly preconceived or extreme attitudes may be more
likely to take part in online surveys dealing with specific topics. These participants might also
be more willing to complete online surveys, ensuring that their opinions are represented. This
underscores the importance of further research on the association between participant
characteristics and ad design. In particular, this applies to survey motivation and completion
rates.
Our third research question focused on user engagements with the ads, including likes,
reactions, shares, saves, and comments. Strongly topic-related images generated the highest
number of reactions and comments in both countries. However, analyzing the content of the
comments retrieved from the corresponding Facebook ads revealed a substantially higher
percentage of negative survey reactions than positive survey reactions. In Germany, this is more
common for neutral and loosely topic-related images, whereas in the USA this is more common
for neutral images. In addition, topic reactions and offensive content are more common for
strongly topic-related images, with a higher level in the USA than in Germany. From our
perspective, these findings are of high value for survey researchers and practitioners
considering online survey recruitment through social media platforms. On the one hand, survey
and topic reactions can inform online survey design, fieldwork strategies, and topic framing,
allowing for their continuous improvement. On the other hand, the presence
of offensive content highlights the need for active fieldwork monitoring and comment
moderation. We therefore strongly recommend that researchers allocate sufficient (human)
resources to detect and manage offensive content throughout data collection, particularly when
running large-scale, cross-national ad campaigns. In addition, offensive comments may serve
as early indicators of self-selection biases, suggesting that certain ads disproportionately attract
participants with preconceived or extreme attitudes.
While our study provides valuable insights, it has several limitations that offer directions
for future research. First, we exclusively focused on Facebook recruitment. Other social media
platforms, such as Instagram and TikTok, also offer advanced advertising and targeting tools.
We therefore recommend investigating ad performance, recruitment costs, and user
engagements across different platforms. Second, our study examined rather polarizing topics.
It remains unclear whether and to what extent our findings hold for less polarizing topics.
Investigating topic sensitivity effects on social media recruitment outcomes would be a valuable
next step. Finally, the performance, recruitment costs, and user engagements of ads also
depend on Facebook’s internal algorithms. Although we held all aspects of our ad design
constant, except for the image, it is impossible to fully isolate the effect of the ad images from
Facebook’s internal ranking mechanisms.
By investigating ad performance, recruitment costs, and user engagements in a cross-
national research setting, this study contributes to a better understanding of online surveys
recruited through social media. Importantly, it shows that the relative effectiveness and
efficiency of the different ad designs are similar across Germany and the USA. In addition, we
showed that analyzing user comments is a crucial endeavor, as we found evidence
of offensive content in both countries. Even though ads with strongly topic-related images
deliver high performance and low recruitment costs, they come with higher ad
monitoring demands. We urge survey researchers and practitioners to take this trade-off into
account when designing social media ads for online survey recruitment.
References
Callegaro, M., Lozar Manfreda, K., & Vehovar, V. (2015). Web survey methodology. Sage.
https://doi.org/10.4135/9781529799651
Daikeler, J., Bosnjak, M., & Lozar Manfreda, K. (2020). Web versus other survey modes: An
updated and extended meta-analysis comparing response rates. Journal of Survey
Statistics and Methodology, 8, 513–539. https://doi.org/10.1093/jssam/smz008
Donzowa, J., Kühne, S., & Zindel, Z. (under review). From clicks to quality: Assessing
advertisement design’s impact on social media survey response quality. methods, data,
analyses.
Kühne, S. & Zindel, Z. (2020). Using Facebook and Instagram to recruit web survey
participants: a step-by-step guide and application. Survey Methods: Insights from the
Field. https://surveyinsights.org/?p=13558
Lehdonvirta, V., Oksanen, A., Räsänen, P., & Blank, G. (2021). Social media, web, and panel
surveys: Using non-probability samples in social and policy research. Policy and Internet,
13, 134–155. https://doi.org/10.1002/poi3.238
Lozar Manfreda, K., Bosnjak, M., Berzelak, J., Haas, I., & Vehovar, V. (2008). Web surveys
versus other survey modes: A meta-analysis comparing response rates. Journal of the
Market Research Society, 50, 79–104. https://doi.org/10.1177/147078530805000107
Pötzschke, S., Weiß, B., Daikeler, J., Silber, H., & Beuthner, C. (2023). A guideline on how to
recruit respondents for online surveys using Facebook and Instagram: Using hard-to-
reach health workers as an example. GESIS – Survey Guidelines.
https://doi.org/10.15465/gesis-sg_en_045
Revilla, M., & Höhne, J. K. (2020). Comparing the participation of Millennials and older age
cohorts in the CROss-National Online Survey panel and the German Internet Panel.
Survey Research Methods, 14, 499–513. https://doi.org/10.18148/srm/2020.v14i5.7619
Schober, M. F. (2018). The future of face-to-face interviewing. Quality Assurance in Education,
26, 290–302. https://doi.org/10.1108/QAE-06-2017-0033
Schonlau, M., & Couper, M. P. (2017). Options for conducting web surveys. Statistical Science,
32, 279–292. https://doi.org/10.1214/16-STS597
Zindel, Z. (2023). Social media recruitment in online survey research: A systematic literature
review. methods, data, analyses, 17, 207–248. https://doi.org/10.12758/MDA.2022.15
Zindel, Z., Kühne, S., Zagheni, E., & Perrotta, D. (2023). Polarizing ads to recruit survey
respondents: A comparative study of Facebook ads and their impact on sample
composition. European Survey Research Association 2023 Conference, July 17-21,
Milan, Italy.
Supplementary Online Material (SOM) 1
Figure SOM1. All remaining images used for the Facebook ads
Note. The ads included the following text: “Your opinion is wanted! Take our short online survey on the diversity
of attitudes and opinions in the USA and be part of a nationwide study!”
Supplementary Online Material 2
Table SOM2. German and USA sample characteristics

Characteristic | Germany | USA
Female | 42 | 42
Age | 59 | 73
Tertiary education | 52 | 48
Secondary education | 48 | 52
Mobile device participation | 90 | 79
N | 2,495 | 2,520
Note. We report percentages for female, tertiary education, secondary education, and mobile device participation,
including smartphones and tablets. For age in years, in contrast, we report means. In both countries, the percentage
for primary education is below 1%.