Survey Research Methods (2019), Vol. 13, No. 3, pp. 289-304
doi:10.18148/srm/2019.v13i3.7391
© European Survey Research Association
ISSN 1864-3361
http://www.surveymethods.org
Does Benefit Framing Improve Record Linkage Consent Rates?
A Survey Experiment
Joseph W. Sakshaug
Institute for Employment Research (IAB),
Ludwig Maximilian University of Munich, and
University of Mannheim
Jens Stegmaier
Institute for Employment Research (IAB)
Mark Trappmann
Institute for Employment Research (IAB), and
University of Bamberg
Frauke Kreuter
Institute for Employment Research (IAB),
University of Mannheim, and
University of Maryland

Contact information: Joseph W. Sakshaug, Institute for Employment Research, 104 Regensburger Straße, Nuremberg, Germany 90478 (joe.sakshaug@iab.de)
Survey researchers are increasingly seeking opportunities to link interview data with administrative records. However, obtaining consent from all survey respondents (or certain subgroups) remains a barrier to performing record linkage in many studies. We experimentally investigated whether emphasizing different benefits of record linkage to respondents in a telephone survey of employee working conditions improves respondents’ willingness to consent to linkage of employment administrative records relative to a neutral consent request. We found that emphasizing linkage benefits related to “time savings” yielded a small, albeit statistically significant, improvement in the overall linkage consent rate (86.0 percent) relative to the neutral consent request (83.8 percent). The time savings argument was particularly effective among “busy” respondents. A second benefit argument related to “improved study value” did not yield a statistically significant improvement in the linkage consent rate (84.4 percent) relative to the neutral request. This benefit argument was also ineffective among the subgroup of respondents considered to be most likely to have a self-interest in the study outcomes. The article concludes with a brief discussion of the practical implications of these findings and offers suggestions for possible research extensions.
Keywords: administrative data; framing; interviewer-respondent interaction; questionnaire
design; telephone survey
1 Introduction
Linking survey data with administrative records has be-
come an increasingly attractive option in the field of sur-
vey research. Several high-profile surveys, such as the UK
Household Longitudinal Study (Buck & McFall, 2012), the
US National Health Interview Survey (Parsons et al., 2014),
and the German Study “Labour Market and Social Secu-
rity” (PASS; Trappmann et al., 2019; Trappmann, Beste,
Bethmann, and Müller, 2013), engage in linking interview
data to one or more of a variety of administrative data types
(e.g. employment, education, health). Recently, several
high-profile committees and organizations have endorsed
record linkage as a way to improve official statistics and
evidence-based policymaking. For example, the US National
Academies of Sciences, Engineering, and Medicine (2017)
recommended that “federal statistical agencies redesign current
data collection efforts and estimation using multiple data
sources, adapt current statistical methods to combine data
sources, and . . . develop the new methods needed for design
and analysis using multiple data sources” (p. 2). The
US Commission on Evidence-Based Policymaking (2017)
advocated for greater use of record linkage, stating that it
“will be an increasingly vital aspect of the evidence-building
community’s capacity to meet future demand from policy-
makers” (p. 2). Further, the Longitudinal Studies Strategic
Review (2018) panel advised the UK’s Economic and Social
Research Council (ESRC) to “promote, facilitate, and nego-
tiate administrative data linkage for researchers . . . to meet
increasing demands for longitudinal information including at
local authority level and for particular subgroups” (p. 8).
Linking survey data with administrative data offers several
benefits to researchers, sponsors, and potentially to the sur-
vey respondents themselves. For researchers, record linkage
increases scientific opportunities and strengthens the value of
social science studies by making more information available
on the surveyed units. Administrative records often contain
important substantive variables recorded longitudinally over
the life course (e.g. un/employment durations, lifetime earn-
ings, medical expenditures). Many of these variables would
be difficult—or near-impossible—to collect with high accu-
racy from respondent self-reports alone. For sponsors of sur-
vey research, the ability to merge interview data with existing
administrative data is attractive from a cost perspective as it
enhances the data profile of survey respondents at a fraction
of the cost of primary data collection. Finally, for respon-
dents, if the questionnaire is designed with linkage in mind
then a shorter and more parsimonious interview can be con-
ducted, leading to time savings and reduced burden for re-
spondents. In some cases, interview duration and respondent
burden can be immediately reduced if linkage is offered as an
alternative to answering a subset of questionnaire items dur-
ing the interview (e.g. Michaud, Dolson, Adams, & Renaud,
1995).
Although the benefits of linking administrative data to sur-
vey data are appealing to many stakeholders, what is unclear
is the extent to which they resonate with the gatekeepers of
these data—the survey respondents—whose data are linked
and for whom authorization (or consent) for the linkage is
typically sought. The new European General Data Protec-
tion Regulation has given the informed consent process re-
newed visibility, for example, by forcing the private sector
to ask for consent to link data sources (General Data Pro-
tection Regulation, 2016). Researchers might be able to use
the “legitimate interest” clause as a legal basis for linking
records without asking for consent, though explicit consent
might legitimize the use of data, in particular for those with
more sensitive content.¹
Convincing respondents that record linkage is a worthy
endeavor is not an easy task as evidenced by large vari-
ation in linkage consent rates across studies, disciplines,
and target populations (da Silva et al., 2012; Sakshaug &
Kreuter, 2012). Furthermore, Fulton (2012) reports a declin-
ing trend in linkage consent rates in long-running repeated
cross-sectional studies in the US, suggesting that respon-
dents’ perception of the benefits of linkage are potentially be-
ing outweighed by other factors (e.g. concerns about sharing
confidential data; see Sala, Knies, & Burton, 2014). How-
ever, trends in linkage consent rates for the National Health
Interview Survey have improved based on changes to the
linkage consent questions (Miller, Gindi, & Parker, 2011).
The purpose of the present study is to investigate the
impact of highlighting particular benefits of record linkage
on respondents’ willingness to consent to a linkage request.
Apart from a main effect, we also look at effects for sub-
groups that can be expected to be responsive to the particular
benefits mentioned, thus investigating the potential for tailor-
ing the linkage request.
2 Background
Administering survey requests in a way that makes salient
factors which are believed to be attractive to sample mem-
bers is a well-known strategy for improving survey partici-
pation rates (Groves, Singer, & Corning, 2000). Efforts to
improve linkage consent rates in surveys have adopted sim-
ilar strategies, namely, by emphasizing specific benefits of
linkage during the linkage request. The idea of framing a
data sharing request in terms of its expected benefits was ex-
perimentally tested in a nationally-representative telephone
and face-to-face study by Bates, Wroblewski, and Pascale
(2012). Respondents were asked to consider a hypothet-
ical proposal in which the US Census Bureau would col-
lect demographic information from government administra-
tive records for people who did not return their census forms.
Respondents were randomly allocated into separate framing
groups which emphasized different benefits of the proposal.
In one group, the expected cost savings incurred from uti-
lizing government records instead of collecting census forms
was emphasized. In the second group, the expected reduction
in respondent burden that would result from substituting in-
formation from government records in lieu of filling out and
mailing back a census form was emphasized. In the third
group (control), no benefits of data sharing were emphasized
in the proposal. The authors found that both benefit argu-
ments (cost savings and reduced burden) elicited more pos-
itive feelings towards the proposal compared to the control
group, with the cost savings argument yielding slightly more
positive feelings than the reduced burden argument.
In an actual application of obtaining linkage consent, Pas-
cale (2011) experimented with three benefit framing argu-
ments for the linkage consent question in a U.S. telephone
study conducted by the U.S. Census Bureau. The first ar-
gument stated that linking surveys to government adminis-
trative records would improve the accuracy of the research
results, the second argument emphasized the cost savings
that would result if the survey data were linked to govern-
ment records, and the last argument was that record linkage
would result in time savings for the respondent by produc-
ing additional statistical data “without taking up your time
with more questions.” After being read the benefit framing
arguments respondents were asked if they had any objections
to the linkage. Contrary to expectations, there were no sta-
tistically significant differences in “no objection” rates be-
tween the three benefit arguments (cost savings: 85.3 per-
cent; time savings: 83.6 percent; improved accuracy: 83.0
percent). Sakshaug, Tutz, and Kreuter (2013) also report the
¹ Race, ethnic origin, politics, religion, and trade union membership are examples of special category data where explicit consent is advised; see also https://ico.org.uk/for-organisations/guide-to-the-general-data-protection-regulation-gdpr/lawful-basis-for-processing/special-category-data/.
lack of a benefit framing effect in a linkage consent exper-
iment embedded within a telephone study in Germany. Re-
spondents in the benefit condition were presented with a time
savings argument (“To keep the interview as brief as pos-
sible. . . ”) whereas respondents in the complementary con-
dition received no benefit argument. Both versions yielded
virtually identical consent rates (time savings: 95.5 percent;
neutral: 95.6 percent). However, a subsequent replication of
this experiment in a web survey yielded a positive, though
modest, effect of the time savings argument (time savings:
61.4 percent; neutral: 55.4 percent) (Sakshaug & Kreuter,
2014).
Rather than emphasizing the benefits that would be gained
if linkage consent were obtained, an alternative benefit fram-
ing strategy is to emphasize the benefits of linkage that would
be unrealized if consent is not given. Kreuter, Sakshaug,
and Tourangeau (2016) experimented with this strategy in
a U.S. telephone study by randomly assigning respondents
to a gain-framing condition and a loss-framing condition. In
the gain-framing condition, respondents were told that the
information that they provided in the survey would be “a lot
more valuable” if it could be linked to administrative records.
In the loss-framing condition, respondents were instead told
that their survey information would be “much less valuable”
if it could not be linked to records. In line with the social
psychological literature on gain and loss framing (Kahneman
& Tversky, 1979, 1984), the loss-framing condition yielded
a higher consent rate (66.8 percent) than the gain-framing
condition (56.1 percent). However, attempts to replicate
these results in telephone studies in Germany have yielded
no statistically significant differences between gain- and loss-
framing conditions (Kreuter, Sakshaug, Schmucker, Singer,
& Couper, 2015; Sakshaug, Wolter, & Kreuter, 2015).
3 Research Gaps and Hypotheses
The above literature review suggests that the strategy of
emphasizing the benefits of record linkage to survey respon-
dents does not consistently improve linkage consent rates
over a neutral framing strategy. However, the evidence base
from which this conclusion is drawn consists of a very small
number of studies. Even fewer studies of actual linkage con-
sent have experimentally tested the effectiveness of a benefit
framing argument against a neutral comparison group (Sak-
shaug & Kreuter, 2014; Sakshaug et al., 2013). A limitation
of omitting a neutral framing condition is that even if no dif-
ferences are observed between two (or more) benefit framing
arguments, both arguments could still be more effective than
no benefit argument. Although the two studies cited above
included a control group in their experiments, they consid-
ered only one benefit argument (time savings). To date, no
linkage consent experiment has simultaneously tested multi-
ple benefit framing arguments with comparison to a neutral
control group.
Furthermore, no study to date has investigated whether
linkage consent rates might be improved by tailoring the
wording of the linkage consent question to attributes of indi-
vidual respondents. In line with the general survey participa-
tion literature (Groves et al., 2000), it is highly plausible that
different respondents place different importance on features
of the linkage consent request. The survey introduction or
questions asked prior to the linkage consent question might
then take the role that the doorstep interaction takes in the
process of gaining cooperation to help identify who can be
persuaded by which argument (Groves & McGonagle, 2001).
Thus, concerning the two experimental conditions tested in
the present study (time savings and improved value of the
study) some past findings about tailoring the survey request
or refusal conversion might carry over to the linkage consent
request (Groves & Couper, 1998; Morton-Williams, 1993).
For example, the argument of time savings should be at-
tractive for those who are busy. There is evidence that busy-
ness (or claiming to be busy) is related to survey nonresponse
(Bates, Dahlhamer, & Singer, 2008; Vercruyssen, Roose, &
van de Putte, 2011; Vercruyssen, van de Putte, & Stoop,
2011) and rushing through the questionnaire (Dahlhamer,
Simile, & Taylor, 2008), and one of the standard arguments
of interviewers is to emphasize that the interview will not
take long (Groves & McGonagle, 2001). Thus, we assume
that the benefit argument of time savings should work best
for people who are busy. The argument of an improved value
of the study should be more likely to attract those that have
a high interest in the study producing high quality data. In-
dependent of the study topic, this is generally assumed for
people with high education. They are more likely to partic-
ipate in scientific surveys and this is attributed to their in-
creased commitment to the value of scientific investigation
(Goyder, 1985). Thus, highly educated respondents should
be more likely to respond to the increased study value ar-
gument. Additionally, persons who might personally ben-
efit from the study results might be more likely to see the
value of a higher quality study. The study at hand is about
workers’ rights. To respondents, it is communicated as being
about working conditions, working hours, and working time
requirements. Thus, respondents who experience bad work-
ing conditions are likely to have a self-interest in the study
being of improved value.
In this article, we address these research gaps and con-
tribute to the relatively scarce literature on inducing linkage
consent through benefit framing. The present study reports
the results of a linkage consent experiment embedded within
an employee survey on workers’ rights in which respondents
were randomly allocated to one of three conditions, including
two conditions which highlight a particular benefit of record
linkage (time savings and improved value of the study) and a
neutral control condition. Specifically, we examine whether
there is a potential advantage of emphasizing different bene-
fits of linkage on the linkage consent rate and whether sub-
groups of respondents that can be identified from their pre-
vious responses react dierently to the benefits that are em-
phasized.
Based on the above arguments, the following hypotheses
are tested:
1. The time savings benefit argument should lead to an
overall higher linkage consent rate than the neutral
condition that does not specify a particular benefit of
linkage;
2. The improved study value benefit argument should
lead to an overall higher linkage consent rate than the
neutral condition that does not specify a particular ben-
efit of linkage;
3. The time savings argument should have a more posi-
tive effect on busy respondents;
4. The improved study value argument should have a
more positive effect for respondents with a high level
of education and for those with a self-interest in the
study being of improved value.
4 Data and Methods
4.1 Survey Data Collection
The linkage consent experiment was implemented on
a sample of employees who participated in the “Diver-
sity of Employment Relationships” (abbreviated as VA for
“Vielfalt der Arbeitsverhältnisse”) survey. The VA survey
was an employer-employee survey conducted in Germany
and jointly sponsored by the German Federal Ministry of La-
bor and Social Affairs (BMAS) and the Institute for Employ-
ment Research (IAB) in Nuremberg. Like many other coun-
tries, Germany has adopted various measures of labor market
deregulation. As a consequence, since the 1980s, the share
of flexible, non-standard employment has increased strongly.
There is a controversial debate on the benefits and risks of
non-standard employment for individuals. On the one hand,
non-standard jobs can be beneficial for unemployed indi-
viduals. On the other hand, there is ample evidence that
non-standard jobs provide comparatively unfavorable em-
ployment conditions. However, more research is needed into
equal treatment regarding basic employment rights. The VA
survey therefore aimed at providing new data to address the
question of whether standard and non-standard workers are
treated equally with regard to paid sick leave, paid vacation,
and paid public holidays, as prescribed by German labor law.
The sample for the VA survey was drawn from an IAB
employment database containing the universe of all German
workers who are liable to social security contributions. Non-
target groups, such as apprentices, workers in private house-
holds, and a few other worker groups (e.g. family workers)
were removed from the sampling frame. Workers in the temporary
work agency industry were also removed, as the data did not allow
identifying the establishment where a temporary agency worker
actually works (which is crucial for the employer-employee design).
Workers of extraterritorial organizations (e.g. embassies) were
removed as well, as the conditions for implementing the applicable
labor law can differ considerably due to diplomatic privilege.
Finally, the database was restricted to workers from establishments
with a minimum of 11 employees who are social security contributors.²
The IAB database contains an establishment identifier.
Using this identifier, a random sample of 3,003 establishments was
drawn from the IAB employment database on the refer-
ence date 31st December 2012. Establishments were drawn
with probability-proportional-to-size (PPS) sampling with
the number of employees used as the measure of size. Thus,
establishments employing a larger number of employees
were sampled with higher probabilities of selection com-
pared to establishments with fewer employees. Within se-
lected establishments, the workforce was stratified into four
groups (marginal part-time worker, part-time worker, worker
with fixed-term contracts, and other workers) and within
each stratum a random sample of employees was then drawn
from the IAB employment database and invited to take part
in the employee survey. A total of 48,006 individuals were
selected into the employee sample.
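A compact sketch of the establishment stage may help make the selection mechanism concrete. This is not the authors' code: the frame variables (estab_id, emp_size), the seed, and the systematic-selection shortcut are assumptions, and the sketch ignores the possibility that very large establishments are hit by more than one selection point.

```stata
* Illustrative sketch only: systematic PPS selection of establishments,
* with the number of employees as the measure of size.
set seed 12345
sort estab_id
gen double cum_size = sum(emp_size)            // running total of employment
scalar n_target  = 3003                        // establishments to draw
scalar interval  = cum_size[_N] / n_target     // sampling interval
scalar startpt   = runiform() * interval       // random start in [0, interval)
gen byte sampled = 0
local last = n_target - 1
forvalues k = 0/`last' {
    * establishment i is hit if a selection point falls in (cum_size - emp_size, cum_size]
    replace sampled = 1 if (startpt + `k'*interval) >  cum_size - emp_size ///
                         & (startpt + `k'*interval) <= cum_size
}
```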
Data collection was conducted between November 2013
and April 2014 by the survey institute infas. All interviews
were conducted by computer-assisted telephone interviewing
(CATI). The IAB employment database contains names, ad-
dresses and, to some extent, telephone numbers for employ-
ees. The contact information was enriched by infas on the
basis of their own commercial directory research. Telephone
numbers could be found for about 69 percent of the sam-
ple. Valid interviews were conducted with a total of 7,561
employees from 1,110 participating establishments. This
yielded a raw employee response rate of 15.8 percent (Re-
sponse Rate 1; American Association for Public Opinion Re-
search, 2016). However, the employee survey used a screen-
ing procedure to exclude persons who were not employed at
the time of the survey.³ Therefore, not all of the workers in
the sampling frame were actually eligible to take part in the
survey. Hence, the study also reported an American Asso-
ciation for Public Opinion Research (2016) Response Rate
3 of 24.7 percent, which accounts for the screening process.
The response rate is similar to those of other telephone sur-
² This threshold was chosen for design reasons and content-related considerations, given the needs of the German Federal Ministry of Labor and Social Affairs (BMAS).
³ The information in the IAB database refers to the situation of the workers approximately 11 months prior to the start of data collection.
veys of employed populations in Germany (e.g. Apel et al.,
2012; Eckman et al., 2014). Full details of the data collec-
tion procedures and outcomes can be found in the methods
report (Schütz, Harand, Kleudgen, Aust, & Weißpflug, 2014,
available upon request).⁴
The interview took, on average, thirty minutes to complete
and included questions on several topics related to, among
others, employment rights, (desired) working hours, and gen-
eral working conditions.
4.2 Experimental Design
All respondents were asked for explicit consent to link
their survey answers to federal employment records of the
IAB.⁵ The consent request was administered after having
asked several, rather general questions about the worker’s
employment relationship (after the first third of the question-
naire). Each respondent was read the following statement as
part of the linkage consent request [English translation; see
appendix for original German version]:
“We would like to include in the evaluation
of the survey, extracts from data that are avail-
able at the Institute for Employment Research
of the Federal Employment Agency in Nurem-
berg. For example, information about previous
periods of employment and unemployment. For
the purpose of merging this data to the interview
data I would like to ask you for your consent. It
is absolutely ensured that all data protection reg-
ulations are strictly adhered to. Your consent is
of course voluntary. You can also withdraw it at
any time. Do you agree?”
A linkage consent experiment was carried out to deter-
mine whether prefacing the above statement with a statement
highlighting a specific benefit of record linkage would im-
prove the linkage consent rate. Respondents were randomly
allocated with equal probability to one of three experimental
conditions: 1) time savings; 2) improved study value; and 3)
control. Respondents assigned to the time savings condition
(n=2,580) were read the following statement immediately
prior to being read the standard consent statement (above):
“To keep the interview as short as possible . . . ” Respondents
assigned to the improved study value condition (n=2,417)
were read a different prefacing statement emphasizing the
improved value of the study if linkage were to occur: “The
informative value of this study can be significantly improved
if we can supplement your information with additional data.”
Respondents allocated to the control condition (n=2,564)
were simply read the above standard consent statement with-
out any prefacing words.
4.3 Operationalization and Statistical Analysis
We hypothesize that both benefit framing conditions
should motivate respondents to consent to linkage at a higher
rate than respondents in the control condition (hypothesis 1
for the time savings argument and hypothesis 2 for the im-
proved study value argument). In order to test this hypothesis
(positive main eect of the time savings and improved study
value arguments) it is sufficient to compare linkage consent
rates between treatment groups.
For hypotheses 3 and 4, the hypothesized constructs are
operationalized as follows. For hypothesis 3, busyness is
measured using two indicators collected in the survey: Ac-
tual weekly working hours and number of children under the
age of 14 (Vercruyssen, Roose, Carton, & van de Putte, 2014;
Vercruyssen, Roose, & van de Putte, 2011; Vercruyssen, van
de Putte, & Stoop, 2011). The original survey questions
can be found in the appendix. For hypothesis 4, education
is measured using three levels of general school degrees in
Germany (low, intermediate, high).⁶ Having a self-interest
in the improved value of the study is measured by the num-
ber of worker’s rights that are withheld by the employer.⁷
Although the linkage consent question was asked before the
respondents knew about the workers’ rights questions, the in-
troduction presented at the beginning of the interview made
it clear that this topic would be covered in the study:
“Our study ‘Diversity of Working Conditions’—a survey of
employees in Germany—examines how different the working
conditions, working hours and working time requirements of
employees in German companies are. Above all, we are
interested in the personal assessments and experiences of
the employees. I would like to interview you now.”

⁴ The dataset used for this project is a cross-sectional worker-level survey data set made available by the German Federal Employment Agency (Bundesagentur für Arbeit) and the Institute for Employment Research (Institut für Arbeitsmarkt- und Berufsforschung, IAB). The data is confidential and subject to restricted access due to data protection legislation. Researchers can access the data of this project at the IAB. We make all programs and a read-me file with instructions to replicate our results available upon request. Please refer to IAB Project 1495 (“Situation atypisch Beschäftigter und Arbeitszeitwünsche von Teilzeitbeschäftigten: Quantitative und qualitative Erhebung sowie begleitende Forschung”).
⁵ The administrative database contains information on workers’ age, sex, education, occupation, wages, and industry affiliation, among others. It does not contain information on employment rights and working conditions. Our analysis relies solely on the survey data.
⁶ Education “Low” refers to people who finished their highest level of schooling with or without a degree. The “Intermediate” group refers to people who received a Mittlere Reife or Realschulabschluss, or a Polytechnische Oberschule. The “High” group includes those who received a Fachhochschulreife or Abitur.
⁷ The number of withheld worker’s rights variable is constructed as the sum of three binary indicator variables denoting whether employees were illegally denied paid vacation, paid sick leave, or paid public holidays. More information about this variable’s construction is provided in the appendix.
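For reference, the additive index described in footnote 7 reduces to a simple sum of the three denial indicators. A minimal sketch, assuming hypothetical 0/1 variables derived from the survey items (not the authors' code):

```stata
* Illustrative sketch only: number of workers' rights illegally withheld,
* built from three hypothetical 0/1 indicators of illegal denial.
gen byte rights_withheld = denied_vacation + denied_sickleave + denied_holiday
* Binary grouping used in the analysis tables: none vs. one or more.
gen byte any_withheld = (rights_withheld >= 1) if !missing(rights_withheld)
```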
In order to address hypotheses 3 and 4, the treatment effect
for each of the two benefit framing conditions is examined
relative to the control condition for the subgroup variables of
interest. However, as these additional variables are uncon-
founded with the treatment, but potentially confounded with
each other, a logistic regression model of linkage consent on
the treatment variable, the hypothesized subgroup variables,
and interaction terms between the treatment and these sub-
group variables is fitted. The regression model also includes
some additional control variables. The interaction terms will
inform us whether there are subgroup differences in the treat-
ment effects.
The logistic regression model can be expressed as follows:
$$\ln\!\left(\frac{\pi_i}{1-\pi_i}\right) = \alpha + \beta X_i + \lambda Y_i + \delta Z_i + \Omega X_i Z_i$$

where $\pi_i$ is the conditional probability of consent to data linkage for respondent $i$, $\alpha$ is the model intercept, $\beta$ is the vector of model parameters for the experimental covariates $X_i$ (time savings, improved study value, and control group), $\lambda$ is the vector of model parameters for the control covariates $Y_i$ (age, sex, and immigrant background), $\delta$ is the vector of model parameters for the hypothesized indicator covariates $Z_i$ (education, number of worker’s rights withheld, actual weekly working hours, and number of children), and $\Omega$ is the vector of model parameters for the interactions between the experimental covariates $X_i$ and the hypothesized indicator covariates $Z_i$.
All analyses were performed using the survey (svy) com-
mands in Stata/MP 14.2 and account for weighting,⁸ stratifi-
cation, and clustering of employees within interviewers and
establishments.
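A stylized version of this estimation step is sketched below. It is not the authors' code: the variable names are hypothetical, and the design declaration is simplified to establishment clusters, design strata, and a poststratification weight, whereas the published analysis additionally accounts for interviewer clustering.

```stata
* Illustrative sketch only: declare the complex design, then fit the
* design-adjusted logistic model with treatment-by-indicator interactions.
svyset estab_id [pweight = postweight], strata(design_stratum)

svy: logit consent i.cond                         /// experimental conditions (0 = control)
           i.agegrp i.female i.migback            /// control variables
           i.hours3 i.kids3 i.edu3 i.rights_any   /// hypothesized indicators
           i.cond#i.hours3 i.cond#i.kids3         /// interactions for hypothesis 3
           i.cond#i.edu3 i.cond#i.rights_any      //  interactions for hypothesis 4
```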
5 Results
Appendix Table 1 shows the compositional distribution of
respondents allocated to each experimental condition. There
are no statistically significant differences between the ex-
perimental conditions with respect to the distribution of the
following respondent characteristics: age (in years; 15–32,
33–45, 46–53, 54–84), sex, immigrant background (1st/2nd
generation), and the hypothesized indicator variables: actual
weekly working hours (25 or less, 25–39, 40 or more), num-
ber of children (0, 1, 2 or more), education (low, intermedi-
ate, high), and number of worker’s rights withheld (0, 1 or
more). This suggests that the random assignment procedure
yielded balanced samples.
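Such a randomization check can be sketched as a set of design-adjusted cross-tabulations, each reporting a Rao-Scott corrected chi-square test (not the authors' code; variable names are hypothetical):

```stata
* Illustrative sketch only: test whether the composition of the experimental
* groups differs with respect to each covariate, accounting for the design.
foreach v in agegrp female migback hours3 kids3 edu3 rights_any {
    svy: tabulate cond `v'
}
```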
5.1 Linkage Consent Rates by Subgroup
A total of 6,370 out of 7,561 respondents consented to
record linkage for an overall linkage consent rate of 84.7 per-
cent (weighted). Similar consent rates have been reported in
other IAB-sponsored surveys (Bender, Fertig, Gorlitz, Hu-
ber, & Schmucker, 2008; Christoph et al., 2008; Sakshaug
et al., 2013). Table 1 shows overall linkage consent rates for
each respondent subgroup. Linkage consent rates are similar
within most subgroups with the exceptions of sex, education,
and actual working hours: linkage consent rates are statisti-
cally significantly higher among females, respondents with
“Intermediate” or “High” levels of education, and those who
work 40 or more hours per week.
5.2 Linkage Consent Rates by Experimental Conditions
We hypothesized that respondents who received either of
the two benefit wording statements would consent to link-
age at a higher rate than respondents who did not receive a
benefit wording statement (control). Linkage consent rates
for each experimental condition are shown in Figure 1. The
“time savings” condition (86.0 percent) yields the highest
consent rate overall, followed by the “improved study value”
condition (84.4 percent), and the control condition (83.8 per-
cent). The consent rate difference between the time savings
condition and the control condition is statistically significant
(t(348) = 1.94; p-value = 0.027, one-sided). The consent rate
of the improved study value condition, on the other hand,
does not differ much from the consent rate of the control
condition (t(344) = 0.53; p-value = 0.298, one-sided). Thus,
these results provide modest support for hypothesis 1, but not
hypothesis 2.
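The design-adjusted rate comparison in this subsection can be sketched as follows (not the authors' code; variable names are hypothetical, with cond coded 0 for control, 1 for time savings, and 2 for improved study value):

```stata
* Illustrative sketch only: weighted consent rates per condition and their
* differences from the control group.
svy: proportion consent, over(cond)   // consent rate within each condition
svy: regress consent i.cond           // rate differences vs. control (linear probability)
* One-sided p-values for hypotheses 1 and 2 can be obtained by halving the
* reported two-sided p-values when the estimated difference is positive.
```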
5.3 Treatment Eects Across Respondent Subgroups
This section now examines the possibility that benefit
framing may differentially affect willingness to consent for
specific respondent subgroups relevant to hypotheses 3 and
4. Figure 2 shows treatment effects (deviation in consent
rates between the two experimental conditions and the con-
trol condition) across the respondent subgroups. Similar to
the overall consent rate results presented in Section 5.2, the
figure shows that the magnitude of the effect of the two ex-
perimental groups is generally small and close to zero for
most of the relevant subgroups. The largest effects can be
seen for the time savings argument: respondents with two or
more children and respondents who work at least 40 hours
per week—both hypothesized indicators of busyness—are
more likely to be persuaded by the time savings argument.
Thus, there is some evidence that tailoring the linkage con-
sent request for these subgroups may increase respondents’
willingness to give consent. No such effect is seen for the
improved study value argument.

⁸ The weighting was carried out by poststratifying on relevant characteristics of the sampling scheme to the target population using IAB administrative data available for every population unit. The basis for the calculation of the weighting factors was the distribution of the workforce across 5 establishment size classes and 4 employee groups (marginal part-time worker, part-time worker, worker with fixed-term contract, and other workers).

Table 1
Overall Linkage Consent Rates by Respondent Characteristics and Experimental Conditions.

                                  Overall         Control         Time savings    Improved study value
Respondent characteristics       %       SE      %       SE      %       SE      %       SE
Age (in years)
  15–32                          84.46   1.15    84.80   1.88    85.15   1.83    83.42   1.87
  33–45                          83.38   1.03    81.40   1.93    85.39   1.53    83.30   1.72
  46–53                          84.66   0.93    83.48   1.75    85.81   1.50    84.81   1.68
  54–84                          86.38   0.95    85.78   1.63    87.59   1.66    85.73   1.65
Sex
  Female                         86.42*  0.73    85.53*  1.15    87.80*  1.16    85.88   1.34
  Male                           83.00*  0.73    82.03*  1.20    84.17*  1.10    82.84   1.22
Immigrant background
  Yes                            83.06   1.11    81.69   2.22    84.79   1.89    82.63   1.88
  No                             85.30   0.56    84.63   0.93    86.45   0.91    84.82   1.02
Hypothesized indicators
Actual weekly working hours
  25 or less                     82.86   1.00    83.17   1.91    83.32   1.57    82.01*  1.69
  25–39                          84.14   1.05    85.08   1.65    85.06   1.59    82.23*  1.73
  40 or more                     85.78   0.67    83.34   1.25    87.77   1.14    86.28*  1.19
# of children
  0                              85.57   0.62    85.72*  1.04    86.03   1.01    84.94   1.13
  1                              83.14   1.35    82.68*  2.25    85.98   2.13    80.36   2.49
  2 or more                      84.93   1.41    79.79*  2.53    90.39   1.84    83.99   2.46
Education
  Low                            81.76*  1.22    81.22   2.08    81.27*  2.11    82.91   1.95
  Intermediate                   85.11*  0.80    83.43   1.44    87.55*  1.20    84.25   1.34
  High                           85.84*  0.79    85.37   1.34    86.99*  1.29    85.19   1.56
# of worker’s rights withheld
  0                              85.04   0.54    83.58   0.97    86.50   0.88    85.08   0.96
  1 or more                      83.30   1.50    83.14   2.30    83.99   2.36    82.74   2.24
N                                7,561           2,564           2,580           2,417
* p < 0.05, two-sided, chi-square test

[Figure 1. Linkage Consent Rates by Experimental Condition. Error bars are 95% confidence intervals.]

[Figure 2. Consent Rate Deviations Between the Two Benefit Framing Conditions and the Control Condition for Respondent Subgroups (actual weekly working hours, number of children, education, number of worker’s rights withheld).]
5.4 Linkage Consent Model with Experimental Condi-
tion and Subgroup Interactions
As previously noted, one limitation of looking at the main
effects of the experimental conditions on the indicators of the
hypothesized constructs is that it ignores the potential confounding
of the indicator variables with each other. To mitigate the effects
of confounding and formally test hypotheses 3 and 4, a logistic
regression model of linkage consent on control variables
and the interactions between the experimental conditions and
the indicators of the two hypothesized constructs (busyness and
self-interest in the improved value of the study) is presented.
The focus will thus be on the four interaction effects (two
indicators for each argument).
The results, presented in Table 2, show a positive interac-
tion between the time savings argument and both busyness
indicators. That is, respondents who work forty or more
hours per week or have two or more children are more likely
to consent to record linkage when presented with the time
savings argument. These results are consistent with the no-
tion that the time savings argument has a more positive effect
on busy respondents, i.e. hypothesis 3 is supported.
Turning now to hypothesis 4, the model results show
no statistically significant interaction between the improved
study value condition and education; thus, there is no evi-
dence to support the notion that the improved study value
argument has a positive effect on the most highly educated
respondents. Similarly, there is no statistically significant in-
teraction between the improved study value argument and re-
spondents whose employee rights are being withheld or who
do not work their preferred hours.
6 Discussion
For researchers interested in collecting survey data and
linking them to administrative data it is useful to consider
design strategies to optimize the linkage consent rate and
minimize the risk of non-consent bias. Designing the link-
age consent request in a way that highlights particular bene-
fits of record linkage to survey respondents is one relatively
straightforward strategy that may be considered for this pur-
pose. The present study found mixed success for this strategy
in terms of improving respondent willingness to give linkage
consent. Specifically, the study showed that respondents to a
workers’ rights survey who were presented with a time sav-
ings argument were more likely to consent to record linkage
relative to a neutral request. However, emphasizing a sep-
arate benefit related to improved value of the study did not
improve respondents’ likelihood of consent. A small inter-
view monitoring exercise suggested that these findings were
unlikely to have been caused by deviant interviewer behavior
(see appendix for details).
In addition, the study revealed statistically significant in-
teractions of benefit framing for particular subgroups that
could be utilized for tailoring the consent request. Specif-
ically, the time savings argument had a positive effect on
“busy” respondents: those who work at least 40 hours and/or
have multiple children were more likely to give consent when
presented with the time savings argument. Contrary to our
expectations, the improved study value argument had no ef-
fect on respondents whose worker’s rights are withheld and
who would be most likely to personally ben-
efit from the improved value of the study. The study also
found no effect of the improved study value argument on
respondents with the highest level of education, whom we
hypothesized would be more receptive to the idea of con-
tributing to improving the value of the scientific study.
While these findings bring useful insights to survey practi-
tioners, there are a few study issues worth discussing. For in-
stance, the experiment was embedded within a low response
rate survey which may have contributed to the high overall
consent rate. Although the response rate is similar to that
of other telephone surveys of employed populations in Ger-
many (e.g. Apel et al., 2012; Sakshaug et al., 2013), the re-
spondents are likely to be more cooperative with the con-
sent request compared to the non-respondents had they been
posed the consent question. Busyness is one of the reasons
why people choose not to participate in surveys (Bates et
al., 2008; Vercruyssen, Roose, & van de Putte, 2011; Ver-
cruyssen, van de Putte, & Stoop, 2011); thus, while a higher
response rate would be expected to coincide with a lower
consent rate, we would also expect the effect of the time
savings argument to be stronger if more busy and generally
less cooperative people were recruited into the survey. Thus,
the estimated effect of the time savings argument reported in
this study may be considered as a lower bound of the actual
phenomenon. People concerned with improving the value of
scientific studies are likely to cooperate with scientific sur-
vey requests. Thus, we would not expect the null effect of
the improved study value argument to change given a higher
response rate.
A second issue is that multiple hypotheses have been
tested on the same dataset. Although only three hypotheses
have been put forward, six indicators have been tested for
significance to reject or support these hypotheses. This mul-
Table 2
Logistic Regression of Linkage Consent on Respondent Subgroups and Interactions Between Experimental Conditions and Respondent Subgroups.

                                  Main effect terms   Interaction:         Interaction:
                                                      Time savings         Improved study value
Covariates                        Coef.    SE         Coef.    SE          Coef.    SE
Experimental conditions
  Control                         ref.     -          -        -           -        -
  Time savings                    0.50     0.30       -        -           -        -
  Improved study value            0.26     0.31       -        -           -        -
Control variables
Age (in years)
  15–32                           ref.     -          -        -           -        -
  33–45                           0.002    0.14       -        -           -        -
  46–53                           0.08     0.15       -        -           -        -
  54–84                           0.21     0.15       -        -           -        -
Sex
  Male                            ref.     -          -        -           -        -
  Female                          0.29*    0.11       -        -           -        -
Immigrant background
  Yes                             0.15     0.11       -        -           -        -
  No                              ref.     -          -        -           -        -
Hypothesized indicators
Actual weekly working hours
  25 or less                      ref.     -          ref.     -           ref.     -
  25–39                           0.14     0.22       0.04     0.30        0.20     0.30
  40 or more                      0.16     0.20       0.56*    0.28        0.44     0.27
# of children
  0                               ref.     -          ref.     -           ref.     -
  1                               0.21     0.20       0.20     0.29        0.12     0.27
  2 or more                       0.38     0.20       0.91*    0.29        0.42     0.29
Education
  Low                             ref.     -          ref.     -           ref.     -
  Intermediate                    0.10     0.21       0.51*    0.26        0.25     0.30
  High                            0.31     0.21       0.11     0.27        0.05     0.30
# of worker’s rights withheld
  0                               ref.     -          ref.     -           ref.     -
  1 or more                       0.05     0.23       0.07     0.29        0.19     0.31
Intercept term                    1.49*    0.26       -        -           -        -
* p < 0.05, two-sided
tiple testing increases the probability of rejecting at least one
null hypothesis by chance (alpha error). Although formal
methods have been developed to correct for multiple testing
(e.g. the Bonferroni correction; Dunn, 1961), these decrease
the statistical power of each single test substantially, which
is an issue given the number of cases available. Rather than
applying the Bonferroni correction, we argue that while both
indicators (education, withheld worker’s rights) lack signifi-
cance for the improved study value argument, both indicators
for busyness (2+ children, 40+ working hours) show a sig-
nificant effect for the time savings argument. This is a con-
sistent pattern that is very unlikely to be the result of an alpha
error. If the strict Bonferroni correction is applied (dividing
the p-level by the number of hypotheses), only one of the two
effects (2+ children) retains significance.
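To make the adjustment concrete (this arithmetic is ours, not reported in the article), treating the four treatment-by-indicator interaction tests as the family of tests gives an adjusted significance threshold of

$$\alpha^{*} = \frac{\alpha}{m} = \frac{0.05}{4} = 0.0125,$$

so an interaction term is retained as significant only if its p-value falls below 0.0125 rather than 0.05.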
Although benefit framing has been shown to have a pos-
itive effect on attitudes towards data sharing in hypotheti-
cal scenarios (Bates, Wroblewski, & Pascale, 2012), its
effects have been lackluster in actual linkage applications,
particularly in telephone studies (e.g. Kreuter et al., 2015;
Sakshaug et al., 2013). The present study provides another
data point suggesting that emphasizing the benefits of record
linkage is unlikely to have a dramatic impact on respondents’
decision-making process when considering the linkage re-
quest. However, the benefit framing effects observed for spe-
cific subgroups point to a possible strategy of tailoring the
linkage consent request for specific individuals. Just as sur-
veys tailor their recruitment strategies to address people who
cite time pressures and “busyness” as reasons for not partic-
ipating in survey research (Olson, Lepkowski, & Garabrant,
2011), a tailoring strategy that emphasizes the time savings
argument for respondents who are previously identified as
being “busy” at the time of the survey request, or based on
answers to previous survey questions, could be considered.
A possible extension of this research relates to the opti-
mal delivery of the benefit argument by interviewers during
the linkage consent request. One plausible explanation for
the small (less than 3 percentage point) main effect of benefit
framing could be inadequate saliency of the benefit
statement during its delivery. The benefit statement was read
to respondents at the very beginning of the consent request
and was followed by a longer statement describing the ad-
ministrative data, data protection aspects, and the voluntary
nature of the request. The saliency of the benefit argument
therefore could have diminished over the course of deliver-
ing the full statement and lost its influence by the time re-
spondents were prompted for an answer. Furthermore, inter-
viewers were not instructed to add any vocal emphasis when
reading the benefit statements, which could have further con-
tributed to the lack of salience of the benefit statement. Ex-
perimenting with different levels of vocal emphasis when de-
livering the benefit argument as well as the proximity of the
benefit argument relative to the point at which respondents
are asked for an answer are both topics to be considered
in future research (see e.g. Sakshaug, Schmucker, Kreuter,
Couper, & Singer, 2019).
The effect of consent framing in self- versus interviewer-
administered modes is another relevant topic for future re-
search. On the one hand, self-administered modes, such as
web, ensure that the entire framing argument is presented
to respondents, whereas in interviewer-administered modes
there is no assurance that the argument is read or empha-
sized to respondents. On the other hand, respondents in self-
administered surveys may be less likely to read the complete
consent statement. Thus, benefit framing arguments may be
more (or perhaps less) salient to respondents when presented
under alternative survey modes.
References
American Association for Public Opinion Research. (2016).
Standard definitions: Final dispositions of case codes
and outcome rates for surveys. Retrieved from https://www.aapor.org/AAPOR_Main/media/publications/Standard-Definitions20169theditionfinal.pdf
Apel, H., Bachmann, R., Bender, S., vom Berge, P.,
Fertig, M., Frings, H., . . . Wolter, S. (2012). Ar-
beitsmarktwirkungen der Mindestlohneinführung im
Bauhauptgewerbe. Journal for Labour Market Re-
search,45(3/4), 257–277.
Bates, N., Dahlhamer, J., & Singer, E. (2008). Privacy con-
cerns, too busy, or just not interested: Using doorstep
concerns to predict survey nonresponse. Journal of Of-
ficial Statistics,24(4), 591–612.
Bates, N., Wroblewski, M., & Pascale, J. (2012). Public atti-
tudes toward the use of administrative records in the
U.S. census: Does question frame matter? Technical
Report, Survey Methodology Series #2012-04, United
States Census Bureau. Retrieved from https://www.census.gov/srd/papers/pdf/rsm2012-04.pdf
Bender, S., Fertig, M., Gorlitz, K., Huber, M., & Schmucker,
A. (2008). WeLL—unique linked employer-employee
data on further training in Germany. Ruhr Economic
Papers (No. 67).
Buck, N. & McFall, S. (2012). Understanding Society: De-
sign overview. Longitudinal and Life Course Studies,
3(1), 5–17.
Christoph, B., Müller, G., Gebhardt, D., Wenzig, C., Trapp-
mann, M., Achatz, J., . . . Gayer, C. (2008). Code-
book and documentation of panel study “Labour Mar-
ket and Social Security” (PASS). Vol. 1: Introduction
and overview, wave 1 (2006/2007). number 05. FDZ
Datenreport. Documentation on Labour Market Data
200805_en.
da Silva, M., Coeli, C., Ventura, M., Palacios, M., Mag-
nanini, M., Camargo, T., & Camargo, K. J. (2012).
Informed consent for record linkage: A systematic re-
view. Journal of Medical Ethics,38(10), 639–642.
Dahlhamer, J., Simile, C., & Taylor, B. (2008). Do you really
mean what you say? Doorstep concerns and data qual-
ity in the National Health Interview Survey (NHIS).
Proceedings of the Joint Statistical Meetings of the
American Statistical Association, 1484–1491.
Dunn, O. (1961). Multiple comparisons among means. Jour-
nal of the American Statistical Association,56, 52–64.
Eckman, S., Kreuter, F., Kirchner, A., Jäckle, A.,
Tourangeau, R., & Presser, S. (2014). Assessing
the mechanisms of misreporting to filter questions in
surveys. Public Opinion Quarterly,78(3), 721–733.
Fulton, J. (2012). Respondent consent to use administrative
data. University of Maryland, PhD dissertation: Col-
lege Park, MD.
General Data Protection Regulation. (2016). Regulation
(EU) 2016/679 of the European parliament and of the
council. Retrieved from http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016R0679&from=en
Goyder, J. (1985). Nonresponse on surveys: A Canada–
United States comparison. Canadian Journal of Soci-
ology,10(3), 231–251.
Groves, R. & Couper, M. (1998). Nonresponse in household
interview surveys. New York: John Wiley and Sons.
Groves, R. & McGonagle, K. (2001). A theory-guided in-
terviewer training protocol regarding survey participa-
tion. Journal of Official Statistics, 17(2), 249–265.
Groves, R., Singer, E., & Corning, A. (2000). Leverage-
saliency theory of survey participation: Description
and an illustration. Public Opinion Quarterly,64(3),
299–308.
Kahneman, D. & Tversky, A. (1979). Prospect theory: An
analysis of decisions under risk. Econometrica,47(2),
263–291.
Kahneman, D. & Tversky, A. (1984). Choices, values, and
frames. American Psychologist,39(4), 341–350.
Kreuter, F., Sakshaug, J., Schmucker, A., Singer, E., &
Couper, M. (2015). Privacy, data linkage, and in-
formed consent. Presentation at the 70th Annual Con-
ference of the American Association for Public Opin-
ion Research, Hollywood, FL, May.
Kreuter, F., Sakshaug, J., & Tourangeau, R. (2016). The
framing of the record linkage consent question. Inter-
national Journal of Public Opinion Research,28(1),
142–152.
Longitudinal Studies Strategic Review. (2018). 2017 report
to the Economic and Social Research Council. Re-
trieved from https://esrc.ukri.org/files/news-events-and-publications/publications/longitudinal-studies-strategic-review-2017/
Michaud, S., Dolson, D., Adams, D., & Renaud, M. (1995).
Combining administrative and survey data to reduce
respondent burden in longitudinal surveys. Paper pre-
sented at the Joint Statistical Meetings of the American
Statistical Association, Orlando, FL, USA.
Miller, D., Gindi, R., & Parker, J. (2011). Trends in
record linkage refusal rates: Characteristics of na-
tional health interview survey participants who refuse
record linkage. Presentation at the Joint Statistical
Meetings, Miami, FL, August.
Morton-Williams, J. (1993). Interviewer approaches. Alder-
shot: Dartmouth Publishing Company.
Olson, K., Lepkowski, J., & Garabrant, D. (2011). An ex-
perimental examination of the content of persuasion
letters on nonresponse rates and survey estimates in a
nonresponse follow-up study. Survey Research Meth-
ods,5(1), 21–26.
Parsons, V., Moriarity, C., Jonas, K., Moore, T., Davis, K.,
& Tompkins, L. (2014). Design and estimation for the
national health interview survey. National Center for
Health Statistics. Vital and Health Statistics,2(165),
2006–2015.
Pascale, J. (2011). Requesting consent to link survey data
to administrative records: Results from a split-ballot
experiment in the survey of health insurance and pro-
gram participation. Technical Report, Survey Method-
ology Series #2011-03, United States Census Bureau.
Retrieved from https://www.census.gov/srd/papers/pdf/ssm2011-03.pdf
Sakshaug, J. & Kreuter, F. (2012). Assessing the magnitude
of non-consent biases in linked survey and administra-
tive data. Survey Research Methods,6(2), 113–122.
Sakshaug, J. & Kreuter, F. (2014). The effect of benefit
wording on consent to link survey and administrative
records in a web survey. Public Opinion Quarterly,
78(1), 166–176.
Sakshaug, J., Schmucker, A., Kreuter, F., Couper, M. P., &
Singer, E. (2019). The effect of framing and placement
on linkage consent. Public Opinion Quarterly,83(S1),
289–308. doi:10.1093/poq/nfz018
Sakshaug, J., Tutz, V., & Kreuter, F. (2013). Placement,
wording, and interviewers: Identifying correlates of
consent to link survey and administrative data. Survey
Research Methods,7(2), 133–144.
Sakshaug, J., Wolter, S., & Kreuter, F. (2015). Obtaining
record linkage consent: Results from a wording exper-
iment in Germany. Survey Insights: Methods from the
Field. Retrieved from http://surveyinsights.org/?p=7288
Sala, E., Knies, G., & Burton, J. (2014). Propensity to con-
sent to data linkage: Experimental evidence on the
role of three survey design features in a UK longitu-
dinal panel. International Journal of Social Research
Methodology,17(5), 455–473.
Schütz, H., Harand, J., Kleudgen, M., Aust, N., &
Weißpflug, A. (2014). Methodenbericht: Situation
atypisch Beschäftigter und Arbeitszeitwünsche von
Teilzeitbeschäftigten. Bonn: Institut für angewandte
Sozialwissenschaft (Infas).
Trappmann, M., Bähr, S., Beste, J., Eberl, A., Frodermann,
C., Gundert, S., . . . Wenzig, C. (2019). Data resource
profile: Panel Study Labour Market and Social Secu-
rity (PASS). International Journal of Epidemiology,
48(5), 1411–1411g. doi:10.1093/ije/dyz041
Trappmann, M., Beste, J., Bethmann, A., & Müller, G.
(2013). The PASS panel survey after six waves. Jour-
nal for Labour Market Research,46(4), 275–281.
US Commission on Evidence-Based Policymaking. (2017).
The promise of evidence-based policy-making: Report
of the commission on evidence-based policymaking.
Retrieved from https://www.cep.gov/content/dam/cep/
report/cep-final-report.pdf
US National Academies of Sciences, Engineering, and
Medicine. (2017). Federal statistics, multiple data
sources, and privacy protection: Next steps. Washing-
ton, DC: The National Academies Press. doi:10.17226/24893
Vercruyssen, A., Roose, H., Carton, A., & van de Putte, B.
(2014). The effect of busyness on survey participation:
Being too busy or feeling too busy to cooperate? In-
ternational Journal of Social Research Methodology,
17(4), 357–371.
Vercruyssen, A., Roose, H., & van de Putte, B. (2011).
Underestimating busyness: Indications of nonresponse
bias due to work-family conflict and time pressure.
Social Science Research,40(6), 1691–1701.
Vercruyssen, A., van de Putte, B., & Stoop, I. (2011). Are
they really too busy for survey participation? The evo-
lution of busyness and busyness claims in Flanders?
Journal of Official Statistics, 27(4), 619–632.
Appendix A
Fieldwork materials
Linkage Consent Statement and Question Administered
to All Respondents
“Wir würden gerne bei der Auswertung der Befragung Auszüge aus Daten einbeziehen, die beim Institut für Arbeitsmarkt- und Berufsforschung der Bundesagentur für Arbeit in Nürnberg vorliegen. Dabei handelt es sich zum Beispiel um Informationen zu vorausgegangenen Zeiten der Beschäftigung und der Arbeitslosigkeit.

Zum Zweck der Zuspielung dieser Daten an die Interviewdaten möchte ich Sie herzlich um Ihr Einverständnis bitten. Dabei ist absolut sichergestellt, dass alle datenschutzrechtlichen Bestimmungen strengstens eingehalten werden. Ihr Einverständnis ist selbstverständlich freiwillig. Sie können es auch jederzeit wieder zurückziehen.

Sind Sie damit einverstanden?”
Benefit Arguments Administered to Respondents Assigned to the Treatment Conditions
Time savings:
“Um das Interview im Folgenden möglichst kurz zu halten …”
Improved study value:
“Die Aussagekraft dieser Studie lässt sich deutlich verbessern, wenn wir Ihre Angaben mit weiteren Daten ergänzen können.”
Survey Questions Used in Analysis
1. Number of children under the age of 14:
“Und nun zur Größe Ihres Haushaltes. Wie viele Personen leben in Ihrem Haushalt, Kinder und Sie selbst mit eingeschlossen? Dazu zählen auch Personen, die normalerweise im Haushalt wohnen, aber vorübergehend abwesend sind, aus beruflichen oder persönlichen Gründen. Nicht dazu zählen dagegen Mitbewohner aus Wohngemeinschaften, Untermieter oder Hausangestellte.”
Anzahl der Personen ☐☐
[If Filter Q > 1] “Und wie viele Personen davon sind Kinder unter 14 Jahren?”
Anzahl der Personen ☐☐
2. Actual weekly working hours:
“Und wie viele Stunden arbeiten Sie im Durchschnitt tatsächlich pro Woche? Bitte berücksichtigen Sie nun auch regelmäßig geleistete bezahlte und unbezahlte Überstunden, Mehrarbeit, Vor- und Nacharbeitszeiten, Bereitschaftsdienste sowie Arbeit von zu Hause oder unterwegs.”
Stunden ☐☐☐
3. Education:
“Welchen höchsten allgemeinbildenden Schulabschluss haben Sie?”
☐ Keinen Abschluss beziehungsweise noch keinen Abschluss
☐ Hauptschulabschluss, Volksschulabschluss
☐ Mittlere Reife, Realschulabschluss, Fachschulreife, POS
☐ Fachhochschulreife, Abschluss einer Fachoberschule
☐ Abitur, Hochschulreife, EOS, Berufsausbildung mit Abitur
4. Age:
“Bevor wir mit dem eigentlichen Interview beginnen, sagen Sie mir bitte zunächst, wann Sie geboren sind! Nennen Sie mir dazu bitte den Monat und das Jahr.”
Monat ☐☐
Jahr ☐☐☐☐
5. Immigrant background (1st/2nd generation):
“Sind Sie in Deutschland geboren?”
☐ Ja
☐ Nein
“Ist Ihr Vater in Deutschland geboren?”
☐ Ja
☐ Nein
“Ist Ihre Mutter in Deutschland geboren?”
☐ Ja
☐ Nein
6. Number of workers’ rights withheld by employer:
This variable was generated using three survey questions (see below) referring to paid vacation, paid sick leave, and paid public holidays, which are all legal rights of employees in Germany. For each question, a binary indicator was generated indicating whether the specified worker right was illegally withheld (1) or not (0) from the respondent. An additive index of the number of workers’ rights withheld was then generated by summing the three binary indicators. More details about the variable generation process can be found in Fischer et al. (2015, p. 319). A schematic illustration of this coding appears after the question texts below.
(a) Paid vacation:
“Erhalten Sie in bezahlten Urlaub?”
☐ Ja
☐ Nein
[If Filter Q = “Nein”]
“Aus welchem Grund erhalten Sie keinen bezahlten Urlaub? Sollten mehrere der folgenden Gründe auf Sie zutreffen, nennen Sie uns bitte den Hauptgrund. Erhalten Sie keinen bezahlten Urlaub …”
☐ Weil Sie noch nicht lange genug im Betrieb beschäftigt sind
☐ Weil Ihnen in Ihrer Tätigkeit kein bezahlter Urlaub zusteht
☐ Weil die Personal- oder Auftragslage momentan keinen Urlaub zulässt
☐ Weil die Personal- oder Auftragslage dauerhaft keinen Urlaub zulässt
(b) Paid sick leave:
“Wenn Sie sich bei Ihrem Arbeitgeber krank melden, welche Auswirkungen hat dies? Ich lese Ihnen verschiedene Möglichkeiten vor, bitte sagen Sie mir welche für Sie zutreffen.”
☐ Sie bekommen für die Zeit der Krankheit Ihren regulären Lohn
☐ Sie müssen die Zeit der Krankheit unentgeltlich nacharbeiten
☐ Sie müssen die Zeit der Krankheit über Ihr Arbeitszeit- oder Urlaubskonto ausgleichen
☐ Sie müssen eine Vertretung organisieren, die Ihre Arbeit übernimmt
(c) Paid public holidays:
“Was geschieht, wenn einer Ihrer Arbeitstage auf einen gesetzlichen Feiertag fällt und Sie deshalb arbeitsfrei haben? Ich lese Ihnen wieder mehrere Antwortmöglichkeiten vor.”
☐ Sie erhalten für den Tag ganz normal Ihren Lohn
☐ Die Arbeitszeit wird Ihnen wie an einem Arbeitstag gutgeschrieben
☐ Sie müssen die ausgefallene Arbeitszeit unentgeltlich vor- oder nacharbeiten
☐ Die ausgefallene Arbeitszeit wird von Ihrem Arbeitszeit- oder Urlaubskonto abgezogen
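The coding of this additive index can be illustrated with a minimal sketch. The column names and response codes below are hypothetical placeholders rather than the actual variable names or codes in the survey data, and the assignment of specific response options to the "withheld" indicators is illustrative only; the exact coding rules are documented in Fischer et al. (2015).

```python
import pandas as pd

# Illustrative respondent-level data; column names and response codes are hypothetical.
df = pd.DataFrame({
    "paid_vacation":  ["ja", "nein", "ja"],                                    # (a) receives paid vacation?
    "sick_leave":     ["regulaerer_lohn", "nacharbeiten", "regulaerer_lohn"],  # (b) consequence of reporting sick
    "public_holiday": ["lohn", "abgezogen", "gutgeschrieben"],                 # (c) public holiday on a workday
})

# One binary indicator per question: 1 = right withheld, 0 = right granted.
# (Which response options count as "withheld" is an assumption in this sketch.)
df["vacation_withheld"] = (df["paid_vacation"] == "nein").astype(int)
df["sick_leave_withheld"] = (df["sick_leave"] != "regulaerer_lohn").astype(int)
df["holiday_withheld"] = (~df["public_holiday"].isin(["lohn", "gutgeschrieben"])).astype(int)

# Additive index (0-3): number of workers' rights withheld, as described above.
df["n_rights_withheld"] = df[["vacation_withheld",
                              "sick_leave_withheld",
                              "holiday_withheld"]].sum(axis=1)

# Dichotomized version as used in Table B1 (0 vs. 1 or more).
df["any_right_withheld"] = (df["n_rights_withheld"] >= 1).astype(int)
```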
Appendix B
Compositional Distribution of Respondents Allocated to Each Experimental Condition
Table B1
Distribution of Respondent Characteristics Within Experimental Conditions.
                                                  Control          Time savings     Improved study value
Respondent characteristics                        %       SE       %       SE       %       SE

Control variables
Age in years (χ² = 9.298; df = 6; p = 0.368)
  15–32                                           17.01   0.83     18.83   0.88     19.21   0.83
  33–45                                           26.32   1.03     27.17   1.02     27.70   1.09
  46–53                                           30.77   1.04     28.43   1.25     27.73   1.16
  54–84                                           25.90   1.06     25.58   1.10     25.36   1.06
Sex (χ² = 1.850; df = 2; p = 0.491)
  Female                                          49.32   1.29     51.19   1.15     49.97   1.12
  Male                                            50.68   1.29     48.81   1.15     50.03   1.12
Immigrant background (χ² = 2.926; df = 2; p = 0.329)
  Yes                                             18.21   0.84     19.31   0.84     20.11   1.05
  No                                              81.79   0.84     80.69   0.84     79.89   1.05

Hypothesized indicators
Actual weekly working hours (χ² = 0.834; df = 4; p = 0.951)
  25 or less                                      19.67   0.94     19.61   0.83     19.22   0.86
  25–39                                           25.15   1.01     24.85   0.98     25.92   1.07
  40 or more                                      55.17   1.31     55.54   1.19     54.85   1.36
# of children (χ² = 10.920; df = 4; p = 0.083)
  0                                               71.64   1.06     67.78   1.12     68.60   1.21
  1                                               15.75   0.91     17.71   0.98     16.12   0.96
  2 or more                                       12.61   0.85     14.51   0.87     15.27   0.95
Education (χ² = 10.558; df = 4; p = 0.135)
  Low                                             20.11   0.93     19.77   0.86     19.17   0.97
  Intermediate                                    39.55   1.22     40.03   1.14     36.70   1.32
  High                                            40.34   1.22     40.20   1.17     44.13   1.38
Number of workers’ rights withheld (χ² = 0.820; df = 2; p = 0.735)
  0                                               84.76   0.83     84.24   0.84     83.79   0.92
  1 or more                                       15.24   0.83     15.76   0.84     16.21   0.92

N                                                 2,564            2,580            2,417
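The χ² values reported in Table B1 are tests of independence between experimental condition and each respondent characteristic, serving as a randomization balance check. The following minimal sketch shows how such a check could be computed; the data frame and category labels are hypothetical, and the sketch uses a simple unweighted Pearson test, which may not exactly reproduce the published statistics.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical respondent-level data: experimental condition and an education category.
df = pd.DataFrame({
    "condition": ["control", "time_savings", "study_value"] * 4,
    "education": ["low", "intermediate", "high",
                  "high", "low", "intermediate",
                  "high", "low", "intermediate",
                  "high", "low", "high"],
})

# Cross-tabulate condition by characteristic and test for independence,
# analogous to the balance checks reported for each block of Table B1.
crosstab = pd.crosstab(df["condition"], df["education"])
chi2, p_value, dof, expected = chi2_contingency(crosstab)
print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p_value:.3f}")
```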
Appendix C
Behavior Coding of Interviewer-Respondent Interaction
To explore the possibility that interviewers failed to read
the benefit arguments (or other parts of the consent state-
ment) exactly as scripted in the questionnaire, we imple-
mented a small interviewer monitoring exercise in which
three study investigators visited the infas call center on four
separate days (November 5th, 6th, 25th and December 9th)
and listened to a set of interviews being conducted in real-
time while simultaneously coding specific behaviors of the
respondent-interviewer interaction. Interviewers are rou-
tinely monitored for quality control purposes, but none of the
interviewers were made aware that they would be observed
by one of the study investigators on a given day. The inves-
tigators coded interviewers on whether they read the linkage
consent statements/question exactly as scripted in the ques-
tionnaire and respondents were coded on their reaction to the
consent statements/question. A total of 49 interviews were
observed and coded. Given the small sample size, our inten-
tion is not to provide conclusive and generalizable estimates
of behaviors, but rather to provide a basic qualitative impres-
sion of whether the telephone interaction adversely affected
the implementation of the consent experiment.
The results of the interviewer coding exercise are
shown in Appendix Table C1. Overall, we do not find strong
evidence of interviewer deviance with respect to reading the
consent statement exactly as instructed. Interviewers read
the scripted consent statements/question verbatim in more
than two-thirds (n=35) of interviews, and only one of the
script deviations affected the benefit statement. These re-
sults (albeit limited) provide some reassurance that the ben-
efit wording experiment wasn’t compromised by a failure
to read the benefit statement as scripted. Despite correctly
reading the consent statements, there is still the possibility
that interviewers intervened and provided non-neutral feed-
back which could have influenced respondents’ decision to
consent. There were some (n=9) instances where respon-
dents hesitated to answer the consent question and this was
met with interviewers intervening with unscripted feedback
which included mentioning a non-scripted benefit of linkage.
This deviant behavior led to a positive consent decision in
about half (n=5) of the interviews. Although these obser-
vations indicate that a small number of interviewers deviated
from their instructions and provided non-neutral feedback to
hesitant respondents, this behavior occurred evenly across
the three experimental conditions (n=3 per condition). Thus,
we conclude from this small behavior coding exercise that
interviewer deviance was unlikely to be a factor in the benefit
framing experiment.
Table C1
Observed Behaviors Coded During the Interviewer-Respondent Interaction.

                                                        n        %
Experimental conditions
  Control                                              21     42.9
  Time savings                                         14     28.6
  Improved study value                                 14     28.6
Interviewer delivery of consent question
  Read script as worded                                35     71.4
  Changed script, but not benefit wording              13     26.5
  Changed script and benefit wording                    1      2.0
Respondent reaction to consent question
  R provided answer without hesitation                 31     63.2
  R hesitated, Interviewer gave neutral feedback        9     18.4
  R hesitated, Interviewer gave additional benefit      9     18.4
Total                                                  49    100.0