Project

Data collection using smartphones

Updates: 0 (0 new)
Recommendations: 0 (0 new)
Followers: 18 (0 new)
Reads: 120 (0 new)

Project log

Peter Lugtig
added 4 research items
The increasing volume of "Big Data" produced by sensors and smart devices can transform the social and behavioral sciences. Several successful studies have used digital data to provide new insights into social reality. This special issue argues that the true power of these data for the social sciences lies in connecting new data sources with surveys. While new digital data are rich in volume, they seldom cover the full population, nor do they provide insights into individuals' feelings, motivations, and attitudes. Conversely, survey data, while well suited for measuring people's internal states, are relatively poor at measuring behaviors and facts. Developing a methodology for integrating the two data sources can mitigate their respective weaknesses. Sensors and apps on smartphones are useful for collecting both survey data and digital data. For example, smartphones can track people's travel behavior and ask questions about its motives. A general methodology on augmenting surveys with data from sensors and apps is currently missing. Open issues include representativeness, processing, storage, data linkage, and how to combine survey data with sensor and app data to produce one statistic of interest. This editorial to the special issue on "Using Mobile Apps and Sensors in Surveys" provides an introduction to this new field, presents an overview of challenges and opportunities, and sets a research agenda. We introduce the four papers in this special issue that focus on these opportunities and challenges and provide practical applications and solutions for integrating sensor- and app-based data collection into surveys.
Invitation letters to web surveys often contain information on how long it will take to complete the survey. When the stated length in an invitation is short, it may help convince respondents to participate; when it is long, respondents may choose not to participate, and when the actual length exceeds the stated length there may be a risk of dropout. This paper reports on a Randomised Controlled Trial (RCT) embedded in a cross-sectional survey in the Netherlands. The RCT included different versions of the stated length of the survey and the inclusion of a Quick Response (QR) code as ways to signal to potential respondents whether the survey was short. Results from the RCT show no effects of the stated length on actual participation in the survey, nor do we find an effect on dropout. We do, however, find that inclusion of a QR code makes respondents more likely to use a smartphone, and we find some evidence for a different composition of our respondent sample in terms of age.
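As a minimal sketch of how such an experiment could be analysed (not the authors' code), a two-proportion z-test compares participation between two stated-length conditions; all counts below are hypothetical.

```python
# Hypothetical RCT analysis: does stated survey length affect participation?
from statsmodels.stats.proportion import proportions_ztest

started = [412, 398]    # assumed: respondents who started, short vs. long stated length
invited = [1000, 1000]  # assumed: invitations sent per condition

stat, pval = proportions_ztest(count=started, nobs=invited)
print(f"z = {stat:.2f}, p = {pval:.3f}")  # a non-significant p would match the null finding
```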
Advances in smartphone technology have given individuals access to near-continuous, highly precise location tracking. As the Travel Diary Study, the backbone of mobility research, has seen response rates decline over the years, researchers are looking to these mobile devices to bridge the gap between self-report recall studies and a person’s underlying travel behavior. This article details an open-source application that collects real-time location data which respondents may then annotate to provide a detailed travel diary. Results of the field test involving 674 participants are discussed, including technical performance, data quality, and response rate.
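The abstract does not specify the app's processing pipeline, but a common first step in turning raw location pings into an annotatable travel diary is stop detection. A minimal sketch, assuming pings as (timestamp, lat, lon) tuples and illustrative thresholds:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres between two WGS84 points.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6371000 * 2 * asin(sqrt(a))

def detect_stops(pings, radius_m=100, min_dwell_s=300):
    # A "stop" is a run of pings that stays within radius_m of its first ping
    # for at least min_dwell_s seconds; everything between stops is travel.
    # pings: list of (timestamp_s, lat, lon), sorted by time.
    stops, i = [], 0
    while i < len(pings):
        j = i
        while j + 1 < len(pings) and haversine_m(
                pings[i][1], pings[i][2], pings[j + 1][1], pings[j + 1][2]) <= radius_m:
            j += 1
        if pings[j][0] - pings[i][0] >= min_dwell_s:
            stops.append((pings[i][0], pings[j][0], pings[i][1], pings[i][2]))
        i = j + 1
    return stops
```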
Peter Lugtig
added a research item
Online surveys are increasingly completed on smartphones. There are several ways to structure online surveys so as to create an optimal experience for any screen size. For example, communicating through applications (apps) such as WhatsApp and Snapchat closely resembles natural turn-by-turn conversations between individuals. Web surveys, by contrast, currently mostly mimic the design of paper questionnaires, leading to a survey experience that may not be optimal when completed on a smartphone. In this paper, we compare a research messenger design, which mimics a messenger-app type of communication, to a responsive survey design. We investigate whether response quality is similar between the two designs and whether respondents' satisfaction with the survey is higher for either version. Our results show no differences in primacy effects, the number of nonsubstantive answers, or dropout rates. Answers to open-ended questions were shorter in the research messenger survey than in the responsive design, and overall completion time was longer in the research messenger survey. The evaluation at the end of the survey showed no clear indication that respondents liked the research messenger survey more than the responsive design. Future research should focus on how to optimally design online mixed-device surveys in order to increase respondent satisfaction and data quality.
Peter Lugtig
added a research item
In this paper we discuss the implications of using mobile devices for online survey completion. With more and more people accessing online surveys on mobile devices, online surveys need to be redesigned to accommodate the characteristics of mobile device usage, such as small screens and short messaging. We discuss mobile-friendly design by focusing on survey layout, survey length, special features, and the decision to make the survey app- or browser-based. Further, we discuss the different sensors that can be used to augment or replace survey questions, and respondents’ willingness to share sensor data. We end with three examples of surveys conducted by Statistics Netherlands, in which sensors are used for active and passive measurement in mobile surveys.
Peter Lugtig
added a research item
Smartphones enable passive collection of sensor data alongside survey participation. Location data add context to people’s reports about their time use. In addition, linking global positioning system data to self-reported time use surveys (TUSs) can be valuable for understanding how people spend their time. This article investigates whether and how passive collection of geographical locations (coordinates) proves useful for deriving respondents’ functional locations. Participants of the ongoing Children of Immigrants Longitudinal Survey in the Netherlands were invited to participate in a TUS administered with a smartphone app that also unobtrusively tracked respondents’ locations. Respondents reported their activities per 10-min interval in a smartphone diary app (n = 1,339) and shared their geographical location data (n = 1,264). The correspondence between the functional locations derived from the time use data and those derived from the geographical location data was assessed by calculating the percentage of intervals in which both measures are similar. Overall, results show that home locations can be automatically assigned reliably but that respondent information is required to reliably assign work or school locations. In addition, location tracking data contain many measurement errors, making it difficult to record valid locations. Multilevel models show that the variability in correct classifications is intrapersonal and largely predicted by phone type, which determines location measurement frequency.
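The correspondence measure described above reduces to a simple share of matching intervals. A sketch under assumed labels (the paper's actual coding scheme may differ):

```python
def interval_agreement(diary, gps):
    # diary, gps: equal-length lists of location labels per 10-minute interval;
    # None marks intervals where no valid GPS-derived location is available.
    scored = [(d, g) for d, g in zip(diary, gps) if g is not None]
    return sum(d == g for d, g in scored) / len(scored) if scored else float("nan")

diary = ["home", "home", "travel", "work", "work", "work"]
gps   = ["home", "home", "travel", "work", None,   "home"]
print(interval_agreement(diary, gps))  # 0.8: 4 of the 5 scored intervals agree
```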
Peter Lugtig
added a research item
In this study, we investigate whether mobile device use in surveys can be predicted. We aim to identify possible motives for device use and build a model drawing on theory from technology acceptance research and survey research. We then test this model with a Structural Equation Modeling approach using data from seven waves of the GESIS Panel. We test whether our theoretical model fits the data by examining measures of fit and by studying the standardized effects in the model. Results reveal that intention to use a particular device predicts actual use quite well. Ease of smartphone use is the most meaningful variable: people who use a smartphone for specific tasks are also more likely to intend to use a smartphone for survey completion. In conclusion, investing in the ease of mobile survey completion could encourage respondents to use mobile devices, first and foremost by building well-designed surveys for mobile devices.
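The paper fits a full structural equation model; as a deliberately simplified sketch of the core prediction, a logistic regression of actual smartphone use on intention and ease of use could look as follows (variable names and data are invented):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Invented mini-extract: one row per respondent, 1-5 agreement scales.
df = pd.DataFrame({
    "used_smartphone": [1, 0, 1, 1, 0, 1, 0, 0],
    "intention":       [5, 2, 4, 3, 4, 5, 1, 3],
    "ease_of_use":     [4, 3, 5, 2, 4, 5, 1, 3],
})

fit = smf.logit("used_smartphone ~ intention + ease_of_use", data=df).fit(disp=0)
print(fit.params)  # positive coefficients would mirror the reported effects
```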
Peter Lugtig
added a research item
A sizable minority of all web surveys are nowadays completed on smartphones. People who choose a smartphone for Internet-related tasks are different from people who mainly use a PC or tablet; smartphone use is particularly high among the young and urban. We have to make web surveys attractive for smartphone completion in order not to lose these groups. In this paper we study how to encourage people to complete surveys on smartphones so as to attract hard-to-reach subgroups of the population. We experimentally test new features of a smartphone-friendly design: two versions of an invitation letter to a survey, a new questionnaire layout, and autoforwarding. The goal of the experiment is to evaluate whether the new survey design attracts more smartphone users, leads to a better survey experience on smartphones, and results in more respondents signing up to become a member of a probability-based online panel. Our results show that the invitation letter that emphasizes the possibility of smartphone completion does not yield a higher response rate than the control condition, nor do we find differences in the socio-demographic background of respondents. We do find that slightly more respondents choose a smartphone for survey completion. The changes in the layout of the questionnaire do lead to a change in survey experience on the smartphone: smartphone respondents need 20% less time to complete the survey when the questionnaire includes autoforwarding. However, we do not find that respondents evaluate the survey better, nor are they more likely to become a member of the panel when asked at the end of the survey. We conclude with a discussion of autoforwarding in web surveys and methods to attract smartphone users to web surveys.
Peter Lugtig
added a research item
With the rise of mobile surveys comes the need for shorter questionnaires. We investigate the modularization of an existing questionnaire in the LISS Panel in the Netherlands. We randomly divided respondents over three conditions: a normal-length survey, the same survey split into 3 parts, and the survey split into 10 parts. Respondents received the parts consecutively at regular intervals over a 1-month period. We discuss response rates, data quality measures, and respondents’ evaluation of the questionnaire. Our results indicate higher start rates when the survey is cut into smaller parts, but also higher dropout rates. However, the fraction of missing information is lower in the 3- and 10-part conditions. More respondents use their mobile phone for survey completion when the survey is shorter. We find less item nonresponse and satisficing in shorter surveys. We find no effect on neutral and extreme responding, nor on estimates of the validity of answers. Both low- and high-educated respondents, as well as both young and old respondents, evaluate the shorter surveys better than the normal-length survey.
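As an illustrative sketch of the modular design (not the LISS implementation), a questionnaire can be cut into k consecutive parts and the invitations spaced evenly over a month:

```python
from datetime import date, timedelta

def modularize(questions, k):
    # Split a question list into k consecutive modules of near-equal size.
    base, extra = divmod(len(questions), k)
    modules, start = [], 0
    for i in range(k):
        size = base + (1 if i < extra else 0)
        modules.append(questions[start:start + size])
        start += size
    return modules

def invitation_dates(k, first_day, period_days=30):
    # Evenly spaced invitation dates for k modules within period_days.
    return [first_day + timedelta(days=i * (period_days // k)) for i in range(k)]

questions = [f"Q{i}" for i in range(1, 61)]  # hypothetical 60-item questionnaire
for module, day in zip(modularize(questions, 10), invitation_dates(10, date(2018, 3, 1))):
    print(day, module[0], "...", module[-1])
```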
Anne Elevelt
added a research item
The increasing use of smartphones opens up opportunities for novel ways of survey data collection, but also poses new challenges. Collecting more and different types of data means that studies can become increasingly intrusive; we risk over-asking participants, leading to nonresponse. This study documents nonresponse and nonresponse bias in a smartphone-only version of the Dutch Time Use Survey (TUS). Respondents from the Dutch LISS panel were asked to perform five sets of tasks to complete the whole TUS: 1) accept the invitation to participate in the study and install an app, 2) fill out a questionnaire on the web, 3) complete the time use diary on their smartphone, 4) answer pop-up questions, and 5) give permission to record sensor data (GPS locations and call data). Results show that 42.9% of invited panel members responded positively to the invitation to participate in a smartphone survey. However, only 28.9% of these willing panel members completed all stages of the study. Predictors of nonresponse differ somewhat at every stage. In addition, respondents who complete all smartphone tasks differ from the groups who do not participate at some or any stage of the study. Using data collected in previous waves, we show that nonresponse leads to nonresponse bias in estimates of time use. We conclude by discussing implications for using smartphone apps in survey research.
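Note that the 28.9% is a share of the willing respondents, so the stages compound; a quick back-of-the-envelope calculation (with an assumed invitation count) makes the funnel explicit:

```python
invited = 1000                   # assumed number of invited panel members
agreed = invited * 0.429         # 42.9% responded positively to the invitation
completed_all = agreed * 0.289   # 28.9% of the willing completed every stage

print(f"agreed: {agreed:.0f}")                       # ~429
print(f"completed all stages: {completed_all:.0f}")  # ~124, i.e. ~12.4% of invited
```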
Peter Lugtig
added 4 research items
This article reports on a pilot study conducted in a probability-based online panel in the Netherlands. Two parallel surveys were fielded: one in the traditional questionnaire layout of the panel, and one optimized for mobile completion with new software that uses a responsive design (the layout adapts to the device chosen). In the latter, respondents could choose whether to complete the survey on their mobile phone or on a regular desktop. Results show that a substantial share of respondents (57%) used their mobile phone for survey completion. No differences were found between mobile and desktop users with regard to break-offs, item nonresponse, time to complete the survey, or response effects such as the length of answers to an open-ended question and the number of responses in a check-all-that-apply question. A considerable number of respondents gave permission to record their GPS coordinates, which are helpful in determining where the survey was taken. Income, household size, and household composition were found to predict mobile completion. In addition, younger respondents, who typically form a hard-to-reach group, show higher mobile completion rates.
Respondents in an Internet panel survey can often choose which device they use to complete questionnaires: a traditional PC, laptop, tablet computer, or smartphone. Because these devices differ in screen size and mode of data entry, measurement errors may differ between devices. Using data from the Dutch Longitudinal Internet Studies for the Social sciences (LISS) panel, we evaluate which devices respondents use over time. We study the measurement error associated with each device and show that measurement errors are larger on tablets and smartphones than on PCs. To gain insight into the causes of these differences, we study changes in measurement error over time associated with a switch of devices over two consecutive waves of the panel. We show that, within individuals, measurement errors do not change with a switch in device. We therefore conclude that the higher measurement error on tablets and smartphones is associated with self-selection of the sample into using a particular device.
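A minimal sketch of that within-person comparison, with invented column names and error scores: if the device itself caused the extra error, the wave-to-wave change in error should be larger for respondents who switched devices than for those who stayed on the same device.

```python
import pandas as pd

# Invented two-wave extract: per-person device and a measurement-error score.
df = pd.DataFrame({
    "person": [1, 1, 2, 2, 3, 3],
    "wave":   [1, 2, 1, 2, 1, 2],
    "device": ["pc", "smartphone", "pc", "pc", "smartphone", "smartphone"],
    "error":  [0.20, 0.22, 0.19, 0.18, 0.35, 0.34],
})

wide = df.pivot(index="person", columns="wave", values=["device", "error"])
switched = wide[("device", 1)] != wide[("device", 2)]
delta = wide[("error", 2)] - wide[("error", 1)]
print(delta.groupby(switched).mean())  # similar means -> selection, not the device
```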
Survey research is changing at a more rapid pace than ever before, and the continuous and exponential growth in technological developments is not likely to slow down. Online surveys are now being completed on a range of different devices: PCs, laptops, tablets, mobile phones, or hybrids between these devices. Each device varies in screen size, mode of operation, and technological possibilities. We define online surveys that are in practice completed on different devices as mixed-device surveys. This special issue discusses issues in the design and implementation of mixed-device surveys, with the aim of bringing survey research to the next level: in our view, all web surveys should from now on be thought of as mixed-device surveys. Theory and best practices for mixed-device surveys are still in their infancy. The current state of knowledge about the dynamics of taking surveys on mobile devices is not as advanced as necessary in times of rapid change. While current technology opens great possibilities to collect data via text, apps, and visuals, little scientific research has been published about the actual uses and best practices of these applications to increase data quality. Researchers, and survey methodologists in particular, need to find ways to keep up with fast-changing technologies.