Erratum to: "Social desirability effects on computerized and paper-and-pencil questionnaires" [Computers in Human Behavior 23 (2007) 463–477]
Stephanie Booth-Kewley a,*, Gerald E. Larson a, Dina K. Miyoshi b
a Naval Health Research Center, Behavioral Science and Epidemiology Department, P.O. Box 85122, San Diego, CA 92186-5122, USA
b San Diego Mesa College, Psychology Department, 7250 Mesa College Dr., San Diego, CA 92111, USA
Available online 20 October 2006
It has come to the publisher’s attention that there was an error in the final paragraph of
the ‘Results’ section of this paper. Readers please note that the text should read:
The average amount of missing data was 1.16 items for the computer respondents and 0.35 items for the paper-and-pencil respondents.
0747-5632/$ - see front matter Published by Elsevier Ltd.
DOI of original article: 10.1016/j.chb.2004.10.020
E-mail address: firstname.lastname@example.org (S. Booth-Kewley).
Computers in Human Behavior 23 (2007) 2093