accuracy and precision of the whole experimental setup (web
application, web browser, operating system, and hardware).
These best practices should also be incorporated without delay into
curricula in psychology and other behavioral and social sciences,
because students often conduct web-based experiments and will be
the researchers of the future (Krantz & Reips, 2017).
Because the proposed web techniques had not been assessed
in previous studies on the accuracy of web applications under
high-resolution timing requirements (de Leeuw & Motz, 2016;
Garaizar, Vadillo, & López-de-Ipiña, 2014; Reimers & Stewart,
2015), the studies and detailed guidelines presented in this article
can help behavioral researchers take them into account when
developing their web-based experiments.
In the early days of Internet-based experimenting, technology
was simpler, and the effects of new technologies were easier for
researchers to spot. In fact, one of us (Reips) has long advocated
a "low-tech principle" in creating Internet-based research studies,
because, early on, technology was shown to interfere with
participants' behavior in Internet-based experiments. For example,
Schwarz and Reips (2001) created the very same web experiment both
with server-side (CGI) and with client-side (JavaScript) technology and
observed significantly larger and increasing dropout rates in
the latter version. Buchanan and Reips (2001) further
established that technology preferences depend on a partici-
pant’s personality and may thus indirectly bias sample compo-
sition, and consequently behavior, in Internet-based research
studies (even though this seems to be less the case for different
operating systems on smartphones; see Götz, Stieger, & Reips,
2017). Modern web browsers have evolved to handle a much
wider range of technologies that, on the one hand, are capable of
much greater accuracy and precision in the control of loading and
rendering content than were earlier browsers but, on the other hand,
are increasingly likely to fall victim to insufficient optimization of
their complexity. Unbeknownst to many researchers, vendors of web
browsers implement a multitude of technologies that are geared
toward optimizing goals (e.g., speed) that are not in line with those
of science (e.g., quality, timing). In the present article we have
empirically shown that this conflict affects display and timing in
Internet-based studies, and we have provided recommendations and
scripts that researchers can and should use to optimize their studies.
Alternatively, they might follow the "low-tech principle" as much as
possible, to minimize interference; this may be the only general rule
of thumb we are able to offer as an outcome of the empirical
investigation presented here.
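One concrete way to check for such browser-side interference is to probe the effective resolution of the high-resolution timer an experiment relies on: some browser vendors deliberately coarsen performance.now() for security reasons (e.g., clamping it to 100 µs; cf. Kyöstilä, 2018), which bounds the precision of any measured stimulus duration or reaction time. The following sketch (the helper name timerResolution is ours, not one of the scripts evaluated in this article) estimates the smallest observable tick of the clock:

```javascript
// Estimate the effective resolution of performance.now() in the
// current environment by finding the smallest nonzero difference
// between two successive readings of the clock.
function timerResolution(samples = 1000) {
  let minDelta = Infinity;
  for (let i = 0; i < samples; i++) {
    const t0 = performance.now();
    let t1 = performance.now();
    while (t1 === t0) t1 = performance.now(); // spin until the clock ticks
    const delta = t1 - t0;
    if (delta < minDelta) minDelta = delta;
  }
  return minDelta; // smallest observable tick, in milliseconds
}

console.log(`Effective timer resolution: ${timerResolution()} ms`);
```

Running this in the console of each target browser before data collection reveals whether and how strongly timestamps are coarsened in that setup, and therefore the lower bound on the precision of any timing measurement taken with that clock.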
Author note Support for this research was provided by the
Departamento de Educación, Universidades e Investigación
of the Basque Government (Grant No. IT1078-16) and by
the Committee on Research at the University of Konstanz.
The authors declare that there was no conflict of interest in
the publication of this study.
References
Bamberg, W. (2018b). Animating CSS properties. MDN web docs.
Retrieved from https://developer.mozilla.org/en-US/docs/Tools/
Barnhoorn, J. S., Haasnoot, E., Bocanegra, B. R., & van Steenbergen, H.
(2015). QRTEngine: An easy solution for running online reaction
time experiments using Qualtrics. Behavior Research Methods, 47,
Belshe, M., Peon, R., & Thomson, M. (2015). Hypertext Transfer Protocol
Version 2 (HTTP/2). Retrieved from https://http2.github.io/http2-spec/
Birnbaum, M. H. (2004). Human research and data collection via the
Internet. Annual Review of Psychology, 55, 803–832. https://doi.
Buchanan, T., & Reips, U.-D. (2001). Platform-dependent biases in on-
line research: Do Mac users really think different? In K. J. Jonas, P.
Breuer, B. Schauenburg, & M. Boos (Eds.), Perspectives on Internet
research: Concepts and methods. Available at http://www.uni-
Accessed 26 Sept 2018
Garaizar, P., Vadillo, M. A., & López-de-Ipiña, D. (2014). Presentation
accuracy of the web revisited: Animation methods in the HTML5
era. PLoS ONE, 9, e109812. https://doi.org/10.1371/journal.pone.
Götz, F. M., Stieger, S., & Reips, U.-D. (2017). Users of the main
smartphone operating systems (iOS, Android) differ only little in
personality. PLoS ONE, 12, e0176921. https://doi.org/10.1371/
Grigorik, I., & Weiss, Y. (2018). W3C Preload API. Retrieved from
Henninger, F., Mertens, U. K., Shevchenko, Y., & Hilbig, B. E. (2017).
lab.js: Browser-based behavioral research. https://doi.org/10.5281/
Honing, H., & Reips, U.-D. (2008). Web-based versus lab-based studies:
A response to Kendall (2008). Empirical Musicology Review, 3, 73–
Krantz, J., & Reips, U.-D. (2017). The state of web-based research: A
survey and call for inclusion in curricula. Behavior Research
Methods,49, 1621–1629. https://doi.org/10.3758/s13428-017-0882-x
Kyöstilä, S. (2018). Clamp performance.now() to 100us. Retrieved from
de Leeuw, J. R. (2015). jsPsych: A JavaScript library for creating behavioral
experiments in a Web browser. Behavior Research Methods, 47, 1–12.
de Leeuw, J. R., & Motz, B. A. (2016). Psychophysics in a Web browser?
Comparing response times collected with JavaScript and Psychophysics
Toolbox in a visual search task. Behavior Research Methods, 48, 1–12.
Lewis, P. (2018). Rendering performance. Retrieved from https://
Mangan, M., & Reips, U.-D. (2007). Sleep, sex, and the Web: Surveying
the difficult-to-reach clinical population suffering from sexsomnia.
Behavior Research Methods, 39, 233–236. https://doi.org/10.3758/
Mozilla. (2018). Concurrency model and Event Loop. MDN web docs.
Retrieved from https://developer.mozilla.org/en-US/docs/Web/
Musch, J., & Reips, U.-D. (2000). A brief history of Web experimenting.
In M. H. Birnbaum (Ed.), Psychological experiments on the Internet
(pp. 61–88). San Diego: Academic Press. https://doi.org/10.1016/