
RAPTOR (Research in Applied Psychology for Teaching: Open Repository)

Author:
  • Jens Nachtwei (Humboldt University Berlin / University of Applied Management / IQP)

Abstract

This slide set covers important facts about all stages of the research process in applied psychological research. It is based on a collection of scientific sources for further reading, with one slide dedicated to each source. In addition, the slide set links to a collection of several hundred further sources on philosophy of science, the research process, methodology and statistics; this collection is continuously updated. While there is a saying that professors would rather share their toothbrushes than their slides, I firmly believe that this attitude does not get us anywhere. Therefore: feel free to use the slides for your own teaching and for the supervision of your students. I am very happy to receive feedback, corrections and additions: jens.nachtwei@hu-berlin.de
RAPTOR
Research in Applied Psychology
for Teaching: Open Repository
Jens Nachtwei, v1.0, 2023
© Jens Nachtwei
Use, modification and distribution of the slides only with the written permission of the copyright holder.
pictures: pixabay
picture: pixabay
2
picture: pixabay
Roadmap
1. About us
2. Why this session?
3. Our topics
4. How to crack it?
5. Q & A
© Jens Nachtwei
3
About us
picture: pixabay
(1)
About Jens Nachtwei
Born 1979 in Stralsund, raised in East Berlin, married, father of two
Psychologist (Diploma) since 2006 (HU Berlin)
Predoc, Engineering Psychology 2006 - 2010 (HU Berlin)
Postdoc, Social & Organisational Psychology 2011 - 2022 (HU Berlin)
Professor of Business Psychology, University of Applied Management since 2012
Guest Professor of Engineering Psychology, HU Berlin 2022 - 2025
Founder & CEO, IQP (University Spin-off, formerly GCC) since 2007
Main interest: future of work (driven by automation)
4
© Jens Nachtwei
picture: © Jens Nachtwei
What about you?
5
© Jens Nachtwei
picture: pixabay; DALL:E
6
Why this session?
picture: pixabay
(2)
broad and critical orientation
You have your thesis ahead of you.
The world of research is big + challenging!
So the objectives of this session are:
You know the standards & trends.
You think critically and interdisciplinarily.
You gain orientation for your thesis.
You enjoy the contents and discussion!
7
© Jens Nachtwei
picture: DALL:E
8
picture: pixabay
(3) Our topics
overview
Overall: philosophy of science, process
Introduction: literature review, theory
Question: research question, hypotheses
Methods: overview, sample, examples
Results: descriptives, tests, significance, effects
Discussion: implications, biases, reflection
Third Mission: impact, role of researcher
Publication: writing, writing, writing
9
© Jens Nachtwei
OVERVIEW
picture: DALL:E
Meet RAPTOR
(Research in Applied Psychology
for Teaching: Open Repository)
10
© Jens Nachtwei
picture: DALL:E
visit RAPTOR
citation counts retrieved from ResearchGate, 12/22 - 01/23
philosophy of science,
research process
What holds the world of science together
at its core?
What should a research process ideally
look like and what is the reality?
How can the general principles be applied
to psychology as a discipline?
11
© Jens Nachtwei
OVERALL
picture: DALL:E
Click on the title of the article on the following slides to go to the article's ResearchGate page.
The Rules of the Game Called
Psychological Science
“... dominant rule …: collect results that are
statistically significant.”
“... typical studies are insufficiently powerful
“high rates of false positives
“need for sufficiently powerful replications and
changes in journal policies.”
12
© Jens Nachtwei
picture: DALL:E; google fonts
Bakker, M., van Dijk, A., & Wicherts, J.
M. (2012). The Rules of the Game
Called Psychological Science.
Perspectives on Psychological
Science, 7(6), 543–554.
https://doi.org/ 10.1177/17456916124590
60
p. 543
article citations:
626
journal impact factor:
8.190
author(s) to follow:
Jelte M. Wicherts
Degrees of Freedom in Planning,
Running, Analyzing, and
Reporting Psychological Studies:
“many choices that are often arbitrary
“... we present an extensive list of 34 degrees of
freedom that researchers have in formulating
hypotheses, and in designing, running, analyzing,
and reporting of psychological research.”
“... a checklist to assess the quality of
preregistrations …”
13
© Jens Nachtwei
picture: DALL:E; google fonts
Wicherts, J. M., Veldkamp, C. L. S.,
Augusteijn, H. E. M., Bakker, M., van
Aert, R. C. M., & van Assen, M. A. L. M.
(2016). Degrees of Freedom in
Planning, Running, Analyzing, and
Reporting Psychological Studies: A
Checklist to Avoid p-Hacking.
Frontiers in Psychology, 7(e124),
Article 1832.
https://doi.org/ 10.3389/fpsyg.2016.018
32
p. 1
article citations:
459
journal impact factor:
4.232
author(s) to follow:
Jelte M. Wicherts
Ideology in Work and
Organizational Psychology: The
Responsibility of the Researcher
“... (1) when, where and how does neoliberalism
manifest in society and our work as Work and
Organizational Psychologists, ...
“(2) what is our duty as work and organizational
psychologists towards society and our own work,
…“
14
© Jens Nachtwei
picture: DALL:E; google fonts
Dóci, E., & Bal, P. M. (2018). Ideology in
Work and Organizational
Psychology: The Responsibility of the
Researcher. European Journal of
Work and Organizational
Psychology, 27(5), 558-560.
https://doi.org/ 10.1080/1359432X.2018.
1515201
p. 558
article citations:
12
journal impact factor:
3.968
author(s) to follow:
P. Matthijs Bal
Actionable Recommendations
for Narrowing the Science-
Practice Gap in Open Science
“... open-science practices … driven by a need to
reduce questionable research practices…
“... we offer 10 actionable recommendations for
narrowing the specific science-practice gap in
open science.”
“we hope to encourage more transparent,
credible, and reproducible research …”
15
© Jens Nachtwei
picture: DALL:E; google fonts
Aguinis, H., Banks, G. C., Rogelberg, S.
G., & Cascio, W. F. (2020).
Actionable Recommendations for
Narrowing the Science-Practice Gap
in Open Science. Organizational
Behavior and Human Decision
Processes, 158, 27-35.
https://doi.org/ 10.1016/j.obhdp.2020.0
2.007
p. 27
article citations:
38
journal impact factor:
4.941
author(s) to follow:
Wayne F. Cascio
The Natural Selection of Bad
Science
“The persistence of poor methods results partly
from incentives that favor them, leading to the
natural selection of bad science.”
“... successful labs produce more "progeny", such
that their methods are more often copied …
Selection for high output leads to poorer methods
and increasingly high false discovery rates.
16
© Jens Nachtwei
picture: DALL:E; google fonts
Smaldino, P. E., & McElreath, R. (2016).
The Natural Selection of Bad Science.
Royal Society Open Science, 3, Article
160384.
http://doi.org/ 10.1098/rsos.160384
p. 1
article citations:
569
journal impact factor:
3.653
author(s) to follow:
Paul E. Smaldino
literature review, theory
How do I even find sound and appropriate
literature sources?
How exactly do I go about summarising
the literature in my topic area?
What is the value of theories and how
should I integrate them into my thesis?
17
© Jens Nachtwei
INTRODUCTION
picture: DALL:E
Click on the title of the article on the following slides to go to the article's ResearchGate page.
Which Academic Search Systems
are Suitable for Systematic
Reviews …”
“... sample selection of relevant studies
determines a review's outcome, validity, and
explanatory power.”
“This study investigates and compares the
systematic search qualities of 28 widely used
academic search systems, …”
18
© Jens Nachtwei
picture: DALL:E; google fonts
Gusenbauer, M., & Haddaway, N. R.
(2020). Which Academic Search
Systems are Suitable for Systematic
Reviews or Meta-Analyses?
Evaluating Retrieval Qualities of
Google Scholar, PubMed, and 26
Other Resources. Research Synthesis
Methods, 11(2), 181– 217.
https://doi.org/ 10.1002/jrsm.1378
p. 181
article citations:
487
journal impact factor:
9.308
author(s) to follow:
Michael Gusenbauer
Most Frequently Cited Sources,
Articles, and Authors in
Industrial-Organizational Psy.
“We reviewed and analyzed the 6,654 unique
items (e.g., journal articles, book chapters) …
cited in six popular I-O psychology textbooks.”
“... 39% of the top-cited sources are not
traditional academic peer-reviewed journals, 77%
of the top-cited articles were published in
crossdisciplinary journals …”
19
© Jens Nachtwei
picture: DALL:E; google fonts
Aguinis, H., Ramani, R., Campbell, P.,
Bernal-Turnes, P., Drewry, J., &
Edgerton, B. (2017). Most Frequently
Cited Sources, Articles, and Authors
in Industrial-Organizational
Psychology Textbooks: Implications
for the Science–Practice Divide,
Scholarly Impact, and the Future of
the Field. Industrial and
Organizational Psychology, 10(4),
507-557.
https://doi.org/ 10.1017/iop.2017.69
p. 507
article citations:
43
journal impact factor:
9.375
author(s) to follow:
Herman Aguinis
Analyzing the Past to Prepare for
the Future: Writing a Literature
Review
“... we indicate the broad structure of a review
paper and provide several suggestions on
executing your review.”
“We reflect on … what should be included in the
introduction to your paper? … how can you justify a
proposition?). … we provide examples …”
20
© Jens Nachtwei
picture: DALL:E; google fonts
Webster, J., & Watson, R. T. (2002).
Analyzing the Past to Prepare for the
Future: Writing a Literature Review.
MIS Quarterly, 26(2), xiii–xxiii.
http://www.jstor.org/stable/4132319
p. xv
article citations:
6,876
journal impact factor:
7.198
author(s) to follow:
Richard T. Watson
Advancing Theory with Review
Articles
“We propose a non‐exhaustive set of avenues for
developing theory with a review article:
exposing emerging perspectives, analyzing
assumptions, clarifying constructs,
establishing boundary conditions, testing
new theory, theorizing with systems theory,
and theorizing with mechanisms.”
21
© Jens Nachtwei
picture: DALL:E; google fonts
Post, C., Sarala, R., Gatrell, C., &
Prescott, J. E. (2020). Advancing
Theory with Review Articles. Journal
of Management Studies, 57(2),
351-376.
https://doi.org/ 10.1111/joms.12549
p. 351
article citations:
164
journal impact factor:
9.720
author(s) to follow:
John E. Prescott
Theory (What Is It Good For?)
“If done well, theorizing may be truly astonishing.
It does not have to respond to a specific issue,
particularly in an instrumental way. It can help
readers appreciate phenomena more deeply in
ways they had not imagined before. That is of
great value in itself.”
22
© Jens Nachtwei
picture: DALL:E; google fonts
Bartunek, J. M. (2020). Theory (What
Is It Good For?). Academy of
Management Learning and
Education, 19(2), 223–226.
https://doi.org/ 10.5465/amle.2020.012
0
p. 225
article citations:
8
journal impact factor:
4.373
author(s) to follow:
Jean M. Bartunek
The Power of Process Theories to
Better Understand and Detect
Consequences
“Through extending our current theories to
include a more precise description of relevant
organizational processes, we believe that potential
consequences, intended and unintended, of
interventions will become more apparent.”
23
© Jens Nachtwei
picture: DALL:E; google fonts
Braun, M., Kuljanin, G., Grand, J.,
Kozlowski, S., & Chao, G. (2022). The
Power of Process Theories to Better
Understand and Detect
Consequences of Organizational
Interventions. Industrial and
Organizational Psychology, 15(1),
99-104.
https://doi.org/ 10.1017/iop.2021.125
p. 99
article citations:
0
journal impact factor:
9.375
author(s) to follow:
Georgia T. Chao
Recommendations for Creating
Better Concept Definitions in the
Organizational, Behavioral
“... explain why clear conceptual definitions are
essential for scientific progress and provide a
concrete set of steps that researchers can follow to
improve their conceptual definitions.”
“... we provide some examples that generally meet
the criteria for a good conceptual definition.”
24
© Jens Nachtwei
picture: DALL:E; google fonts
Podsakoff, P. M., MacKenzie, S. B., &
Podsakoff, N. P. (2016).
Recommendations for Creating
Better Concept Definitions in the
Organizational, Behavioral, and
Social Sciences. Organizational
Research Methods, 19(2), 159–203.
https://doi.org/ 10.1177/1094428115624
965
p. 159
article citations:
315
journal impact factor:
8.247
author(s) to follow:
Philip Podsakoff
research question,
hypotheses
Where can research questions come from
and how changeable are they?
How do you derive hypotheses?
How do you assess the quality and value of
hypotheses?
25
© Jens Nachtwei
QUESTION
picture: DALL:E
Click on the title of the article on the following slides to go to the article's ResearchGate page.
Too Close or Optimally
Positioned? The Value of
Personally Relevant Research
“Yet pervading our field and much of social science
is a stigma against engaging in personally relevant
research…”
”We discuss the challenges, opportunities,
implications and importance of personally relevant
research for scholars, reviewers, and society.”
26
© Jens Nachtwei
picture: DALL:E; google fonts
Jones, E. B., & Bartunek, J. M. (2021).
Too Close or Optimally Positioned?
The Value of Personally Relevant
Research. Academy of Management
Perspectives, 35(3), 335–346,
https://doi.org/ 10.5465/amp.2018.000
9
p. 335
article citations:
16
journal impact factor:
8.069
author(s) to follow:
Jean M. Bartunek
I Never Promised You a Rose
Garden: When Research
Questions Ought to Change
“... change can, does and indeed should occur in
response to changes in the context within which
the research is being conducted.”
“Using an illustrative example we identify
refinement and reframing as two distinct types of
research question development.”
27
© Jens Nachtwei
picture: DALL:E; google fonts
MacIntosh, R., Bartunek, J. M., Bhatt,
M., & MacLean, D. (2016). I Never
Promised You a Rose Garden: When
Research Questions Ought to
Change. Research in Organizational
Change and Development, 24,
47-82.
https://doi.org/ 10.1108/S0897-3016201
60000024003
p. 47
article citations:
6
journal impact factor:
-
author(s) to follow:
Jean M. Bartunek
Exploratory Hypothesis Tests Can
be More Compelling than
Confirmatory Hypothesis Tests
“... we … argue that there are several advantages of
exploratory hypothesis tests that can make their
results more compelling than those of
confirmatory hypothesis tests.”
“We also consider some potential disadvantages of
exploratory hypothesis tests …”
28
© Jens Nachtwei
picture: DALL:E; google fonts
Rubin, M., & Donkin, C. (2022).
Exploratory Hypothesis Tests Can be
More Compelling than Confirmatory
Hypothesis Tests. Philosophical
Psychology,
https://doi.org/ 10.1080/09515089.2022
.2113771
remark: not in volume and issue yet
(03.01.23)
p. 1
article citations:
0
journal impact factor:
1.573
author(s) to follow:
Mark Rubin
overview, sample, examples
What research methods are established in
applied psychology?
What exactly is the role of sampling?
What do I need to bear in mind when
conducting questionnaire research?
29
© Jens Nachtwei
METHODS
picture: DALL:E
Click on the title of the article on the following slides to go to the article's ResearchGate page.
Twilight of Dawn or of Evening? A
Century of Research Methods in
the Journal of Applied Psych.…”
“Specifically, we highlight the need to conduct
replications; study the exceptional and not just
the average; improve the quality of the review
process, particularly regarding methodological
issues; emphasize design and measurement
issues; and build and test more specific theories.”
30
© Jens Nachtwei
picture: DALL:E; google fonts
Cortina, J. M., Aguinis, H., & DeShon,
R. P. (2017). Twilight of Dawn or of
Evening? A Century of Research
Methods in the Journal of Applied
Psychology. Journal of Applied
Psychology, 102(3), 274–290.
https://doi.org/ 10.1037/apl0000163
p. 274
article citations:
70
journal impact factor:
11.802
author(s) to follow:
Herman Aguinis
The First 20 Years of
Organizational Research Methods:
…”
“Third, we describe the 27 feature topics (i.e., set
of articles addressing a common issue)...”
“Fifth, we identify the most frequently published
ORM authors (and their disciplinary background)
out of a total of 884 who have published at least
one article.”
31
© Jens Nachtwei
picture: DALL:E; google fonts
Aguinis, H., Ramani, R. S., & Villamor,
I. (2019). The First 20 Years of
Organizational Research Methods:
Trajectory, Impact, and Predictions
for the Future. Organizational
Research Methods, 22(2), 463–489.
https://doi.org/ 10.1177/1094428118786
564
p. 463
article citations:
21
journal impact factor:
8.247
author(s) to follow:
Herman Aguinis
Methodological Checklists for
Improving Research Quality and
Reporting Consistency
“In this commentary, we focus specifically on the
peer review process in academic journals, given
our reviewing and editorial experience. The author
team's experience includes 157 collective years
reviewing for peer-reviewed academic journals, …”
32
© Jens Nachtwei
picture: DALL:E; google fonts
Eby, L., Shockley, K., Bauer, T.,
Edwards, B., Homan, A., Johnson, R.,
Lang, J. W. B., Morris, S. B., & Oswald,
F. (2020). Methodological Checklists
for Improving Research Quality and
Reporting Consistency. Industrial
and Organizational Psychology, 13(1),
76-83.
https://doi.org/ 10.1017/iop.2020.14
p. 76
article citations:
8
journal impact factor:
9.375
author(s) to follow:
Frederick L Oswald
How to Construct a Mixed
Methods Research Design
“We explain the seven major design dimensions:
purpose, theoretical drive, timing
(simultaneity and dependency), point of
integration, typological versus interactive
design approaches, planned versus emergent
design, and design complexity.”
33
© Jens Nachtwei
picture: DALL:E; google fonts
Schoonenboom, J., & Johnson, R.B.
(2017). How to Construct a Mixed
Methods Research Design. KZfSS
Kölner Zeitschrift für Soziologie und
Sozialpsychologie, 69(2), 107–131.
https://doi.org/ 10.1007/s11577-017-045
4-1
p. 107
article citations:
808
journal impact factor:
0.819
author(s) to follow:
R. Burke Johnson
Do Samples Really Matter That
Much?
“In this chapter we examine the history of the
sample generalizability debate and consider
opposing arguments and empirical findings. We
conclude that it is rare in applied behavioral
science for the nature of the sample to be an
important consideration for generalizability.”
34
© Jens Nachtwei
picture: DALL:E; google fonts
Highhouse, S., & Gillespie, J. Z. (2009).
Do samples really matter that much?
In C. E. Lance & R. J. Vandenberg
(Eds.), Statistical and
Methodological Myths and Urban
Legends: Doctrine, Verity and Fable
in the Organizational and Social
Sciences (pp. 247–265).
Routledge/Taylor & Francis Group.
p. 249
article citations:
132
journal impact factor:
-
author(s) to follow:
Scott Highhouse
The Importance of Sample
Composition Depends on the
Research Question
“Together, these two sets of findings bear out our
initial claim that (a) sample composition is critical
when generalizing a sample statistic to a larger
population, yet (b) representativeness may not
always be a concern for theory testing.”
35
© Jens Nachtwei
picture: DALL:E; google fonts
Gillespie, M., Gillespie, J., Brodke, M., &
Balzer, W. (2016). The Importance of
Sample Composition Depends on
the Research Question. Industrial
and Organizational Psychology, 9(1),
207-211.
https://doi.org/ 10.1017/iop.2015.137
p. 209
article citations:
4
journal impact factor:
9.375
author(s) to follow:
William Balzer
The Weirdest People in the
World?
“Behavioral scientists routinely publish broad
claims about human psychology and behavior in
the world's top journals based on samples drawn
entirely from Western, Educated, Industrialized,
Rich, and Democratic (WEIRD) societies.”
36
© Jens Nachtwei
picture: DALL:E; google fonts
Henrich, J., Heine, S., & Norenzayan,
A. (2010). The Weirdest People in the
World? Behavioral and Brain
Sciences, 33(2-3), 61-83.
https://doi.org/ 10.1017/S0140525X099
9152X
p. 61
article citations:
7,709
journal impact factor:
12.579
author(s) to follow:
-
The Nature of Expertise: A
Review
“... an explanation for the range of interest and
viewpoints, and moves on to a discussion of the
nature and study of expertise.”
“The diversity in definitions, domains, disciplines,
and the impact of these factors on approaches to
investigation, are offered…”
37
© Jens Nachtwei
picture: DALL:E; google fonts
Farrington-Darby, T., & Wilson, J. R.
(2006). The Nature of Expertise: A
review. Applied Ergonomics, 37(1),
17-32.
https://doi.org/ 10.1016/j.apergo.2005.0
9.001
p. 17
article citations:
169
journal impact factor:
3.94
author(s) to follow:
-
Where in the World Are the
Workers? Cultural
Underrepresentation in I-O
“... one issue that should not be overlooked is that
of adequately representing nationality and culture
in I-O research samples.”
38
© Jens Nachtwei
picture: DALL:E; google fonts
Myers, C. (2016). Where in the World
Are the Workers? Cultural
Underrepresentation in I-O
Research. Industrial and
Organizational Psychology, 9(1),
144-152.
https://doi.org/ 10.1017/iop.2015.127
p. 145
article citations:
4
journal impact factor:
9.375
author(s) to follow:
Christopher G. Myers
Improving the Image of
Student-recruited Samples: A
Commentary
“... we discuss how the sampling methods of
student- and non-student-recruited samples can
enhance or diminish external validity and
generalization.”
“Next, we present the advantages of the
student-recruited sampling method …”
39
© Jens Nachtwei
picture: DALL:E; google fonts
Demerouti, E., & Rispens, S. (2014).
Improving the Image of
Student-Recruited Samples: A
Commentary. Journal of
Occupational and Organizational
Psychology, 87(1), 34-41.
https://doi.org/ 10.1111/joop.12048
p. 34
article citations:
177
journal impact factor:
5.119
author(s) to follow:
Evangelia Demerouti
Small is Beautiful: In Defense of
the Small-N Design
“We illustrate the properties of small-N and
large-N designs using a simulated paradigm…”
“Our simulations highlight the high power and
inferential validity of the small-N design, in
contrast to the lower power and inferential
indeterminacy of the large-N design.“
40
© Jens Nachtwei
picture: DALL:E; google fonts
Smith, P. L., & Little, D. R. (2018). Small
is beautiful: In Defense of the
Small-N design. Psychonomic
Bulletin & Review, 25, 2083–2101.
https://doi.org/ 10.3758/s13423-018-145
1-8
p. 2083
article citations:
278
journal impact factor:
4.412
author(s) to follow:
Philip Smith
Participant Motivation: A Critical
Consideration
“Two critical questions include: Are participants who they say they are (e.g., working adults)? And, are participants paying attention to the study instructions and questions and participating with effort? (...) I also provide suggestions for ways researchers may address these issues.”
41
© Jens Nachtwei
picture: DALL:E; google fonts
McGonagle, A. (2015). Participant
Motivation: A Critical Consideration.
Industrial and Organizational
Psychology, 8(2), 208-214.
https://doi.org/ 10.1017/iop.2015.27
p. 208
article citations:
10
journal impact factor:
9.375
author(s) to follow:
Alyssa K. McGonagle
Beyond the Ritualized Use of
Questionnaires: Toward a Science
of Actual Behaviors
“Despite, or maybe because of their prevalence,
questionnaire measures are seldom called into
question. Yet, numerous limitations make their
use highly problematic. The following list
summarizes some of the problematic issues: …
42
© Jens Nachtwei
picture: DALL:E; google fonts
Fischer, T., Hambrick, D., Sajons, G., &
Van Quaquebeke, N. (2020). Beyond
the Ritualized Use of Questionnaires:
Toward a Science of Actual Behaviors
and Psychological States. The
Leadership Quarterly,
31(4), Article 101449.
https://doi.org/ 10.1016/S1048-9843(20)
30076-X
p. I
article citations:
23
journal impact factor:
10.517
author(s) to follow:
Niels Van Quaquebeke
Hands-on Guide to Questionnaire
Research: Selecting, Designing,
and Developing your Questionna.”
“In this series we aim to present a practical guide
that will enable research teams to do
questionnaire research that is well designed, well
managed, and non-discriminatory and which
contributes to a generalisable evidence base.”
43
© Jens Nachtwei
picture: DALL:E; google fonts
Boynton, P., & Greenhalgh, T. (2004).
Hands-on Guide to Questionnaire
Research: Selecting, Designing, and
Developing your Questionnaire. BMJ
Clinical research, 328, 1312-1315.
https://doi.org/ 10.1136/bmj.328.7451.13
12
p. 1312
article citations:
821
journal impact factor:
96.216
author(s) to follow:
Trisha Greenhalgh
Will the Questions Ever End?
Person-Level Increases in Careless
Responding …”
“Is there a point within a self-report questionnaire
where participants will start responding
carelessly? If so, then after how many items do
participants reach that point? And what can
researchers do to encourage participants to
remain careful …?”
44
© Jens Nachtwei
picture: DALL:E; google fonts
Bowling, N. A., Gibson, A. M., Houpt, J.
W., & Brower, C. K. (2021). Will the
Questions Ever End? Person-Level
Increases in Careless Responding
During Questionnaire Completion.
Organizational Research Methods,
24(4), 718–738.
https://doi.org/ 10.1177/1094428120947
794
p. 718
article citations:
18
journal impact factor:
8.247
author(s) to follow:
Nathan Bowling
Survey Response Rates: Trends
and a Validity Assessment
Framework
“To better understand the state of the science, we
first analyze response-rate information reported in
1014 surveys described in 703 articles from 17
journals from 2010 to 2020.”
“... variables that predict response-rate fluctuations
over time are related to …”
45
© Jens Nachtwei
picture: DALL:E; google fonts
Holtom, B., Baruch, Y., Aguinis, H., & A
Ballinger, G. (2022). Survey Response
Rates: Trends and a Validity
Assessment Framework. Human
Relations, 75(8), 1560–1584.
https://doi.org/ 10.1177/0018726721107
0769
p. 1560
article citations:
16
journal impact factor:
5.658
author(s) to follow:
Yehuda Baruch
Dealing with Careless
Responding in Survey Data:
Prevention, Identification, …”
“This review synthesizes the careless responding
literature to provide a comprehensive
understanding of careless responding and ways to
prevent, identify, report, and clean careless
responding from data sets. Further, we include
recommendations … .”
46
© Jens Nachtwei
picture: DALL:E; google fonts
Ward, M.K., & Meade, A. (2023).
Dealing with Careless Responding in
Survey Data: Prevention,
Identification, and Recommended
Best Practices. Annual Review of
Psychology, 74(1),
https://doi.org/ 10.1146/annurev-psych
-040422-045007
remark: pages not included;
expected publication in 01/2023
(03.01.2023)
p. 1
article citations:
0
journal impact factor:
27.782
author(s) to follow:
Adam Meade
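To make this concrete for your own data screening: below is a minimal Python sketch (mine, not from the article) of two indices commonly discussed in the careless-responding literature, a longstring index and intra-individual response variability (IRV). The item matrix, scale range and cutoffs are invented placeholders; real thresholds should follow the recommendations in the sources above.

```python
# Minimal sketch of two common careless-responding screens:
# longstring (longest run of identical answers) and
# intra-individual response variability (IRV, the SD of a person's answers).
# Data and cutoffs are illustrative placeholders, not recommendations from the article.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
# Hypothetical data: 200 respondents x 30 Likert items (1-5)
data = pd.DataFrame(rng.integers(1, 6, size=(200, 30)))
data.iloc[0] = 3  # simulate one straight-liner

def longstring(responses):
    """Length of the longest run of identical consecutive responses."""
    longest, current = 1, 1
    for prev, curr in zip(responses[:-1], responses[1:]):
        current = current + 1 if curr == prev else 1
        longest = max(longest, current)
    return longest

screen = pd.DataFrame({
    "longstring": data.apply(lambda row: longstring(row.to_numpy()), axis=1),
    "irv": data.std(axis=1),  # near-zero IRV suggests invariant responding
})
# Flag respondents for manual inspection (cutoffs are arbitrary examples)
screen["flag"] = (screen["longstring"] >= 15) | (screen["irv"] < 0.5)
print(screen.head())
print(f"{screen['flag'].sum()} of {len(screen)} respondents flagged")
```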
Single Item Measures in
Psychological Science: A Call to
Action
“The increasing use of large panel surveys in
psychological research means that now more than
ever, it is essential to ensure that single-item
measures are valid and reliable.”
“Arguments For and Against Single-Item Measures (...)”
47
© Jens Nachtwei
picture: DALL:E; google fonts
Allen, M., Iliescu, D., & Greiff, S.
(2022). Single Item Measures in
Psychological Science: A Call to
Action. European Journal of
Psychological Assessment, 38(1), 1-5.
https://doi.org/ 10.1027/1015-5759/a00
0699
p. 1
article citations:
48
journal impact factor:
2.892
author(s) to follow:
Samuel Greiff
Short Scales - Five Misunder-
standings and Ways to Overcome
Them
“1: Short Scales Will Soon Become Obsolete …
2: Short Scales Are Intended to Substitute Longer or Non-Abbreviated Scales”
“3: Short Scales Yield Lower Test-Criterion Correlations”
“4: Short Scales Have to Be as Internally Consistent …”
“5: Short Scales Can Be Developed Overnight”
48
© Jens Nachtwei
picture: DALL:E; google fonts
Ziegler, M., Kemper, C. J., & Kruyen, P.
(2014). Short scales - Five
Misunderstandings and Ways to
Overcome Them. Journal of
Individual Differences, 35(4), 185-189.
https://doi.org/ 10.1027/1614-0001/a00
0148
pp. 185, 186, 187
article citations:
12
journal impact factor:
2.608
author(s) to follow:
Matthias Ziegler
Normalizing the Use of Single-
Item Measures: Validation of the
Single-Item Compendium
“... we empirically examined the degree to which
various constructs in the organizational sciences
can be reliably and validly assessed with a single
item.”
“... providing an off-the-shelf compendium of
validated single-item measures …”
49
© Jens Nachtwei
picture: DALL:E; google fonts
Matthews, R. A., Pineault, L., & Hong,
Y. H. (2022). Normalizing the Use of
Single-Item Measures: Validation of
the Single-Item Compendium for
Organizational Psychology. Journal
of Business and Psychology, 37,
639–673.
https://doi.org/ 10.1007/s10869-022-09
813-3
p. 639
article citations:
30
journal impact factor:
6.604
author(s) to follow:
Russell A. Matthews
Qualitative Research in Work
and Organizational Psychology
Journals: Practices and …”
“The aim of this position paper is to showcase the
value of qualitative research methods for the
progress of WOP research. We offer an empirical
overview and evaluation of observed practices and
explore methodological challenges that appear to
be prevalent.”
50
© Jens Nachtwei
picture: DALL:E; google fonts
Wilhelmy, A., & Köhler, T. (2022).
Qualitative Research in Work and
Organizational Psychology Journals:
Practices and Future Opportunities.
European Journal of Work and
Organizational Psychology, 31(2),
161-185.
https://doi.org/ 10.1080/1359432X.2021.
2009457
p. 161
article citations:
4
journal impact factor:
3.968
author(s) to follow:
Tine Köhler
Longitudinal Research: The
Theory, Design, and Analysis of
Change
“The purpose of this article is to present
cutting-edge research on issues relating to the
theory, design, and analysis of change.
“... we provide readers with “checklists” of issues to
consider when theorizing and designing a
longitudinal study.”
51
© Jens Nachtwei
picture: DALL:E; google fonts
Ployhart, R. E., & Vandenberg, R. J.
(2010). Longitudinal Research: The
Theory, Design, and Analysis of
Change. Journal of Management,
36(1), 94–120.
https://doi.org/ 10.1177/0149206309352
110
p. 94
article citations:
987
journal impact factor:
13.508
author(s) to follow:
Robert J Vandenberg
Conducting and Evaluating
Multilevel Studies:
Recommendations, Resources
“Multilevel methods allow researchers to investigate
relationships that expand across levels (e.g.,
individuals, teams, and organizations).”
“... we provide recommendations, tools, resources, and a checklist that can be useful for scholars involved in conducting or assessing multilevel studies.”
52
© Jens Nachtwei
picture: DALL:E; google fonts
González-Romá, V., & Hernández, A.
(2022). Conducting and Evaluating
Multilevel Studies:
Recommendations, Resources, and a
Checklist. Organizational Research
Methods, 0(0).
https://doi.org/ 10.1177/10944281211060
712
remark: volume, issue and pages not
included; advance online publication
(03.01.2023)
p. 1
article citations:
3
journal impact factor:
8.247
author(s) to follow:
Vicente González-Romá
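As a small illustration of what such a multilevel design looks like in practice, here is a minimal Python sketch (not from the article) of a two-level random-intercept model fitted with statsmodels; the team, autonomy and satisfaction variables are simulated placeholders.

```python
# Minimal sketch of a two-level random-intercept model (individuals nested in teams),
# the kind of design multilevel checklists address. Data and variable names are made up.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_teams, n_per_team = 40, 10
team = np.repeat(np.arange(n_teams), n_per_team)
team_effect = rng.normal(0, 0.5, n_teams)[team]      # level-2 (between-team) variance
autonomy = rng.normal(0, 1, n_teams * n_per_team)    # level-1 predictor
satisfaction = 3 + 0.4 * autonomy + team_effect + rng.normal(0, 1, n_teams * n_per_team)

df = pd.DataFrame({"team": team, "autonomy": autonomy, "satisfaction": satisfaction})

# Random intercept for team; fixed slope for the individual-level predictor
model = smf.mixedlm("satisfaction ~ autonomy", data=df, groups=df["team"]).fit()
print(model.summary())

# Intraclass correlation from the fitted variance components
var_between = np.asarray(model.cov_re)[0, 0]
icc = var_between / (var_between + model.scale)
print(f"ICC (proportion of variance between teams): {icc:.2f}")
```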
descriptives, significance,
effects, tests
Does it always have to be inferential
statistics?
What should be considered when assessing
significance?
How can effect sizes be classified?
53
© Jens Nachtwei
RESULTS
picture: DALL:E
Click on the title of the article on the following slides to go to the article's ResearchGate page.
Forgetting What We Learned as
Graduate Students: HARKing and
Selective Outcome Reporting …”
“Thus, in this commentary, we sought to test,
through a pilot study, how much selective
outcome reporting (i.e., dropping nonsignificant
hypotheses) and HARKing (hypothesizing after the
results are known; Kerr, 1998) might be present in
the I–O field.”
54
© Jens Nachtwei
picture: DALL:E; google fonts
Mazzola, J., & Deuling, J. K. (2013).
Forgetting What We Learned as
Graduate Students: HARKing and
Selective Outcome Reporting in I–O
Journal Articles. Industrial and
Organizational Psychology, 6(3),
279-284.
https://doi.org/ 10.1111/iops.12049
p. 279
article citations:
31
journal impact factor:
9.375
author(s) to follow:
Jacqueline K. Deuling
The Reporting of Nonresponse
Analyses in Survey Research
“The purpose of this article is to identify how
frequently nonresponse analyses are reported and
what variables are related to these rates.”
“A number of journal and article quality measures
and sample characteristics were found to be
related to the reporting of nonresponse analyses.”
55
© Jens Nachtwei
picture: DALL:E; google fonts
Werner, S., Praxedes, M., & Kim, H.-G.
(2007). The Reporting of
Nonresponse Analyses in Survey
Research. Organizational Research
Methods, 10(2), 287–295.
https://doi.org/ 10.1177/1094428106292
892
p. 287
article citations:
107
journal impact factor:
8.247
author(s) to follow:
Steve Werner
Best-Practice Recommendations
for Defining, Identifying, and
Handling Outliers
“We provide evidence that different ways of
defining, identifying, and handling outliers alter
substantive research conclusions.”
“We offer guidelines, including decision-making
trees, that researchers can follow to define,
identify, and handle … outliers.”
56
© Jens Nachtwei
picture: DALL:E; google fonts
Aguinis, H., Gottfredson, R. K., & Joo,
H. (2013). Best-Practice
Recommendations for Defining,
Identifying, and Handling Outliers.
Organizational Research Methods,
16(2), 270–301.
https://doi.org/ 10.1177/1094428112470
848
p. 270
article citations:
688
journal impact factor:
8.247
author(s) to follow:
Herman Aguinis
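A small illustration of the article's core point that definition and handling choices matter: the Python sketch below (mine, not the authors') flags outliers with two common univariate rules and shows how the handling decision shifts a correlation. Data and cutoffs are invented for illustration only.

```python
# Minimal sketch showing how different outlier definitions flag different cases
# and how the handling choice shifts a result. Data and cutoffs are illustrative only.
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(50, 10, 200)
y = 0.3 * x + rng.normal(0, 10, 200)
x[:3] += 60  # inject a few extreme values

def z_flags(v, cut=3.0):
    """Flag values more than `cut` standard deviations from the mean."""
    z = (v - v.mean()) / v.std(ddof=1)
    return np.abs(z) > cut

def iqr_flags(v, k=1.5):
    """Flag values beyond k interquartile ranges from the quartiles."""
    q1, q3 = np.percentile(v, [25, 75])
    iqr = q3 - q1
    return (v < q1 - k * iqr) | (v > q3 + k * iqr)

flags_z, flags_iqr = z_flags(x), iqr_flags(x)
print("flagged by |z| > 3:", flags_z.sum(), "| flagged by 1.5*IQR rule:", flags_iqr.sum())

keep = ~flags_iqr
r_all = np.corrcoef(x, y)[0, 1]
r_kept = np.corrcoef(x[keep], y[keep])[0, 1]
print(f"correlation with outliers kept: {r_all:.2f}; after IQR-based removal: {r_kept:.2f}")
```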
Three Cheers for Descriptive
Statistics - and Five More Reasons
Why They Matter
“... we offer five other reasons for how and why
descriptive statistics and descriptive information
are so critical and why researchers, editors,
reviewers, and readers should pay more attention
to them when evaluating research findings.”
57
© Jens Nachtwei
picture: DALL:E; google fonts
Credé, M., & Harms, P. D. (2021). Three
Cheers for Descriptive Statistics - and
Five More Reasons Why They Matter.
Industrial and Organizational
Psychology, 14(4), 486-488.
https://doi.org/ 10.1017/iop.2021.110
p. 486
article citations:
0
journal impact factor:
9.375
author(s) to follow:
Peter D. Harms
In Praise of Table 1: The
Importance of Making Better Use
of Descriptive Statistics
“A set of simple methods for assessing whether
hypotheses about interventions, moderator
relationships and mediation, are plausible that are
based on the simplest possible examination of
descriptive statistics are proposed.”
58
© Jens Nachtwei
picture: DALL:E; google fonts
Murphy, K. R. (2021). In Praise of Table
1: The Importance of Making Better
Use of Descriptive Statistics.
Industrial and Organizational
Psychology, 14(4), 461-477.
https://doi.org/ 10.1017/iop.2021.90
p. 461
article citations:
20
journal impact factor:
9.375
author(s) to follow:
Kevin R. Murphy
A Review and Evaluation of
Exploratory Factor Analysis
Practices in Organiz. Research
“The authors surveyed exploratory factor analysis
(EFA) practices in three organizational journals…
“... stress the importance of careful and thoughtful
analysis, including decisions about whether and
how EFA should be used.”
59
© Jens Nachtwei
picture: DALL:E; google fonts
Conway, J. M., & Huffcutt, A. I. (2003).
A Review and Evaluation of
Exploratory Factor Analysis Practices
in Organizational Research.
Organizational Research Methods,
6(2), 147–168.
https://doi.org/ 10.1177/10944281032515
41
p. 147
article citations:
844
journal impact factor:
8.247
author(s) to follow:
Jim M. Conway
Statistical Power Analyses Using
G*Power 3.1: Tests for Correlation
and Regression Analyses
“G*Power is a free power analysis program for a
variety of statistical tests. We present extensions
and improvements of the version introduced by
Faul, Erdfelder, Lang, and Buchner (2007) in the
domain of correlation and regression analyses.”
60
© Jens Nachtwei
picture: DALL:E; google fonts
Faul, F., Erdfelder, E., Buchner, A., &
Lang, A. G. (2009). Statistical Power
Analyses Using G*Power 3.1: Tests for
Correlation and Regression Analyses.
Behavior Research Methods, 41(4),
1149–1160.
https://doi.org/ 10.3758/BRM.41.4.1149
p. 1149
article citations:
21,313
journal impact factor:
5.953
author(s) to follow:
Franz Faul
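G*Power itself is a stand-alone program, but the same kind of a priori calculation can be sketched in a few lines. The Python snippet below (not from the article) uses the standard Fisher-z approximation for the sample size needed to detect a population correlation; the inputs (r, alpha, power) are example values.

```python
# Minimal sketch of an a priori power analysis for a correlation, i.e. the kind of
# calculation G*Power performs. Uses the standard Fisher-z approximation; results
# differ slightly from G*Power's exact test. Target values are just example inputs.
from math import atanh, ceil
from scipy.stats import norm

def n_for_correlation(r, alpha=0.05, power=0.80, two_sided=True):
    """Approximate sample size needed to detect a population correlation r."""
    z_alpha = norm.ppf(1 - alpha / (2 if two_sided else 1))
    z_beta = norm.ppf(power)
    return ceil(((z_alpha + z_beta) / atanh(r)) ** 2 + 3)

for r in (0.1, 0.3, 0.5):
    print(f"r = {r}: required n is about {n_for_correlation(r)}")
```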
The Earth is Round (p < .05)
“After 4 decades of severe criticism, the ritual of
null hypothesis significance testing (mechanical
dichotomous decisions around a sacred .05
criterion) still persists. This article reviews the
problems with this practice, …”
61
© Jens Nachtwei
picture: DALL:E; google fonts
Cohen, J. (1994). The Earth is Round
(p < .05). American Psychologist,
49(12), 997–1003.
https://doi.org/ 10.1037/0003-066X.49.1
2.997
p. 997
article citations:
2,602
journal impact factor:
16.358
author(s) to follow:
Jacob Cohen
Inference by Eye: Confidence
Intervals and How to Read
Pictures of Data
“The authors discuss the interpretation of figures
with error bars and analyze the relationship
between CIs and statistical significance testing.
They propose 7 rules of eye to guide the inferential
use of figures with error bars.
62
© Jens Nachtwei
picture: DALL:E; google fonts
Cumming, G., & Finch, S. (2005).
Inference by Eye: Confidence
Intervals and How to Read Pictures
of Data. American Psychologist,
60(2), 170–180.
https://doi.org/ 10.1037/0003-066X.60.
2.170
p. 170
article citations:
1,191
journal impact factor:
16.358
author(s) to follow:
Geoff Cumming
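To make the "inference by eye" idea concrete, here is a minimal Python sketch (not from the article) that computes the 95% confidence intervals one would draw as error bars for two group means, plus the interval for the mean difference; all data are simulated.

```python
# Minimal sketch of the confidence intervals behind "inference by eye":
# 95% CIs for two group means and for their difference. Data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
group_a = rng.normal(100, 15, 40)
group_b = rng.normal(108, 15, 40)

def mean_ci(x, level=0.95):
    """Mean and t-based confidence interval for one sample."""
    m, se = x.mean(), stats.sem(x)
    half = se * stats.t.ppf((1 + level) / 2, df=len(x) - 1)
    return m, m - half, m + half

for name, g in (("A", group_a), ("B", group_b)):
    m, lo, hi = mean_ci(g)
    print(f"group {name}: M = {m:.1f}, 95% CI [{lo:.1f}, {hi:.1f}]")

# CI for the mean difference (Welch approximation). Note that overlap of the two
# single-group CIs is not the same thing as this interval, one of the points the
# article's "rules of eye" address.
diff = group_b.mean() - group_a.mean()
se_a, se_b = stats.sem(group_a), stats.sem(group_b)
se_diff = np.sqrt(se_a**2 + se_b**2)
df_welch = se_diff**4 / (se_a**4 / (len(group_a) - 1) + se_b**4 / (len(group_b) - 1))
half = se_diff * stats.t.ppf(0.975, df_welch)
print(f"difference: {diff:.1f}, 95% CI [{diff - half:.1f}, {diff + half:.1f}]")
```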
Not All Effects Are Indispensable:
Psychological Science Requires
Verifiable …”
“However, the field is now in danger of heuristically
accepting all effects as potentially important. We
aim to encourage researchers to think thoroughly
about the various mechanisms that may both
amplify and counteract the importance of an
observed effect size.”
63
© Jens Nachtwei
picture: DALL:E; google fonts
Anvari, F., Kievit, R., Lakens, D.,
Pennington, C. R., Przybylski, A. K.,
Tiokhin, L., Wiernik, B. M., & Orben, A.
(2022). Not All Effects Are
Indispensable: Psychological Science
Requires Verifiable Lines of
Reasoning for Whether an Effect
Matters. Perspectives on
Psychological Science, 0(0).
https://doi.org/ 10.1177/17456916221091
565
remark: volume, issue and pages not
included; advance online publication
(03.01.2023)
p. 1
article citations:
6
journal impact factor:
8.190
author(s) to follow:
Amy Orben
The Meaningfulness of Effect Sizes
in Psychological Research:
Differences Between Sub-Disc. …”
“... dramatic inflation in published effects…”.
“In addition, there were very large differences in
the mean effects between psychological
sub-disciplines and between different study
designs, making it impossible to apply any global
benchmarks.”
64
© Jens Nachtwei
picture: DALL:E; google fonts
Schäfer, T., & Schwarz, M. (2019). The
Meaningfulness of Effect Sizes in
Psychological Research: Differences
Between Sub-Disciplines and the
Impact of Potential Biases. Frontiers
in Psychology, 10, Article 813.
https://doi.org/ 10.3389/fpsyg.2019.008
13
p. 1
article citations:
359
journal impact factor:
4.232
author(s) to follow:
Thomas Schäfer
The Sources of Four Commonly
Reported Cutoff Criteria: What
Did They Really Say?
“In this article, the authors trace four widely cited
and reported cutoff criteria to their (alleged)
original sources to determine whether they really
said what they are cited as having said about the
cutoff criteria, and if not, what the original sources
really said.”
65
© Jens Nachtwei
picture: DALL:E; google fonts
Lance, C. E., Butts, M. M., & Michels, L.
C. (2006). The Sources of Four
Commonly Reported Cutoff Criteria:
What Did They Really Say?
Organizational Research Methods,
9(2), 202–220.
https://doi.org/ 10.1177/1094428105284
919
p. 202
article citations:
1,630
journal impact factor:
8.247
author(s) to follow:
Charles E. Lance
The Illusion of Statistical Control
“The authors extend previous recommendations
for improved control variable (CV) practice in
management research by mapping the objectives
for using statistical control to recommendations
for research practice.”
Guidelines for improving research practice …”
66
© Jens Nachtwei
picture: DALL:E; google fonts
Carlson, K. D., & Wu, J. (2012). The
Illusion of Statistical Control: Control
Variable Practice in Management
Research. Organizational Research
Methods, 15(3), 413–435.
https://doi.org/ 10.1177/10944281114288
17
p. 413
article citations:
390
journal impact factor:
8.247
author(s) to follow:
-
Methodological Urban Legends:
The Misuse of Statistical Control
Variables
“The authors propose that researchers should be
explicit rather than implicit regarding the role of
control variables and match hypotheses precisely
to both the choice of variables and the choice of
analyses.”
67
© Jens Nachtwei
picture: DALL:E; google fonts
Spector, P. E., & Brannick, M. T. (2011).
Methodological Urban Legends: The
Misuse of Statistical Control
Variables. Organizational Research
Methods, 14(2), 287–305.
https://doi.org/ 10.1177/1094428110369
842
p. 287
article citations:
813
journal impact factor:
8.247
author(s) to follow:
Paul E. Spector
Mend It or End It: Redirecting the
Search for Interactions in the
Organizational Sciences
“Researchers should avoid moderator hypotheses
in contexts where the measures and research
designs employed do not allow them to be tested
in a meaningful way, and should be cautious
about interpreting the very small effects they are
likely to find.”
68
© Jens Nachtwei
picture: DALL:E; google fonts
Murphy, K. R., & Russell, C. J. (2017).
Mend It or End It: Redirecting the
Search for Interactions in the
Organizational Sciences.
Organizational Research Methods,
20(4), 549–573.
https://doi.org/ 10.1177/10944281156253
22
p. 549
article citations:
62
journal impact factor:
8.247
author(s) to follow:
Kevin Murphy
Reconsidering Baron and Kenny:
Myths and Truths about
Mediation Analysis
“We present a nontechnical summary of the flaws
in the Baron and Kenny logic, some of which have
not been previously noted. We provide a decision
tree and a step-by-step procedure for testing
mediation, classifying its type, and interpreting the
implications of findings …”
69
© Jens Nachtwei
picture: DALL:E; google fonts
Zhao, X., Lynch, J. G. Jr., Chen, Q.
(2010). Reconsidering Baron and
Kenny: Myths and Truths about
Mediation Analysis. Journal of
Consumer Research, 37(2), 197–206.
https://doi.org/ 10.1086/651257
p. 197
article citations:
7,410
journal impact factor:
8.612
author(s) to follow:
John G. Lynch
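As a rough illustration of the regression-based logic these mediation debates are about, the Python sketch below (mine, not the authors' own code) estimates the a and b paths with OLS and bootstraps the indirect effect a*b on simulated data. It sketches the general approach only; it is not a substitute for the article's decision tree.

```python
# Minimal sketch of a simple mediation analysis (X -> M -> Y): estimate the paths
# with OLS and bootstrap the indirect effect a*b. Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 300
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)             # a-path
y = 0.4 * m + 0.1 * x + rng.normal(size=n)   # b-path plus direct effect

def paths(x, m, y):
    """Return a, b, and the direct effect c' from two OLS regressions."""
    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]
    fit_y = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit()
    b, c_prime = fit_y.params[1], fit_y.params[2]
    return a, b, c_prime

a, b, c_prime = paths(x, m, y)
print(f"a = {a:.2f}, b = {b:.2f}, direct effect c' = {c_prime:.2f}, indirect a*b = {a*b:.2f}")

# Percentile bootstrap CI for the indirect effect
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    ab = paths(x[idx], m[idx], y[idx])
    boot.append(ab[0] * ab[1])
print("95% bootstrap CI for a*b:", np.percentile(boot, [2.5, 97.5]).round(2))
```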
When Moderation Is Mediated
and Mediation Is Moderated
“The purpose of this article is to define precisely
both mediated moderation and moderated
mediation and provide analytic strategies for
assessing each.”
70
© Jens Nachtwei
picture: DALL:E; google fonts
Muller, D., Judd, C. M., & Yzerbyt, V. Y.
(2005). When Moderation Is
Mediated and Mediation Is
Moderated. Journal of Personality
and Social Psychology, 89(6),
852–863.
https://doi.org/ 10.1037/0022-3514.89.6.
852
p. 852
article citations:
2,785
journal impact factor:
8.460
author(s) to follow:
Charles M. Judd
Inflection Points, Kinks, and
Jumps: A Statistical Approach to
Detecting Nonlinearities
“... we provide an integrative framework for the
identification of nonlinearities.
“We also provide instructions on how our
approach can be implemented, and a replicable
illustration of the procedure.”
“... illustrative example …”
71
© Jens Nachtwei
picture: DALL:E; google fonts
Arin, P., Minniti, M., Murtinu, S., &
Spagnolo, N. (2022). Inflection Points,
Kinks, and Jumps: A Statistical
Approach to Detecting
Nonlinearities. Organizational
Research Methods, 25(4), 786–814.
https://doi.org/ 10.1177/10944281211058
466
p. 786
article citations:
3
journal impact factor:
8.247
author(s) to follow:
Maria Minniti
Answers to 20 Questions About
Interrater Reliability and
Interrater Agreement
“The purpose of the current article is to expose
researchers to the various issues surrounding the
use of IRR and IRA indices often used in
conjunction with multilevel models.”
“... question-and-answer format and provide a
tutorial … using the SPSS software.”
72
© Jens Nachtwei
picture: DALL:E; google fonts
LeBreton, J. M., & Senter, J. L. (2008).
Answers to 20 Questions About
Interrater Reliability and Interrater
Agreement. Organizational
Research Methods, 11(4), 815–852.
https://doi.org/ 10.1177/1094428106296
642
p. 815
article citations:
2,437
journal impact factor:
8.247
author(s) to follow:
James M. LeBreton
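The article's tutorial uses SPSS; as a language-agnostic illustration, here is a minimal Python sketch (mine, not from the article) that computes ICC(1) and ICC(k) from a one-way ANOVA decomposition of a simulated targets-by-raters matrix.

```python
# Minimal sketch of ICC(1) and ICC(k) from a one-way ANOVA decomposition, two of the
# interrater reliability indices the article discusses. Ratings are simulated:
# rows = targets (e.g., teams), columns = raters.
import numpy as np

rng = np.random.default_rng(5)
n_targets, k_raters = 30, 4
true_level = rng.normal(0, 1.0, n_targets)[:, None]                 # between-target differences
ratings = true_level + rng.normal(0, 1.0, (n_targets, k_raters))    # rater noise

grand_mean = ratings.mean()
target_means = ratings.mean(axis=1)

# Mean squares between and within targets
ms_between = k_raters * np.sum((target_means - grand_mean) ** 2) / (n_targets - 1)
ms_within = np.sum((ratings - target_means[:, None]) ** 2) / (n_targets * (k_raters - 1))

icc1 = (ms_between - ms_within) / (ms_between + (k_raters - 1) * ms_within)
print(f"ICC(1)  = {icc1:.2f}   # reliability of a single rater")

icc_k = (ms_between - ms_within) / ms_between
print(f"ICC({k_raters}) = {icc_k:.2f}   # reliability of the k-rater mean")
```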
Cronbach's Coefficient Alpha:
Well Known but Poorly
Understood
“This study disproves … six common
misconceptions about coefficient alpha:”
“This study discusses the inaccuracy of each of
these misconceptions and provides a correct
statement.”
73
© Jens Nachtwei
picture: DALL:E; google fonts
Cho, E., & Kim, S. (2015). Cronbach’s
Coefficient Alpha: Well Known but
Poorly Understood. Organizational
Research Methods, 18(2), 207–230.
https://doi.org/ 10.1177/1094428114555
994
p. 207
article citations:
475
journal impact factor:
8.247
author(s) to follow:
-
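To show what the coefficient actually is, here is a minimal Python sketch (not from the article) that computes alpha directly from its definition on simulated item data.

```python
# Minimal sketch of coefficient alpha computed from its definition:
# alpha = k/(k-1) * (1 - sum of item variances / variance of the total score).
# Item responses are simulated.
import numpy as np

rng = np.random.default_rng(9)
n, k = 250, 8
latent = rng.normal(0, 1, n)[:, None]
items = 0.7 * latent + rng.normal(0, 1, (n, k))   # k roughly parallel items

item_vars = items.var(axis=0, ddof=1)
total_var = items.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
# Note: alpha depends on the number of items and their intercorrelations;
# a high alpha by itself does not establish unidimensionality, a misconception
# commonly discussed in this literature.
```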
Multilevel Methods and
Statistics: The Next Frontier
“A basic overview of the history and progression of
multilevel methods and statistics in the
organizational sciences is provided, as well as a
discussion of recent developments to summarize
the current state of the science.”
74
© Jens Nachtwei
picture: DALL:E; google fonts
Eckardt, R., Yammarino, F. J., Dionne,
S. D., & Spain, S. M. (2021). Multilevel
Methods and Statistics: The Next
Frontier. Organizational Research
Methods, 24(2), 187–218.
https://doi.org/ 10.1177/1094428120959
827
p. 187
article citations:
7
journal impact factor:
8.247
author(s) to follow:
Shelley D. Dionne
Using Thematic Analysis in
Psychology
“We outline what thematic analysis is, locating it in
relation to other qualitative analytic methods…
“We then provide clear guidelines to those
wanting to start thematic analysis, or conduct it in
a more deliberate and rigorous way, and consider
potential pitfalls …”
75
© Jens Nachtwei
picture: DALL:E; google fonts
Braun, V., & Clarke, V. (2006). Using
Thematic Analysis in Psychology.
Qualitative Research in Psychology,
3(2), 77-101.
DOI: 10.1191/1478088706qp063oa
p. 77
article citations:
93,357
journal impact factor:
10.568
author(s) to follow:
Virginia Braun
Customer-Centric Science:
Reporting Significant Research
Results with Rigor, Relevance, …”
“In response to the ongoing concern regarding a
science-practice gap, we propose a
customer-centric approach to reporting
significant research results that involves a
sequence of three interdependent steps.”
76
© Jens Nachtwei
picture: DALL:E; google fonts
Aguinis, H., Werner, S., Lanza Abbott,
J., Angert, C., Hyung, J., &
Kohlhausen, D. (2010).
Customer-Centric Science: Reporting
Significant Research Results with
Rigor, Relevance, and Practical
Impact in Mind. Organizational
Research Methods, 13(3), 515–539.
https://doi.org/ 10.1177/1094428109333
339
p. 515
article citations:
210
journal impact factor:
8.247
author(s) to follow:
Herman Aguinis
implications, biases,
reflections
How should the implications of my research
be discussed?
What are the limitations of my research?
What can I use for my critical reflection?
77
© Jens Nachtwei
DISCUSSION
picture: DALL:E
Click on the title of the article on the following slides to go to the article's ResearchGate page.
Small Effects: The Indispensable
Foundation for a Cumulative
Psychological Science
“... we highlight the dangers of a publication
culture that continues to demand large effects.”
“We then explain the theoretical and practical
relevance of small effects …”
“Finally, we suggest ways in which scholars can
harness these insights to advance research …”
78
© Jens Nachtwei
picture: DALL:E; google fonts
Götz, F. M., Gosling, S. D., & Rentfrow,
P. J. (2022). Small Effects: The
Indispensable Foundation for a
Cumulative Psychological Science.
Perspectives on Psychological
Science, 17(1), 205–215.
https://doi.org/ 10.1177/1745691620984
483
p. 205
article citations:
112
journal impact factor:
8.190
author(s) to follow:
Samuel D. Gosling
Estimating the Reproducibility of
Psychological Science
“We conducted replications of 100 experimental
and correlational studies published in three
psychology journals using high-powered designs
and original materials when available.”
“Replication effects were half the magnitude of
original effects …”
79
© Jens Nachtwei
picture: DALL:E; google fonts
Aarts, A. A. et al. (2015). Estimating
the Reproducibility of Psychological
Science. Science, 349(6251), Article
aac4716.
http://dx.doi.org/10.1126/science.aac4
716
p. 1
article citations:
4,823
journal impact factor:
63.714
author(s) to follow:
-
Understanding “It Depends” in
Organizational Research: A
Theory-Based Taxonomy, …”
“... we present a taxonomy of two-way interaction
effects that can guide organizational scholars
toward clearer, more precise ways of developing
theory, advancing hypotheses, and interpreting
results.”
80
© Jens Nachtwei
picture: DALL:E; google fonts
Gardner, R. G., Harris, T. B., Li, N.,
Kirkman, B. L., & Mathieu, J. E. (2017).
Understanding “It Depends” in
Organizational Research: A
Theory-Based Taxonomy, Review,
and Future Research Agenda
Concerning Interactive and
Quadratic Relationships.
Organizational Research Methods,
20(4), 610–638.
https://doi.org/ 10.1177/1094428117708
856
p. 610
article citations:
124
journal impact factor:
8.247
author(s) to follow:
John E. Mathieu
Method Effects, Measurement
Error, and Substantive
Conclusions
“The authors review research that has used
multitrait-multimethod (MTMM) designs to
estimate the magnitude of common method
variance in organizational research.”
81
© Jens Nachtwei
picture: DALL:E; google fonts
Lance, C. E., Dawson, B., Birkelbach,
D., & Hoffman, B. J. (2010). Method
Effects, Measurement Error, and
Substantive Conclusions.
Organizational Research Methods,
13(3), 435-455.
https://doi.org/ 10.1177/1094428109352
528
p. 435
article citations:
298
journal impact factor:
8.247
author(s) to follow:
Chuck Lance
A Review of Measurement
Equivalence in Organizational
Research: …”
Nonequivalence across groups or over time can
affect the results of a study and the conclusions
that are drawn from it.”
“... describes recent advances in testing for ME and
proposes a taxonomy of potential sources of
nonequivalence. Finally, recommendations …”
82
© Jens Nachtwei
picture: DALL:E; google fonts
Somaraju, A. V., Nye, C. D., & Olenick,
J. (2022). A Review of Measurement
Equivalence in Organizational
Research: What’s Old, What’s New,
What’s Next? Organizational
Research Methods, 25(4), 741–785.
https://doi.org/ 10.1177/10944281211056
524
p. 741
article citations:
2
journal impact factor:
8.247
author(s) to follow:
Christopher D. Nye
The Use and Misuse of
Organizational Research Methods
‘Best Practice’ Articles
“This study explores how researchers in the
organizational sciences use and/or cite
methodological ‘best practice’ (BP) articles.”
“... how we can better improve our digestion and
implementation of best practices as we design
and test research and theory.”
83
© Jens Nachtwei
picture: DALL:E; google fonts
Kreamer, L. M., Albritton, B. H.,
Tonidandel, S., & Rogelberg, S. G.
(2021). The Use and Misuse of
Organizational Research Methods
‘Best Practice’ Articles.
Organizational Research Methods,
0(0).
https://doi.org/ 10.1177/10944281211060
706
remark: volume, issue and pages not
included; advance online publication
(03.01.2023)
p. 1
article citations:
2
journal impact factor:
8.247
author(s) to follow:
Scott Tonidandel
Validity
“Validity refers to the degree to which (empirical)
evidence and theory support the interpretation(s)
of test scores for the intended uses of those or the
degree to which empirical evidence and theory
support the interpretation of results derived from
experiments.”
84
© Jens Nachtwei
picture: DALL:E; google fonts
Ziegler, M., & Lämmle, L. (2017).
Validity. In V. Zeigler-Hill & T.
Shackelford (Eds.), Encyclopedia of
Personality and Individual
Differences (pp. 1-7). Springer.
https://doi.org/ 10.1007/978-3-319-280
99-8_1354-1
p. 1
article citations:
0
journal impact factor:
-
author(s) to follow:
Matthias Ziegler
impact, role of researcher
How can we increase the social impact of
our research?
How do I communicate with the public?
What is the role of the researcher?
85
© Jens Nachtwei
THIRD MISSION
picture: DALL:E
Click on the title of the article on the following slides to go to the article's ResearchGate page.
EJWOP Special Issue: Enhancing
the Social Impact of Research in
Work and Organizational Psy. …”
“Our stated aim in this special issue was to give
researchers and practitioners in work and organi-
zational psychology new and significant insights,
ideas and tools that will enable research in our
field to make a greater positive impact than it has
so far in the world outside academic research.”
86
© Jens Nachtwei
picture: DALL:E; google fonts
Arnold, J., Dries, N., & Gabriel, Y. (2021).
EJWOP Special Issue: Enhancing the
Social Impact of Research in Work
and Organizational Psychology –
Beyond Academia. European
Journal of Work and Organizational
Psychology, 30(3), 329-338.
https://doi.org/ 10.1080/1359432X.2021.
1915293
p. 333
article citations:
5
journal impact factor:
3.968
author(s) to follow:
Yiannis Gabriel
Bringing I-O Psychology to the
Public: But What if We Have
Nothing to Say?
“... unpack the underlying reasons why existing
mechanisms do not allow any relevant and mean-
ingful exchange between theory and practice.
“... how this lack of meaningful interaction hinders
effective communication between scholars,
practitioners, and the broader public in general.”
87
© Jens Nachtwei
picture: DALL:E; google fonts
Orhan, M. A., Bal, P. M., & van
Rossenberg, Y. G. T. (2022). Bringing
IO Psychology to the Public: But
What if We Have Nothing to Say?
Preprint, Unpublished Commentary.
https://dx.doi.org/10.2139/ssrn.403588
5
p. 1
article citations:
1
journal impact factor:
-
author(s) to follow:
P. Matthijs Bal
writing, writing, writing
How do I evaluate journals in which I might
want to publish?
What makes an article interesting?
What writing aids are there?
88
© Jens Nachtwei
PUBLICATION
picture: DALL:E
Click on the title of the article on the following slides to go to the article's ResearchGate page.
Prestige and Relevance of the
Scholarly Journals: Impressions of
SIOP Members
“... the primary indicators of journal prestige (i.e.,
impact factors) do not directly assess audience
admiration.”
Changes in the journal landscape are discussed,
including the rise of OHP as a topic of
concentration in I-O.”
89
© Jens Nachtwei
picture: DALL:E; google fonts
Highhouse, S., Zickar, M., & Melick, S.
(2020). Prestige and Relevance of the
Scholarly Journals: Impressions of
SIOP Members. Industrial and
Organizational Psychology, 13(3),
273-290.
https://doi.org/ 10.1017/iop.2020.2
p. 273
article citations:
35
journal impact factor:
9.375
author(s) to follow:
Scott Highhouse
When I Write My Masterpiece:
Thoughts on What Makes a Paper
Interesting
“This article reflects on academic papers and what
makes them interesting. The author believes
academic papers are like rock and roll bands:
whether an audience finds them interesting is a
matter of perspective, if not taste.”
90
© Jens Nachtwei
picture: DALL:E; google fonts
Barley, S. R. ( 2006). When I Write My
Masterpiece: Thoughts On What
Makes A Paper Interesting. The
Academy of Management Journal,
49(1), 16-20.
https://doi.org/ 10.5465/amj.2006.2078
5495
p. 16
article citations:
111
journal impact factor:
10.979
author(s) to follow:
Stephen R. Barley
Ten Simple Rules for Structuring
Papers
“Focusing on how readers consume information,
we present a set of ten simple rules to help you
communicate the main idea of your paper. These
rules are designed to make your paper more
influential and the process of writing more
efficient and pleasurable.”
91
© Jens Nachtwei
picture: DALL:E; google fonts
Mensh, B., & Kording, K. (2017). Ten
Simple Rules for Structuring Papers.
PLoS Computational Biology, 13(9),
Article e1005619.
https://doi.org/ 10.1371/journal.pcbi.100
5619
p. 1
article citations:
44
journal impact factor:
4.475
author(s) to follow:
-
Open Access Publishing – Noble
Intention, Flawed Reality
“... we highlight some of inequities OA presents for
junior or unfunded researchers, and academics
from resource-poor environments, for whom an
increasing body of evidence shows clear evidence
of discrimination and injustice caused by Article
Processing Charges.”
92
© Jens Nachtwei
picture: DALL:E; google fonts
Frank, J., Foster, R., & Pagliari, C.
(2023). Open Access Publishing –
Noble Intention, Flawed Reality.
Social Science & Medicine, 317,
Article 115592.
https://doi.org/ 10.1016/j.socscimed.20
22.115592
p. 1
article citations:
0
journal impact factor:
4.634
author(s) to follow:
Claudia Pagliari
Journal Article Reporting
Standards for Quantitative
Research in Psychology: …”
“Following a review of extant reporting standards
for scientific publication, and reviewing 10 years of
experience … the APA Working Group on
Quantitative Research Reporting Standards
recommended some modifications to the original
standards.”
93
© Jens Nachtwei
picture: DALL:E; google fonts
Appelbaum, M., Cooper, H., Kline, R.
B., Mayo-Wilson, E., Nezu, A. M., &
Rao, S. M. (2018). Journal Article
Reporting Standards for Quantitative
Research in Psychology: The APA
Publications and Communications
Board Task Force Report. American
Psychologist, 73(1), 3–25.
https://doi.org/ 10.1037/amp0000191
p. 3
article citations:
539
journal impact factor:
16.358
author(s) to follow:
Harris Cooper
Journal Article Reporting
Standards for Qualitative … Mixed
Methods Research in Psy. …”
“This publication marks a historical moment—the
first inclusion of qualitative research in APA Style,
“... standards for both qualitative meta-analysis
and mixed methods research.”
“... present these standards and their rationale …”
94
© Jens Nachtwei
picture: DALL:E; google fonts
Levitt, H. M., Bamberg, M., Creswell, J.
W., Frost, D. M., Josselson, R., &
Suárez-Orozco, C. (2018). Journal
Article Reporting Standards for
Qualitative Primary, Qualitative
Meta-analytic, and Mixed Methods
Research in Psychology: The APA
Publications and Communications
Board Task Force Report. American
Psychologist, 73(1), 26–46.
https://doi.org/ 10.1037/amp0000151
p. 26
article citations:
937
journal impact factor:
16.358
author(s) to follow:
Michael Bamberg
Scientific Utopia: II. Restructuring
Incentives and Practices to Pro-
mote Truth Over Publishability
“Publishing norms emphasize novel, positive
results.”
“This article develops strategies for improving
scientific practices and knowledge accumulation
that account for ordinary human motivations and
biases.”
95
© Jens Nachtwei
picture: DALL:E; google fonts
Nosek, B. A., Spies, J. R., & Motyl, M.
(2012). Scientific Utopia: II.
Restructuring Incentives and
Practices to Promote Truth Over
Publishability. Perspectives on
Psychological Science, 7(6), 615–631.
https://doi.org/ 10.1177/17456916124590
58
p. 615
article citations:
1,055
journal impact factor:
8.190
author(s) to follow:
Brian Nosek
96
How to crack it?
picture: pixabay
(4)
© Jens Nachtwei
Choose a topic you are passionate about!
Read the relevant papers (see RAPTOR).
Write an idea outline and get feedback.
Write an exposé and get feedback.
Fight your way through the empirical work and analysis.
Establish a writing routine that works for you.
Stay in contact with your supervisor - always!
Concentrate on your research, not on the grade.
how to ace your thesis
97
picture: pixabay
98
© Jens Nachtwei
(5) Q & A
Thank you!
Feedback always
welcome!
You have new sources for
RAPTOR? I’d love to see
them!
Best of luck with your
research!
99
© Jens Nachtwei
picture: DALL:E
RAPTOR (Research in Applied Psychology for Teaching: Open Repository)
version 1.0, 04.01.2023
jens.nachtwei@hu-berlin.de
© Jens Nachtwei
Use, modification and distribution of the slides only with the written permission of the copyright holder.