The baby and the bathwater: on the need for substantive–methodological synergy in
organizational research
Joeri Hofmans1*, Alexandre J. S. Morin2, Heiko Breitsohl3 , Eva Ceulemans4, Léandre Alexis
Chénard-Poirier5, Charles C. Driver6, Claude Fernet5, Marylène Gagné7, Nicolas Gillet8, Vicente
González-Romá9, Kevin J. Grimm10, Ellen L. Hamaker11, Kit-Tai Hau12 , Simon A. Houle2, Joshua L.
Howard13, Rex B. Kline2, Evy Kuijpers1, Theresa Leyens1, David Litalien14, Anne Mäkikangas15,
Herbert W. Marsh16, Matthew J. W. McLarnon17, John P. Meyer18, José Navarro19, Elizabeth Olivier20,
Thomas A. O’Neill21, Reinhard Pekrun22, Katariina Salmela-Aro23, Omar N. Solinger24, Sabine
Sonnentag25, Louis Tay26, István Tóth-Király2, Robert J. Vallerand5, Christian Vandenberghe27,
Yvonne G. T. van Rossenberg28, Tim Vantilborgh1, Jasmine Vergauwe29, Jesse T. Vullinghs1, Mo
Wang30, Zhonglin Wen31, and Bart Wille29
1Vrije Universiteit Brussel, Belgium, 2Concordia University, Canada, 3University of Klagenfurt,
Austria, 4KU Leuven, Belgium, 5Université du Québec à Montréal Trois-Rivières, Canada, 6University
of Zurich, Switzerland, 7Curtin University, Australia, 8Université de Tours, France, & Institut
Universitaire de France (IUF), France, 9University of Valencia, Spain, 10Arizona State University,
USA, 11Utrecht University, The Netherlands, 12The Chinese University of Hong Kong, Hong Kong
SAR, 13Monash University, Australia, 14Université Laval, Canada, 15Tampere University, Finland,
16Australian Catholic University, Australia, & The University of Oxford, UK, 17Mount Royal
University, Canada, 18The University of Western Ontario, Canada & Curtin University, Australia,
19University of Barcelona, Spain, 20Université de Montréal, Canada, 21University of Calgary, Canada,
22University of Essex, United Kingdom, and Australian Catholic University, Australia, 23University of
Helsinki, Finland, 24Vrije Universiteit Amsterdam, The Netherlands, 25University of Mannheim,
Germany, 26Purdue University, USA, 27HEC Montreal, Canada, 28Radboud University, the
Netherlands, 29Ghent University, Belgium, 30University of Florida, USA and 31South China Normal
University, China
Acknowledgments: The first and second author contributed equally to this article and their order was
determined at random. Both should be considered first authors. Beyond the first and second authors,
the order of appearance of all other co-authors was determined alphabetically (as a function of their
family names). The second author was supported by a grant from the Social Sciences and Humanities
Research Council of Canada (435-2018-0368).
Corresponding author:
Joeri Hofmans
Vrije Universiteit Brussel, Pleinlaan 2, 1050 Brussel
joeri.hofmans@vub.be
Hofmans, J., Morin, AJS., Breitsohl, H., Ceulemans, E., Chénard-Poirier, LA., Driver, CC., Fernet, C.,
Gagné, M., Gillet, N., González-Romá, V., Grimm, KJ., Hamaker, EL., Hau, K-T., Houle, SA.,
Howard, JL., Kline, RB., Kuijpers, E., Leyens, T., Litalien, D., Mäkikangas, A., Marsh, HW.,
McLarnon, MJW., Meyer, JP., Navarro, J., Olivier, E., O’Neill, TA., Pekrun, R., Salmela-Aro, K.,
Solinger, ON., Sonnentag, S., Tay, L., Tóth-Király, I., Vallerand, RJ., Vandenberghe, C., van
Rossenberg, YGT., Vantilborgh, T., Vergauwe, J., Vullinghs, JT., Wang, M., Wen, Z., and Wille, B.
(2021). The baby and the bathwater: on the need for substantive–methodological synergy in
organizational research. Industrial and Organizational Psychology. doi:
https://doi.org/10.1017/iop.2021.111
© 2021. This paper is not the copy of record and may not exactly replicate the final, authoritative
version of the article published in Industrial and Organizational Psychology. The final authenticated
version is available online at https://doi.org/10.1017/iop.2021.111
The baby and the bathwater: on the need for substantive–methodological synergy in
organizational research
Murphy (2021) argues that the field of Industrial-Organizational (I/O) Psychology needs to pay
more attention to descriptive statistics (“Table 1”; e.g., M, SD, reliability, correlations) when reporting
and interpreting results. We agree that authors need to present a clear and transparent description of
their data and that descriptive statistics and plots can be helpful in making sense of one’s data and
analyses (Tay et al., 2016). Many journals already require this. Although this information can be
presented in the manuscript, more details can be placed in online supplements where there are fewer
space limitations (e.g., detailed presentation and discussion of descriptive statistics, missing data and
outliers, plots and diagrams, conceptual issues, and computer syntax). However, we strongly disagree
with the claim that “increasing complexity and diversity of data-analytic methods in organizational
research has created several problems in our field” (p. 2). This claim suffers from two important
oversights: (1) it neglects the crucial role of methodological fit, or the notion that theory, methods, and
analyses need to be aligned, and (2) it neglects the fact that in I/O research, most constructs are not
directly observable but need to be inferred indirectly through latent variable models. We expand on
both issues, using examples to illustrate that the complexity and diversity of data-analytic methods are not a threat but a blessing for I/O research (and beyond). Finally, we conclude by highlighting the
need for substantive-methodological synergies to solve some of the issues raised by Murphy (2021).
The Importance of Methodological Fit
Methodological fit refers to the “internal consistency among elements of a research project”
(Edmondson & McManus, 2007, p. 1155). It concerns the alignment between research questions, prior
work on the topic, research design, analyses, and contribution (Hamaker et al., 2020). Methodological
fit implies that data-analytic methods need to be attuned to theoretical questions, an idea that is
reflected in the statement “extraordinary claims require extraordinary evidence.” Human behavior is
dynamic, multifaceted, and complex, and it occurs in interaction with equally complex social systems. We
thus need sophisticated designs and methods to capture this complexity if we want to understand
human behavior. In I/O research, this complexity takes several forms (discussed below), placing requirements on our methods that go beyond the consideration of descriptive statistics.
Many phenomena in I/O Psychology are multidetermined. Because many of the phenomena in
which I/O psychologists are interested are multidetermined (involving dispositional, situational, organizational, and societal sources of influence) and involved in complex causal chains (including
mediation and moderation), the idea of methodological fit implies that to address these complex
issues, multivariate data-analytic methods are required. Although several examples can be given, one
that recently gained attention is balanced need satisfaction: individuals with balanced satisfaction of the needs for autonomy, competence, and relatedness should experience higher levels of wellbeing than people with the same aggregate level of, yet less balanced, need satisfaction (Sheldon
& Niemiec, 2006). Importantly, the role of need (im)balance cannot be tested using descriptive
statistics because it necessitates an inherently multivariate approach (Gillet et al., 2020).
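To make the multivariate nature of this hypothesis concrete, the following Python sketch (simulated data; all variable names are illustrative) operationalizes need balance as the sum of absolute pairwise differences among the three need scores, in line with Sheldon and Niemiec (2006), and tests its contribution beyond the aggregate level of need satisfaction:

```python
# Minimal sketch of the balanced need satisfaction hypothesis (Sheldon & Niemiec, 2006).
# All data are simulated; variable names are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500
needs = pd.DataFrame(rng.uniform(1, 7, size=(n, 3)),
                     columns=["autonomy", "competence", "relatedness"])

# Aggregate level of need satisfaction.
needs["aggregate"] = needs[["autonomy", "competence", "relatedness"]].mean(axis=1)
# Imbalance: sum of absolute pairwise differences among the three needs.
needs["imbalance"] = ((needs["autonomy"] - needs["competence"]).abs()
                      + (needs["autonomy"] - needs["relatedness"]).abs()
                      + (needs["competence"] - needs["relatedness"]).abs())

# Simulate wellbeing so that, at equal aggregate satisfaction, imbalance hurts.
needs["wellbeing"] = (3 + 0.5 * needs["aggregate"] - 0.2 * needs["imbalance"]
                      + rng.normal(0, 1, n))

# The balance hypothesis is inherently multivariate: it concerns the joint
# contribution of imbalance beyond the aggregate level of satisfaction.
X = sm.add_constant(needs[["aggregate", "imbalance"]])
print(sm.OLS(needs["wellbeing"], X).fit().params)
```

No single univariate descriptive of the three needs could reveal this joint effect, because balance is defined only over their configuration.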
Much of our data have a nested data structure. Because people typically work in teams, and
teams in organizations, I/O data often have a nested data structure. The same is true when repeated
measurements are nested within individuals. Such data structures create dependencies in the data (e.g.,
the commitment of employees working within the same group is likely to be more similar than that of
employees from different groups; Schreurs et al., 2021). These dependencies need to be considered to
properly analyze such data, necessitating multilevel methods (Morin et al., in press). Two examples
show how crucial methodological fit is for multilevel data. In educational psychology, the Big-Fish-
Little-Pond effect (Marsh et al., 2014) shows that the effect of classroom-level achievement on academic self-concept (negative) differs from that of individual-level achievement (positive) due to
social comparison processes. Likewise, McCormick et al.’s (2020) meta-analysis shows that, when
comparing between-person and within-person associations (repeated measurements), associations are
different across levels of analysis 24.1% of the time. Hence, when dealing with nested data, one needs
to take these dependencies into account, which necessitates multilevel or time-structured models. This
means that raw scores and descriptive statistics alone might not be sufficiently informative.
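To illustrate, the following sketch (simulated data) shows how a pooled correlation that ignores nesting can even take the wrong sign when within-group and between-group associations differ, and how group-mean centering recovers the two level-specific associations:

```python
# Minimal sketch: why pooled statistics can mislead with nested data (simulated).
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n_groups, n_per = 50, 20
group_mean_x = rng.normal(0, 1, n_groups)

rows = []
for g in range(n_groups):
    x = group_mean_x[g] + rng.normal(0, 1, n_per)
    # Within groups the association is positive; between groups it is negative.
    y = (0.5 * (x - group_mean_x[g]) - 0.8 * group_mean_x[g]
         + rng.normal(0, 1, n_per))
    rows.append(pd.DataFrame({"group": g, "x": x, "y": y}))
df = pd.concat(rows, ignore_index=True)

pooled_r = df["x"].corr(df["y"])  # ignores the nesting entirely

# Group-mean centering separates the two levels of analysis.
df["x_within"] = df["x"] - df.groupby("group")["x"].transform("mean")
df["y_within"] = df["y"] - df.groupby("group")["y"].transform("mean")
within_r = df["x_within"].corr(df["y_within"])
between = df.groupby("group")[["x", "y"]].mean()
between_r = between["x"].corr(between["y"])

print(f"pooled r = {pooled_r:.2f}, within r = {within_r:.2f}, "
      f"between r = {between_r:.2f}")
```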
Psychological phenomena are usually dynamic. I/O research is often interested in phenomena
which develop, evolve, and change over time. These phenomena can have a beginning, a development,
an evolution, and a completion, and many of them (e.g., self-esteem) are known to present trait and
state components (Perinelli & Alessandri, 2020). Job attitudes change over time, performance is
dynamic, and affect at work fluctuates considerably, to mention just a few examples. Expecting these
changes and fluctuations to be accurately represented by simple descriptive statistics is, at best,
unrealistic and becomes impossible when facing nonlinearity and inter-individual variation in shape
(e.g., Navarro et al., 2020). Even when considering the relatively simple case of mediation, Murphy (2021) implicitly assumes that we can test mediation from a static perspective, which ignores the fact that most of our theories assume the presence of dynamic psychological processes that unfold over time. For this reason, accurate tests of mediation require longitudinal designs (Cole & Maxwell, 2003) and the ability to disaggregate sources of within-person versus between-person variation.¹

¹ In addition, contrary to Murphy’s (2021) claim, x and y need not be correlated for partial mediation to exist in the form x-m-y, given that the indirect x-y relation can have a different sign than the direct x-y relation, such that the two cancel each other out (Zhao et al., 2010).
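As a small illustration of the trait and state distinction, the following sketch (simulated data) generates affect series for two hypothetical employees with the same trait level but very different within-person dynamics; their means are nearly identical, yet the processes they describe are not:

```python
# Minimal sketch: two employees with the same trait level of affect but very
# different state dynamics (simulated AR(1) series around the trait level).
import numpy as np

rng = np.random.default_rng(2)
T = 200

def ar1(phi, sd, trait):
    # Momentary affect around a trait level, with autocorrelation phi.
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = phi * x[t - 1] + rng.normal(0, sd)
    return trait + x

stable = ar1(phi=0.9, sd=0.2, trait=4.0)    # slow, persistent drifts
volatile = ar1(phi=0.1, sd=1.0, trait=4.0)  # rapid, large fluctuations

for name, series in [("stable", stable), ("volatile", volatile)]:
    lag1 = np.corrcoef(series[:-1], series[1:])[0, 1]
    print(f"{name}: mean = {series.mean():.2f}, sd = {series.std():.2f}, "
          f"lag-1 r = {lag1:.2f}")
```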
The questionable assumption of population homogeneity. Most I/O studies implicitly assume that
a single set of “averaged” parameters can be used to describe the population. Yet, awareness is
growing that this assumption is often too simplistic (Meyer & Morin, 2016). Hofmans et al. (2020)
argue that several of our theories imply population heterogeneity rather than homogeneity: People can
hold distinct configurations on a series of indicators reflecting their career orientation (McLarnon et
al., 2015), their motivation (Tóth-Király et al., 2021), or their commitment (Meyer & Morin, 2016),
and can follow distinct longitudinal trajectories (Fernet et al., 2020). To detect heterogeneity, person-
centered analyses are needed (Meyer & Morin, 2016). For instance, Solinger et al. (2013) revealed that
the bond between newcomers and their organizations could develop in different ways, whereas Morin
et al. (2013) demonstrated the indissociable nature of self-concept levels and stability. Identifying
profiles of employees is also more naturally aligned with managers’ tendencies to think in terms of
categories (Mäkikangas & Kinnunen, 2016). The person-centered approach is thus particularly helpful
to guide intervention strategies tailored to the needs of distinct types of employees, an approach that
has yielded benefits for burnout intervention (Hätinen et al., 2009). If we want to relax the often-
unrealistic assumption of population homogeneity, we need to embrace complex data-analyses.
Moreover, heterogeneity cannot be inferred from the inspection of descriptive statistics and
correlations, which suggests that the interpretation of such sample-level statistics can be misleading.
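For illustration, the sketch below (simulated data) uses a Gaussian mixture model, one standard person-centered technique, as implemented in scikit-learn; in applied work, latent profile analyses are more typically estimated in dedicated software such as Mplus. The sample-level means and correlation completely hide the two underlying profiles, which model comparison via the BIC readily recovers:

```python
# Minimal sketch: detecting population heterogeneity with a mixture model.
# Simulated data; scikit-learn's GaussianMixture stands in for the latent
# profile analyses typically estimated in dedicated software.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)
# Two hidden employee profiles with mirror-image configurations on two indicators.
profile_a = rng.normal(loc=[4.0, 2.0], scale=0.5, size=(150, 2))
profile_b = rng.normal(loc=[2.0, 4.0], scale=0.5, size=(150, 2))
X = np.vstack([profile_a, profile_b])

# The sample-level means and correlation hide the two profiles entirely.
print("sample means:", X.mean(axis=0).round(2))
print("sample correlation:", round(float(np.corrcoef(X.T)[0, 1]), 2))

# Compare 1- to 4-profile solutions using the BIC.
for k in range(1, 5):
    gm = GaussianMixture(n_components=k, random_state=0).fit(X)
    print(f"k = {k}: BIC = {gm.bic(X):.1f}")
```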
Many theories, even simple in appearance, cannot be empirically tested without complex data-
analyses. Many good theoretical models seek to capture the complex nature of human reality,² which requires complex data-analyses. For instance, Lawler’s (1992) theory of empowerment—highlighting
the role of complementariness and coherence in leaders’ empowerment practices—has long been used
in textbooks and as a guide for intervention. However, a proper test of this theory was lacking until Chénard-Poirier et al. (2017) found partial support for its propositions using a hybrid mixture
regression approach. Similarly, to test the dynamic model of the psychological contract properly, dual
regime models are required. Indeed, dual regime models “mimic the theoretical processes underlying
the elicitation of violation feelings via two model components: A binary distribution that models
whether an event in one’s work environment leads to a crossing of the acceptance limits of the
psychological contract, and a count distribution that models how severe the negative impact of this
crossing is” (Hofmans, 2017, p. 8). Lastly, combining many previous issues (person-centered,
multilevel, theoretical complexity), O’Neill et al. (2018) identified distinctive patterns of task, process,
and relationship conflict located at the team level that “variable-centered” studies failed to uncover.
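To convey the logic of dual regime models, the following two-part sketch (simulated data) pairs a binary (logistic) component for whether the acceptance limits are crossed with a count (Poisson) component for the severity of the resulting violation; a full hurdle specification, as in Hofmans (2017), would use a zero-truncated count distribution for the second component:

```python
# Minimal two-part sketch in the spirit of dual regime models (Hofmans, 2017):
# a binary part for whether the acceptance limits are crossed, and a count part
# for the severity of the violation. Simulated data; a full hurdle model would
# use a zero-truncated count distribution for the second part.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 1000
event_negativity = rng.normal(0, 1, n)

# Regime 1 (binary): does the event cross the acceptance limits?
p_cross = 1 / (1 + np.exp(-(event_negativity - 0.5)))
crossed = rng.binomial(1, p_cross)
# Regime 2 (count): severity of violation feelings, only when limits are crossed.
severity = crossed * rng.poisson(np.exp(0.3 + 0.6 * event_negativity))

X = sm.add_constant(pd.DataFrame({"event_negativity": event_negativity}))
binary_part = sm.Logit(crossed, X).fit(disp=False)
count_part = sm.Poisson(severity[crossed == 1], X[crossed == 1]).fit(disp=False)
print(binary_part.params, count_part.params, sep="\n")
```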
As illustrated, merely examining descriptive statistics is not sufficiently informative for many of
our research questions, unless we are ready – as a field – to dramatically simplify our theories. In that
sense, Murphy (2021) seems to argue in favor of theoretical abstraction and simplicity (assuming a
limited set of grand universal laws; Healy, 2017), while we believe that there is also value in
understanding complexity (Tsoukas, 2017). Life is inherently complex – it is interconnected, multi-faceted, paradoxical, ever-changing – and advanced statistics help us come to grips with it.

² Although some theories themselves might be too complex to be truly useful (Saylors & Trafimow, 2021).
The Need for Latent Variable Models
In I/O Psychology, many constructs are not directly observable. We rely on questionnaire data,
where responses to multiple items are assumed to reflect an underlying psychological construct. These
unobservable constructs do not possess readily established measurement units (unlike directly observable attributes such as sex or tenure), but rather units that emerge from our data-analytic models. These units can be a function of the response scale or
the distribution of scores obtained in the sample (standardization) or population (norms) (Meyer &
Morin, 2016). Our measures are thus, by definition, imperfect, which makes descriptive statistics
equally imperfect. Indirect measurement poses several challenges, which can be resolved using latent
variable models. In what follows, we list some of those challenges that illustrate why latent variable
models are critical for advancing our understanding of I/O phenomena.
As a starting point, studies that use multiple indicators to measure any construct should routinely
test the measurement model relating these indicators to the latent factors. Without support for the
measurement of the constructs, subsequent analyses are dubious. The full measurement model should
be presented in sufficient detail to be evaluated, perhaps in online supplements. The latent correlation
matrix among constructs based on this measurement model is more useful and accurate than the
manifest correlation matrix suggested by Murphy (2021). Hence, the measurement model is central both to Murphy's (2021) descriptive goals and as a bridge to more complex models.
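As a minimal illustration, the following sketch estimates such a measurement model and the latent association it implies, assuming the third-party Python package semopy (which uses lavaan-style syntax); in practice, these models are more commonly estimated with lavaan in R or with Mplus. Constructs, items, and data are simulated and purely illustrative:

```python
# Minimal CFA sketch: the latent association between two constructs, each
# measured with three noisy indicators. Assumes the third-party semopy package;
# data, constructs, and item names are simulated and illustrative.
import numpy as np
import pandas as pd
from semopy import Model

rng = np.random.default_rng(5)
n = 400
job_sat = rng.normal(0, 1, n)
commit = 0.5 * job_sat + rng.normal(0, 1, n)  # true latent correlation ~ .45

def items(factor, prefix):
    # Three noisy indicators per construct (reliability < 1 by design).
    return {f"{prefix}{i}": factor + rng.normal(0, 1, n) for i in (1, 2, 3)}

df = pd.DataFrame({**items(job_sat, "js"), **items(commit, "c")})

model = Model("""
JobSat =~ js1 + js2 + js3
Commit =~ c1 + c2 + c3
JobSat ~~ Commit
""")
model.fit(df)
print(model.inspect())  # latent covariance vs. attenuated manifest correlations
```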
Random measurement error. Due to the imperfect nature of our measures, manifest scores
contain random measurement error, which attenuates our estimates of associations between variables.
In most research settings, reliable measurement (i.e., true score variance: the total variance minus the
variance due to random measurement error) is reflected in the covariance among ratings obtained
across various indicators of the same construct, whereas the unique part of each indicator incorporates
random measurement error. However, in non-latent analytic models, these two sources of variance are
conflated. Although latent variable models naturally separate them, allowing for tests of associations
corrected for random measurement error, properly accounting for measurement error is far more
complex than simply relying on latent variable models. For example, Marsh and Hau (1996)
demonstrated that test-retest correlations based on manifest variables tend to be positively biased by
the failure to account for longitudinal correlated uniquenesses among the matching indicators used
repeatedly over time. Marsh et al. (2010a) similarly demonstrated the need to account for wording
effects (e.g., negative wording, parallel wording) to achieve an accurate representation of the structure
of our constructs. With multilevel data, measurement error due to “inter-item agreement” occurs separately across levels of analysis, and a lack of “inter-rater agreement” between members of the higher-level reality (e.g., a workgroup) is also likely to bias measurement (Morin et al., in press). To make
matters worse, Marsh et al. (2010b) demonstrated that rather than attenuating associations, multilevel
sources of measurement error, in combination, could create artificial associations between constructs
(referred to as phantom effects). Finally, despite their well-controlled nature, laboratory experiments
are subject to the same challenges, including biased estimates of intervention effects (Breitsohl, 2019).
These examples all demonstrate that descriptive results based on manifest variables can be misleading
when working with unobservable constructs.
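A minimal simulation makes the attenuation point concrete: with reliabilities of .70 for both measures, a true correlation of .50 shrinks to about .35 at the manifest level, consistent with Spearman's classical disattenuation formula:

```python
# Minimal sketch: random measurement error attenuates manifest correlations.
# Simulated data; the correction is Spearman's classical disattenuation formula.
import numpy as np

rng = np.random.default_rng(11)
n = 100_000
true_x = rng.normal(0, 1, n)
true_y = 0.5 * true_x + rng.normal(0, np.sqrt(0.75), n)  # true r = .50

rel_x, rel_y = 0.7, 0.7  # reliabilities of the manifest scores
obs_x = true_x + rng.normal(0, np.sqrt(1 / rel_x - 1), n)
obs_y = true_y + rng.normal(0, np.sqrt(1 / rel_y - 1), n)

r_true = np.corrcoef(true_x, true_y)[0, 1]
r_obs = np.corrcoef(obs_x, obs_y)[0, 1]
r_corrected = r_obs / np.sqrt(rel_x * rel_y)  # disattenuation
print(f"true r = {r_true:.2f}, observed r = {r_obs:.2f}, "
      f"corrected r = {r_corrected:.2f}")
```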
Disentangling distinct sources of variance. Recent developments have shown that measurement
issues tend to be far more complex than previously thought (Morin et al., 2020). For example,
statistical research has highlighted the need to account for distinct forms of true score variance in
conceptually-related and hierarchically-ordered constructs (Morin et al., 2017). Exploratory structural
equation modeling has been recommended as a way to account for the presence of conceptually-
related constructs by incorporating cross-loadings (Asparouhov et al., 2015), while bifactor modeling
has been recommended for hierarchically-ordered constructs (Morin et al., 2017). In research
involving ratings from different sources (e.g., teams, supervisors, assessment centers), more complex
latent variable models and mixed effects models for cross-classified data are needed to isolate the
multiple sources of variability present in these ratings (O’Neill et al., 2015). Such latent variable
techniques have advanced our knowledge of measurement in a way that would have been impossible
using descriptive statistics, in addition to helping model the complexity of real-world phenomena.
Improved techniques for testing moderation. Moderator effects are key to many models and
theories. Murphy (2021) rightly argues that tests of moderation suffer from several issues, including
low reliability (and associated low power) of the interaction term. However, rather than taking a step
back and reverting to descriptive statistics, it is equally valuable to take a step forward and work with
latent variable models, which make it possible to tackle moderation in a way that accounts for
unreliability (Marsh et al., 2013). Once again, why not use the strengths of our complex data-analytic methods, which offer clear solutions to many of the issues that we face in our field?
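The severity of the problem is easy to demonstrate: because the reliability of a product term is roughly the product of its components' reliabilities (for uncorrelated, centered predictors), manifest-score moderation estimates shrink disproportionately. The following sketch (simulated data) illustrates this attenuation; latent interaction models of the kind reviewed by Marsh et al. (2013) are designed to avoid it:

```python
# Minimal sketch: measurement error hits product (moderation) terms especially
# hard. Simulated data; manifest-score regression only, for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(13)
n = 100_000
x, z = rng.normal(0, 1, n), rng.normal(0, 1, n)
y = 0.3 * x + 0.3 * z + 0.3 * x * z + rng.normal(0, 1, n)

def attenuated(v, rel):
    # Add random error so the manifest score has the given reliability.
    return v + rng.normal(0, np.sqrt(1 / rel - 1), n)

ox, oz = attenuated(x, 0.7), attenuated(z, 0.7)
X = sm.add_constant(np.column_stack([ox, oz, ox * oz]))
# The interaction estimate shrinks to roughly .3 * (.7 * .7), well below .3.
print(sm.OLS(y, X).fit().params)
```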
The Baby and the Bathwater: The Need for Substantive-Methodological Synergy
Having argued that complex data-analytic models are critical to address the complex questions
that we ask in I/O research, we share Murphy’s (2021) concern about the incorrect application and
interpretation of those methods and the growing science-practice gap. These problems have been recognized for decades (see Borsboom, 2006; Marsh & Hau, 2007) and can be tied to multiple issues,
including (a) the lack of proper statistical training in graduate school, (b) the fact that applied researchers often struggle to keep pace with fast-paced methodological innovations and the equally fast-paced evolution of their theoretical fields after graduate school, (c) the fact that methodological experts sometimes lose sight of the true needs of applied researchers, and (d) the tendency of some experts to present statistical innovations in a formal (equation-based) manner that falls beyond the understanding of
applied researchers. However, Murphy (2021) seems to fail to recognize that “simple” techniques are
only seemingly simpler because they make more simplifying assumptions, many of which may be
false (oversimplification). In other words, the “burden of assumptions” may be heavier with simple
techniques. Moreover, when pushed to the extreme, resorting to descriptive statistics to capture reality
may come to imply that inferential statistics and modeling should be replaced by storytelling. Despite
our sympathy with Murphy’s claim that more attention should be afforded to descriptive statistics, we
thus believe in an alternative and more creative route: Substantive-methodological synergies.
The term substantive-methodological synergy has been proposed by Marsh and Hau (2007) to
describe joint ventures in which new methods provide novel insights into important substantive
issues. Such joint ventures involve collaboration between substantive and methodological experts to
ensure that (a) the methods are applied correctly to match the needs of the research area and that the
findings are translated in a meaningful way to applied researchers and practitioners, but also that (b)
new methodological developments are connected to the needs of applied researchers and translated in a
way that makes sense to them. For example, the growing field of Big Data and Data Science in I/O
Psychology requires both theoretical and methodological sophistication (Woo et al., 2020). Thus,
rather than reverting to the simplest tools, I/O scholars need to acknowledge that human behavior is
complex and that advanced methods are needed to capture that complexity. Unfortunately, true
substantive-methodological synergies (i.e., articles where theory and methods are positioned as dual
objectives) remain unwelcome in many I/O journals, due to the false idea that articles should tackle
one main objective and that methodologically-oriented articles should be sent to methodological
journals. Moreover, providing complete and accurate coverage and interpretation of both theoretical
and methodological components typically requires more space than that available in I/O journals.
We hope that this article might reduce these obstacles and pave the way for substantive-
methodological synergies in I/O research. Rather than throwing the baby (i.e., proper statistical
modeling) out with the soiled bathwater (the challenges posed by the correct application and
interpretation of these methods), substantive-methodological synergies make it possible for I/O
Psychologists to solve the true problems raised by Murphy (2021), without sacrificing the theoretical
richness of our field. Researchers should not shy away from complex methods, which were born out of a genuine need to model human complexity. We do not see these synergies as the only way forward, but as one of many (including, e.g., better statistical training) that may help reduce the ever-increasing gap between statistical developments and applied research. We are also not
claiming that “simple” research has no value, nor that fundamental statistical developments should
stop. Rather, we are simply claiming that more work is needed to bridge the two.
Finally, sharing Murphy’s concern about the science-practice gap, we agree that we need to
educate ourselves and our readers better regarding the functionality and interpretability of our
models and make a concerted effort to explain how our findings can inform practice. If anything, the
need for clarity increases as questions and analyses become more complex. However, clarity is not
synonymous with simplicity and, while we agree that “Table 1” is important, we have provided several examples showing that “simple” statistics can be misleading and/or inappropriate for the research
question at hand. It is thus incumbent upon researchers seeking substantive-methodological synergy to keep in mind the need to communicate their findings clearly.
References
Asparouhov, T., Muthén, B.O., & Morin, A.J.S. (2015). Bayesian structural equation modeling with
cross-loadings and residual covariances. Comments on Stromeyer et al. Journal of
Management, 41(6), 1561-1577.
Borsboom, D. (2006). The attack of the psychometricians. Psychometrika, 71(3), 425-440.
Breitsohl, H. (2019). Beyond ANOVA: An introduction to structural equation models for
experimental designs. Organizational Research Methods, 22(3), 649-677.
Chénard-Poirier, L.A., Morin, A.J.S., & Boudrias, J.S. (2017). On the merits of coherent leadership
empowerment behaviors: A mixture regression approach. Journal of Vocational Behavior, 103, 66-75.
Cole, D.A., & Maxwell, S.E. (2003). Testing mediational models with longitudinal data. Journal of
Abnormal Psychology, 112(4), 558-577.
Edmondson, A.C., & McManus, S.E. (2007). Methodological fit in management field research.
Academy of Management Review, 32(4), 1155-1179.
Fernet, C., Morin, A.J.S., Austin, S., Gagné, M., Litalien, D., Lavoie-Tremblay, M., & Forest, J.
(2020). Self-determination trajectories at work: A growth mixture analysis. Journal of
Vocational Behavior, 121, 103473.
Gillet, N., Morin, A.J.S., Huart, I., Colombat, P., & Fouquereau, E. (2020). The forest and the trees:
Investigating the globality and specificity of employees’ basic need satisfaction at
work. Journal of Personality Assessment, 102(5), 702-713.
Hamaker, E.L., Mulder, J.D., & van IJzendoorn, M.H. (2020). Description, prediction and causation: Methodological challenges of studying child and adolescent development. Developmental Cognitive Neuroscience, 46, 100867.
Hätinen, M., Kinnunen, U., Mäkikangas, A., Kalimo, R., Tolvanen, A., & Pekkonen, M. (2009).
Burnout during a long-term rehabilitation: Comparing low burnout, high burnout-benefited, and high burnout-not benefited trajectories. Anxiety, Stress & Coping, 22(3), 341–360.
Healy, K. (2017). Fuck nuance. Sociological Theory, 35(2), 118-127.
Hofmans J. (2017). Modeling psychological contract violation using dual regime models: An event-
based approach. Frontiers in Psychology, 8, 1948.
Hofmans, J., Wille, B., & Schreurs, B. (2020). Person-centered methods in vocational research.
Journal of Vocational Behavior, 118, 103398.
Lawler, E.E. (1992). The ultimate advantage: Creating the high involvement organization. Jossey-Bass.
Mäkikangas, A., & Kinnunen, U. (2016). The person-oriented approach to burnout: A systematic
review. Burnout Research, 3(1), 11-23.
Marsh, H.W., & Hau, K-T. (1996). Assessing goodness of fit: Is parsimony always desirable? Journal
of Experimental Education, 64(4), 364-390.
Marsh, H.W., & Hau, K-T. (2007). Applications of latent-variable models in educational psychology: The
need for methodological-substantive synergies. Contemporary Educational Psychology, 32, 151-171.
Marsh, H.W., Hau, K.-T., Wen, Z., Nagengast, B., & Morin, A.J.S. (2013). Moderation. In T.D. Little
(Ed.), Oxford Handbook of Quantitative Methods, Vol. 2 (pp. 361-386). Oxford University Press.
Marsh, H.W., Kuyper, H., Morin, A.J.S., Parker, P.D., & Seaton, M. (2014). Big-fish-little-pond
social comparison and local dominance effects: Integrating new statistical models,
methodology, design, theory and substantive implications. Learning & Instruction, 33, 50-66.
Marsh, H.W., Scalas, L.F., & Nagengast, B. (2010a). Longitudinal tests of competing factor structures
for the Rosenberg self-esteem scale. Psychological Assessment, 22(2), 366-381.
Marsh, H. W., Seaton, M., Kuyper, H., Dumas, F., Huguet, P., Regner, I., Buunk, A. P., Monteil, J.-M., Blanton, H., & Gibbons, F. X. (2010b). Phantom behavioral assimilation effects: Systematic
biases in social comparison choice studies. Journal of Personality, 78(2), 671-710.
McCormick, B.W., Reeves, C.J., Downes, P.E., Li, N., & Ilies, R. (2020). Scientific contributions of
within-person research in management. Journal of Management, 46(2), 321-350.
McLarnon, M.J.W., Carswell, J.J., & Schneider, T.J. (2015). A case of mistaken identity? Latent
profiles in vocational interests. Journal of Career Assessment, 23(1), 166-185.
Meyer, J.P., & Morin, A.J.S. (2016). A person-centered approach to commitment research: Theory,
research, and methodology. Journal of Organizational Behavior, 37(4), 584-612.
Morin, A.J.S., Blais, A.-R., & Chénard-Poirier, L.-A. (2021). Doubly latent multilevel procedures for
organizational assessment and prediction. Journal of Business and
Psychology. https://doi.org/10.1007/s10869-021-09736-5
Morin, A.J.S., Boudrias, J.-S., Marsh, H.W., McInerney, D.M., Dagenais-Desmarais, V., Madore, I.,
& Litalien, D. (2017). Complementary variable- and person-centered approaches to exploring
the dimensionality of psychometric constructs: Application to psychological wellbeing at
work. Journal of Business and Psychology, 32, 395-419.
Morin, A.J.S., Maïano, C., Marsh, H.W., Nagengast, B., & Janosz, M. (2013). School life and
adolescents’ self-esteem trajectories. Child Development, 84(6), 1967-1988.
Morin, A.J.S., Myers, N.D., & Lee, S. (2020). Modern factor analytic techniques: Bifactor models,
exploratory structural equation modeling (ESEM) and bifactor-ESEM. In G. Tenenbaum &
R.C. Eklund (Eds.), Handbook of Sport Psychology, 4th Edition (pp. 1044-1073). London, UK: Wiley.
Murphy, K. R. (2021). In praise of table 1: The importance of making better use of descriptive
statistics. Industrial and Organizational Psychology: Perspectives on Science and Practice,
14(4), XX–XX.
Navarro, J., Rueff-Lopes, R., & Rico, R. (2020). New nonlinear and dynamic avenues for the study of work and organizational psychology: An introduction to the special issue. European Journal of
Work and Organizational Psychology, 29(4), 477-482.
O’Neill, T.A., McLarnon, M.J.W., & Carswell, J.J. (2015). Variance components of job performance
ratings. Human Performance, 28(1), 66-91.
O’Neill, T.A., McLarnon, M.J.W., Hoffart, G.C., Woodley, H.J., & Allen, N.J. (2018). The structure
and function of team conflict profiles. Journal of Management, 44(1), 811-836.
Perinelli, E., & Alessandri, G. (2020). A latent state-trait analysis of global self-esteem: A
reconsideration of its state-like component in an organizational setting. International Journal of
Selection and Assessment, 28(4), 465-483.
Saylors, R., & Trafimow, D. (2021). Why the increasing use of complex causal models is a problem:
On the danger sophisticated theoretical narratives pose to truth. Organizational Research
Methods, 24(3), 616-629.
Schreurs, B., Hofmans, J., & Wille, B. (2021). Multilevel modeling for careers research. In W. Murphy & J. Tosti-Kharas (Eds.), Handbook for research methods in careers (pp. 210-234). Edward Elgar.
Sheldon, K., & Niemiec, C. (2006). It’s not just the amount that counts: Balanced need satisfaction also affects well-being. Journal of Personality and Social Psychology, 91(2), 331-341.
Solinger, O.N., van Olffen, W., Roe, R.A., & Hofmans, J. (2013). On becoming (un)committed: A
taxonomy and test of newcomer onboarding scenarios. Organization Science, 24(6), 1640–
1661.
Tay, L., Parrigon, S., Huang, Q., & LeBreton, J.M. (2016). Graphical descriptives: A way to improve
data transparency and methodological rigor in psychology. Perspectives on Psychological
Science, 11(5), 692–701.
Tóth-Király, I., Morin, A.J.S., Bőthe, B., Rigó, A., & Orosz, G. (2021). Toward an improved understanding
of work motivation profiles. Applied Psychology: An International Review, 70(3), 986-1017.
Tsoukas, H. (2017). Don't simplify, complexify: From disjunctive to conjunctive theorizing in organization
and management studies. Journal of Management Studies, 54(2), 132-153.
Woo, S.E., Tay, L., & Proctor, R.W. (Eds.). (2020). Big data in psychological research. American
Psychological Association.
Zhao, X., Lynch, J.G., & Chen, Q. (2010). Reconsidering Baron and Kenny: Myths and truths about mediation analysis. Journal of Consumer Research, 37(2), 197-206.