Highly automated interviews: Applicant reactions and the organizational context
Markus Langer^a, Cornelius J. König^a, Diana R. Sanchez^b, and Sören Samadi^a
^a Universität des Saarlandes
^b San Francisco State University
Markus Langer (https://orcid.org/0000-0002-8165-1803), Cornelius J. König
(https://orcid.org/0000-0003-0477-8293), and Sören Samadi, Fachrichtung Psychologie,
Universität des Saarlandes, Saarbrücken, Germany.
Diana R. Sanchez, Workplace Technology Research Lab, San Francisco State University.
This work was supported by the German Federal Ministry for Education and Research
(BMBF; Project EmpaT, grant no. 16SV7229K). We thank the people from Charamel GmbH
for their continuous support and for providing us with the virtual character Gloria.
Correspondence concerning this article should be addressed to Markus Langer, Universität
des Saarlandes, Arbeits- & Organisationspsychologie, Campus A1 3, 66123 Saarbrücken,
Germany. Tel: +49 681 302 4767. E-mail: markus.langer@uni-saarland.de
This preprint version may not exactly replicate the final version published in the Journal of
Managerial Psychology. Copyright: Langer, M., König, C.J., Sanchez, D.R., & Samadi, S. (in
press). Highly automated interviews: Applicant reactions and the organizational context. Journal
of Managerial Psychology.
AUTOMATED INTERVIEWS AND THE ORGANIZATIONAL CONTEXT 2
ABSTRACT
Purpose: The technological evolution of job interviews continues as highly automated
interviews emerge as alternative approaches. Initial evidence shows that applicants react
negatively to such interviews. Additionally, there is emerging evidence that contextual
influences matter when investigating applicant reactions to highly automated interviews.
However, previous research has ignored higher-level organizational contexts (i.e., which kind
of organization uses the selection procedure) and individual differences (e.g., work
experience) regarding applicant reactions. This study therefore investigates applicant
reactions to highly automated interviews for students and employees and the role of the
organizational context when using such interviews.
Methodology: In a 2×2 online study, participants read organizational descriptions of either
an innovative or an established organization and watched a video displaying a highly
automated or a videoconference interview. Afterwards, participants responded to applicant
reaction items.
Findings: Participants (N = 148) perceived highly automated interviews as more consistent
but as conveying less social presence. The negative effect on social presence diminished
organizational attractiveness. The organizational context did not affect applicant reactions to
the interview approaches, whereas differences between students and employees emerged but
only affected privacy concerns to the interview approaches.
Research implications: The organizational context seems to have negligible effects on
applicant reactions to technology-enhanced interviews. There were only small differences
between students and employees regarding applicant reactions.
Practical implications: In a tense labor market, hiring managers need to be aware of a trade-off between efficiency and applicant reactions regarding technology-enhanced interviews.
Originality: This study investigates high-level contextual influences and individual
differences regarding applicant reactions to highly automated interviews.
Keywords: personnel selection, job interview technology, automatic applicant assessment,
applicant reactions, organizational context
Introduction
Organizations modernize their processes to stay up-to-date and to convey an
innovative and attractive image (Chapman and Webster, 2003; Gatewood et al., 1993). This
modernization affects many management processes as well as personnel selection procedures.
For instance, organizations use digital interviews (applicants record responses to interview
questions and send them to the hiring organization; Torres and Mejia, 2017) to show that they
are attractive employers (cf., Chapman and Webster, 2003). However, previous research in
the area of job interviews has predominantly found that technologically-advanced interviews
detrimentally affect applicant reactions (Blacksmith et al., 2016; Langer et al., 2017).
Nonetheless, the technological evolution of the interview continues. Currently, the use of
highly automated interviews is burgeoning (Langer et al., 2019). Within such interviews,
sensors (cameras, microphones) in combination with algorithms and virtual visualization
automate the entire interview process (i.e., acquiring information about applicants, evaluating applicants' performance, and implementing actions such as the automatic selection of follow-up questions, using virtual interviewers; cf., Langer et al., 2019).
So far, little is known about the effects of highly automated interviews on applicant
reactions. A study by Langer et al. (2019) indicates that applicant reactions might become
more negative the more interviews are automated. Additionally, research has emerged showing
that task-level contextual influences affect people’s reactions to highly automated tools (e.g.,
people react more favorably to the automation of mechanical tasks [work scheduling]
compared to human tasks [hiring], M.K. Lee, 2018). Up to now, higher-level organizational
contexts have been ignored within these studies. More precisely, in previous designs,
participants either applied for a single (hypothetical) organization (e.g., Langer et al., 2019)
or did not receive any information about the hiring organization (e.g., Sears et al., 2013). Yet
in reality, applicants inform themselves about organizations, associate organizations with
specific attributes (e.g., established or innovative; cf., Slaughter and Greguras, 2009), and
evaluate organizational attractiveness based on their perceived person-organization fit
(Chapman et al., 2005). It follows that the negative effects of automated interviews may only occur in situations where applicants do not expect innovative selection procedures, implying that highly automated interviews may be more accepted within the selection processes of innovative organizations.
The goals of the current study are therefore (a) to further investigate the effects of
highly automated interviews on applicant reactions, (b) to clarify if the organizational context
interacts with applicant reactions in a way that highly automated interviews are more
accepted for innovative organizations. Finally, this study addresses a limitation of prior
studies that used student samples (e.g., Langer et al., 2019) by collecting a more diverse
sample to investigate differences in reactions to automated tools between students and
employees. To achieve these goals, this study introduced students and full-time employees to
one of two organizations differing in their organizational description (innovative vs.
established organization) and to one of two interview approaches (highly automated
interview vs. videoconference interview).
Background and Hypotheses
Automation of job interviews
Technology for interviews has evolved in a way that established technology-mediated
interview approaches (e.g., videoconference interviews) appear rather old-fashioned. For
instance, within digital interviews, applicants respond via voice or video recordings (Langer
et al., 2017), and hiring managers can evaluate these recordings whenever they want. Using
machine learning algorithms, these recordings can also be evaluated automatically. For
example, the German company Precire automatically evaluates applicants’ voice recordings
(Precire, 2018), whereas the American company HireVue (HireVue, 2018) additionally
evaluates applicants' nonverbal behavior (e.g., smiling). There have also been initial attempts to use virtual characters as interviewers to enhance the interpersonal touch of highly automated interviews (cf., Langer et al., 2018; K.M. Lee and Nass, 2003).
These approaches have in common that they automate parts of the interview, up to the entire interview process (Langer et al., 2019). Langer et al. (2019) used Parasuraman et al.'s (2000) model of automation to describe the underlying idea behind highly automated
interviews. Automating interviews includes four processes that can vary from low to high
levels of automation: acquiring information, analyzing information, selecting and deciding
about actions, and implementing these actions (Parasuraman et al., 2000). Acquiring
information is the automation of collecting and extracting data (e.g., record interviews;
automatically extract [non]verbal information). Analyzing information means to evaluate the
automatically acquired data (e.g., having algorithms that evaluate applicant performance).
Selecting and deciding about actions means to build on the information analysis to decide
about further steps (e.g., automatically selecting follow-up questions). The final step of
automation is then to automatically implement these actions (e.g., present follow-up questions) and to set up the interview (e.g., in a virtual environment). Similar to Langer et
al. (2019), this study introduces participants to a highly automated interview which includes
technology that allows high-level automation for every aforementioned step.
It is important to note that previous research tentatively supports the value of highly
automated interviews. For instance, Naim and colleagues (2015) found that automatic
evaluation of applicant behavior can predict interview performance. Nonetheless, there is a need for research on highly automated interviews that expands beyond questions of validity to ethical questions and to applicant reactions to such selection procedures in different contexts (Langer et al., 2019).
Applicant reactions to automated interviews
Most applicant reaction research is based on Gilliland’s (1993) work on applicant
reactions to selection systems. Gilliland describes several procedural and distributive justice
rules. For instance, procedural justice is given if selection procedures appear job related, and
if they are administered to all applicants consistently. Distributive justice is given if
applicants perceive that they were able to influence outcomes of selection situations through
their behavior. Adhering to the aforementioned justice rules within selection procedures
should positively affect important outcomes such as organizational attractiveness (Chapman
et al., 2005). Since the current study investigates the reactions to different kinds of interview
procedures, it focuses on the procedural part of Gilliland’s model.
Previous research has shown that technology-enhanced interviews can enhance
efficiency and flexibility but can also lead to negative reactions (Blacksmith et al., 2016). For
instance, videoconference interviews are perceived as less fair and offering less opportunity
to perform than face-to-face interviews (Chapman et al., 2003; Sears et al., 2013).
Furthermore, highly automated interviews seem to evoke even less favorable reactions than
videoconference interviews (Langer et al., 2019) because of lower social presence.
Potosky’s (2008) framework of media attributes provides ideas about what
distinguishes different kinds of technology-enhanced interviews. Her framework covers four
attributes of communication media: social bandwidth, interactivity, transparency, and
surveillance. Social bandwidth describes the extent to which a medium allows users to send and
receive verbal and nonverbal information (Potosky, 2008). Videoconference interviews
provide social bandwidth as applicants are able to exchange a variety of social signals
(Chapman et al., 2003). Although the current paper describes a highly automated interview in which applicants also send and receive communicative information, even the most advanced technology does not yet allow communication equivalent to in-person interpersonal communication. For example, it is possible to let a virtual character smile. However, this
smile might still appear less natural than a human smile (cf., Kätsyri et al., 2015). Therefore,
social bandwidth should be lower in highly automated interviews. Interactivity in Potosky’s
framework describes the possibility for direct interactions. As it is the case for social
bandwidth, highly automated interviews (especially versions with virtual characters) offer
interactivity to some extent. However, they are not as interactive as a conversation with a
human being through videoconferencing. For instance, they provide less room for
backchanneling (e.g., nodding; Frauendorfer et al., 2014). The third aspect of Potosky’s
framework is transparency. Communication media are transparent if there are no (technical)
issues during communication and if people can ignore the fact that they are using media to
communicate (Potosky, 2008). Highly automated interviews likely remind people that they
are using media to communicate, whereas in the course of videoconference interviews, the
medium might become less apparent when applicants start to concentrate on the interviewer
(Langer et al., 2017). The last aspect of Potosky's framework is surveillance. It describes perceptions of how likely it is that a third party can access information about the communication between the communicating parties (Potosky, 2008). For example, applicants
might fear that recordings of highly automated interviews are later watched by unauthorized
people (Langer et al., 2017).
Furthermore, highly automated and videoconference interviews might differ regarding
the interview set-up. This study follows the example of Langer et al. (2019) and uses a virtual
set-up with a virtual character. Even though virtual characters may positively affect social presence (K.M. Lee and Nass, 2003), humans in videoconference interviews might still convey more social presence. Furthermore, there might be negative effects if perceptions of a
virtual character fall into the uncanny valley (Mori et al., 2012). The uncanny valley
hypothesis proposes that realistic virtual characters might evoke negative feelings in humans,
possibly because there is a perceptual mismatch for the virtual characters’ behavior or
appearance (e.g., strange body proportions; Kätsyri et al., 2015).
Potosky's framework suggests that highly automated interviews offer less social bandwidth, interactivity, and transparency, and that they might evoke more pronounced feelings of surveillance; a virtual interviewer might also negatively affect applicant reactions. This predominantly points to negative applicant reactions towards highly automated interviews.
There might be negative consequences regarding perceived social presence within highly
automated interviews. Humans perceive social presence if there is interpersonal warmth and empathy during an interaction (Walter et al., 2015). Usually, interaction partners convey this warmth through the exchange of nonverbal communication (Chapman et al., 2003). Therefore,
applicants might feel less social presence in highly automated interviews. Furthermore, the differences regarding Potosky's aspects of transparency and surveillance might heighten applicants' privacy concerns (i.e., concerns about data abuse; Smith et al., 2011). If applicants are constantly aware that they will submit videos of themselves to an
organization without knowledge of who will access them, this might raise privacy concerns
(Langer et al., 2017).
In spite of these potential negative effects of highly automated interviews on applicant
reactions, people also tend to believe that computers are more objective than humans (Miller et al., 2018). Applicants might believe that highly automated interviews "treat" all applicants in the same way, in contrast to videoconference interviews, where interviewers' behaviors are likely influenced by applicants' characteristics (e.g., appearance; Pingitore et al., 1994).
Based on the aforementioned theoretical assumptions, we propose:
Hypothesis 1: Participants will evaluate the highly automated interview as conveying less
social presence, being more consistent but evoking more privacy concerns.
Contextual influences on applicant reactions
Support for Hypothesis 1 would replicate the findings by prior research (e.g., Langer
et al. 2019; M.K. Lee, 2018). To advance research and inform practice, this study builds on
emerging research regarding contextual influences on applicant reactions. M.K. Lee (2018) showed that people prefer human influence for human tasks, using hiring and work evaluation as examples of human tasks (in comparison to work scheduling and work assignment as mechanical tasks). Langer et al. (2019) found that people favor automated tools for low-stakes tasks (i.e., training vs. personnel selection). Consequently, the implications of these studies remain on the task level, ignoring important higher-level influences such as the organizational context. If the organizational context affects people's reactions, this
could lead to valuable insights because these effects might translate to many tasks within the
organizational context.
Applicants ascribe attributes to organizations (Slaughter and Greguras, 2009). One
important attribute of organizations, one which might affect which selection procedures they use, is innovation (Lievens and Highhouse, 2003). Original and creative organizations whose success relies on technological innovation are examples of innovative organizations (Slaughter et al., 2004). They might operate in dynamic markets, where they need to adapt to
technological changes (e.g., computer science). In comparison, other organizations might be
perceived as more stable and established – organizations that operate in industry sectors in
which continuity is valued by customers (e.g., insurance industry) (Slaughter and Greguras,
2009) and where there is also potentially less innovation. Previous research suggests that the
organizational context is a determining factor for applicants’ attraction to a company
(Chapman et al., 2005). One reason is that applicants feel attracted to organizations where
they perceive themselves as a good fit (i.e., person-organization fit; Chapman et al., 2005).
For instance, some applicants feel attracted to innovative organizations because they like the
dynamic environment which requires adaptation to changes, whereas others prefer more
stable environments within established organizations (Slaughter and Greguras, 2009).
Despite the importance of the organizational context, the studies by M.K. Lee (2018) and Langer et al. (2019) were not the only ones to leave it out. Similarly, Sears and colleagues (2013) evaluated participants' reactions to different interview approaches without providing any details on the hiring organization. In application situations, however,
applicants use multiple sources to learn about the organization and to determine
organizations’ attractiveness as employers (Chapman et al., 2005). Applicants inform
themselves through the organizations’ homepage, job ads, and the selection procedures
organizations use (Gatewood et al., 1993). For instance, applicants perceive organizations
using digital interviews as more innovative (Torres and Mejia, 2017).
Even more importantly, there might be cases where applicants’ perceptions of an
organization and applicants’ perceptions of the applied selection procedures diverge
(Gatewood et al., 1993). For instance, before applicants enter a selection situation, they might
expect an organization to be rather traditional. If they are then confronted with innovative
selection procedures (e.g., highly automated interviews), this could violate applicants’
expectations of the organization, as they had expected an established organization to also use
established selection procedures (e.g., face-to-face interviews). Thus, the perceived job
relatedness of a selection procedure might differ depending on the organizational context. In
the case of innovative organizations, applicants might believe that innovative selection
approaches reveal something about the future job at this organization. As attraction to
organizations depends on perceived person-organization fit (Chapman et al., 2005),
applicants in search for an established (innovative) organizational environment might be
irritated by experiencing an innovative (established) selection approach, which could then
negatively affect organizational attractiveness. Thus, we propose:
Hypothesis 2: Participants will perceive selection procedures as more job related and the
organization as more attractive when the selection procedure matches the perceived attributes
of an organization (e.g., highly automated selection in the selection process of an innovative
organization).
Finally, this study addresses a limitation from prior research. Previous studies used
student samples to investigate applicant reactions to technology-enhanced interviews (e.g.,
Sears et al., 2013), and there is speculation about whether reactions to highly automated interviews might diverge between students and the working population (Langer et al., 2019). It might be that students are more open to technology-enhanced approaches; they might believe that such approaches are more job related than people who already have a job do; and there might be differences in privacy concerns because of underlying individual differences between students and employees (cf., Bauer et al., 2006). As most of these assumptions are speculative, we ask:
Research Question 1: Do students and employees react differently to highly automated
interviews?
Method
Sample
The authors consulted prior research by Blacksmith et al. (2016) and Langer et al.
(2017) to determine the required sample size. They found small to medium effect sizes for
applicant reactions comparing different forms of technology-enhanced interviews. Sample
size calculation with G*Power (Faul et al., 2009) revealed that, under the assumption of a small to medium effect (η²p between .04 and .06), a sample of N = 125 to 191 would be necessary for a power of 1-β = .80. Participants were recruited through social media and
direct contact. Data collection continued until the sample consisted of N = 154 participants.
Six participants were excluded from data analysis (e.g., not reading carefully, pausing the
experiment). The final sample consisted of N = 148 German participants (61% female). Of
these, 54% were employed in full-time jobs (these constituted the employees for the independent variable students vs. employees), 39% were students (82% of whom studied psychology), and the rest were either apprentices or high school students. The mean age was
28.80 years (SD = 7.18). Regarding interview experience, 36% of participants had experienced one to three job interviews, 28% four to five, and 36% more than six. Furthermore, 74% of participants had experienced at least one
videoconference interview before. Student participants received course credit and all
participants had the chance to win a gift certificate for online shopping.
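The reported power analysis can be reproduced with a short script. The sketch below is a minimal illustration (not the authors' actual G*Power session; the function name `required_n` is ours): it converts partial η² into Cohen's f² and searches for the smallest total N at which a one-degree-of-freedom effect in a 2×2 between-subjects ANOVA reaches the target power.

```python
from scipy.stats import f as f_dist, ncf

def required_n(eta_sq_p, alpha=0.05, power=0.80, df1=1, n_cells=4):
    """Smallest total N at which an ANOVA effect with df1 numerator df reaches power."""
    f2 = eta_sq_p / (1 - eta_sq_p)  # Cohen's f^2 from partial eta^2
    n = n_cells + df1 + 1           # start just above the minimum usable N
    while True:
        df2 = n - n_cells                          # error df in a 2x2 design
        crit = f_dist.ppf(1 - alpha, df1, df2)     # critical F under H0
        achieved = ncf.sf(crit, df1, df2, n * f2)  # power under H1 (noncentral F)
        if achieved >= power:
            return n
        n += 1

# Small-to-medium effects as assumed in the paper (eta^2_p = .04 to .06)
print(required_n(0.06), required_n(0.04))
```

For η²p between .04 and .06 this search lands in roughly the N = 125 to 191 range reported for the G*Power calculation.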
Design, procedure, and manipulation
In a 2×2 design (videoconference vs. highly automated interview; established vs.
innovative organization), participants visited an online survey platform and were randomly
assigned to one of the conditions. Afterwards, they were asked to imagine that they were applying for a job. Then, they received the respective description of the organization (Table 1). The descriptions differed in 14 text elements that were designed to reflect either an established or an innovative
organization. A pre-test with N = 59 participants (n_innovative = 32, n_established = 27) was conducted to verify that the two descriptions were perceived as similarly attractive. Participants in the pre-test were randomly assigned to one of the descriptions and responded to the same organizational attractiveness items as the participants in the main study. Results indicated that the organizations were perceived as similarly attractive (M_innovative = 3.46, SD_innovative = 0.61, M_established = 3.60, SD_established = 0.66), t(57) = 0.84, p = .40, d = 0.22.
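The pre-test statistics follow directly from the reported summary values. As a check, a pooled-variance t and Cohen's d can be recomputed from the means, SDs, and group sizes (the helper `pooled_t_and_d` is our own name, not from the paper):

```python
import math

def pooled_t_and_d(m1, sd1, n1, m2, sd2, n2):
    """Student's t (pooled variance) and Cohen's d from summary statistics."""
    df = n1 + n2 - 2
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)  # pooled SD
    t = (m2 - m1) / (sp * math.sqrt(1 / n1 + 1 / n2))
    d = (m2 - m1) / sp
    return t, df, d

# Values reported for the pre-test (innovative vs. established description)
t, df, d = pooled_t_and_d(3.46, 0.61, 32, 3.60, 0.66, 27)
print(round(t, 2), df, round(d, 2))  # close to the reported t(57) = 0.84, d = 0.22
```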
After reading the description in the main study, participants watched a video showing
either the videoconference or the highly automated interview. Both videos were taken from
Langer and colleagues (2019). They were edited in a way that participants watched a female
virtual/human interviewer interacting with a female applicant, who was only present through
voice. The virtual/human interviewer asked two questions. During the second question, the applicant became nervous (i.e., hesitated to answer), and the respective interviewer said that she had noticed the nervousness and tried to calm the interviewee (see also Langer et al., 2018, 2019). This way, participants should realize that all parts of the interview had been automated, as described by Parasuraman et al. (2000) and Langer et al. (2019). Afterwards,
the applicant recovered and responded to the interview question. Then, the video faded out
without any further information about the outcome of the interview.
Measures
Participants responded to the items on a scale from 1 (strongly disagree) to 5
(strongly agree).
Social presence was measured with five items from Walter and colleagues (2015). A
sample item is “The interviewer acted empathically.”
Privacy concerns were measured with seven items, two were taken from Langer and
colleagues (2018) and five were taken from Langer and colleagues (2017). A sample item is
“Situations like the one shown threaten applicants’ privacy.”
Consistency and job relatedness were measured with three items each from a
German version of the Selection Procedural Justice Scale (Bauer et al., 2001; Warszta, 2012).
A sample item for consistency is “This procedure is administered to all applicants in the same
way.” A sample item for job relatedness is “Doing well in this interview means that a person
will also be good in the job.”
Organizational attractiveness was measured with twelve items from Highhouse and
colleagues (2003) and three more from Warszta (2012). A sample item is “This organization
is attractive.”
Manipulation check measure
To check if participants perceived the organizational description as intended (i.e.,
established vs. innovative), the item “The organization described itself as an innovative
organization” was included.
Results
Manipulation check
Participants in the innovative organization condition were more likely to perceive the
organization as innovative, t(126.13) = 10.87, p < .01, d = 1.81.
Testing the hypotheses
Table 2 provides an overview of descriptive statistics and correlations. Table 3 shows
the results of the ANOVAs. Hypothesis 1 stated that participants would evaluate the highly automated interview as conveying less social presence, evoking more privacy concerns, and being more consistent. Results of the ANOVA indicated that participants perceived highly automated
interviews as slightly to moderately more consistent and as providing slightly to moderately
less social presence. There was no significant difference for privacy concerns. Hypothesis 1
was therefore partially supported.
Hypothesis 2 stated that participants would perceive selection procedures as more job
related and the organization as more attractive when the selection procedure matches the
perception of the organization. The ANOVAs revealed no interaction effects, suggesting that
a match of the selection procedure and the organizational image did not affect job relatedness
or organizational attractiveness.
Research Question 1 asked whether students and employees differ in their reactions to highly automated interviews. Therefore, we coded whether participants were working and included this as an independent variable in the aforementioned ANOVAs. There were no significant
main effects for the difference between students and employees. The only significant effect
was an interaction between the independent variables students versus employees and the
interview type regarding privacy concerns. Students reported higher privacy concerns for
highly automated interviews whereas employees reported higher privacy concerns for
videoconference interviews, F(1, 140) = 1.80, p < .05, η²p = 0.03. However, this result should
be interpreted cautiously because, when controlling for alpha-error accumulation (using a Bonferroni correction) for five dependent variables, the interaction effect did not remain significant (exact p = .028 compared to a corrected α = .01).
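The Bonferroni logic here is simple arithmetic and can be made explicit; the snippet below only restates the numbers given in the text:

```python
alpha = 0.05
k = 5                        # number of dependent variables tested
alpha_corrected = alpha / k  # Bonferroni-corrected per-test threshold
p_interaction = 0.028        # exact p reported for the interaction effect

# The interaction survives the conventional .05 threshold
# but not the corrected one (.05 / 5 = .01).
print(round(alpha_corrected, 4), p_interaction <= alpha_corrected)
```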
Additional results
The results indicated that organizational attraction was lower for highly automated
interviews. To further investigate this result, a mediation analysis was conducted using
PROCESS (Hayes, 2013). Following suggestions by Hayes (2013), social presence, privacy concerns, job relatedness, and consistency were included as mediators and organizational attractiveness as the outcome variable. Integrating the findings of Tables 4 and 5 indicated that
the negative indirect effect of the highly automated interview on organizational attractiveness
through social presence was significant. This means that participants perceived the organization using the highly automated interview as less attractive because the interview conveyed less social presence.
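The core of such a mediation test is an indirect effect (path a × path b) with a percentile bootstrap confidence interval, the approach PROCESS popularized. The sketch below illustrates the mechanics on simulated data (NOT the study's data; variable names and effect sizes are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: x = interview type (0/1), m = social presence,
# y = organizational attractiveness.
n = 300
x = rng.integers(0, 2, n).astype(float)
m = 0.8 * x + rng.normal(0, 1, n)             # path a: X -> M
y = 0.6 * m + 0.1 * x + rng.normal(0, 1, n)   # path b: M -> Y, plus direct effect c'

def indirect_effect(x, m, y):
    """a*b indirect effect estimated from two OLS regressions."""
    a = np.polyfit(x, m, 1)[0]                        # slope of M ~ X
    design = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(design, y, rcond=None)[0][2]  # coefficient of M in Y ~ X + M
    return a * b

# Percentile bootstrap CI for the indirect effect
boot = np.empty(2000)
for i in range(2000):
    idx = rng.integers(0, n, n)
    boot[i] = indirect_effect(x[idx], m[idx], y[idx])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(lo > 0)  # for this seed and effect size, the CI excludes zero
```

A CI that excludes zero is the criterion for a significant indirect effect; in the study the corresponding effect through social presence was negative, since the automated interview lowered perceived social presence.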
Discussion
The aims of the current study were to (a) further investigate applicant reactions to
highly automated interviews, (b) examine influences of the organizational context on
reactions to technology-enhanced interview approaches, and (c) explore potential differences
between students and employees regarding reactions to highly automated interviews. On the positive side, highly automated interviews were perceived as more consistent than videoconference interviews. However, participants also perceived highly automated
interviews to provide less social presence which negatively affected organizational
attractiveness. Moreover, this finding was independent from the organizational context,
implying that a match between the perceptions of an organization and its selection procedures did
not lead to better perceptions of innovative interview approaches. Lastly, the results
tentatively indicate that there might be small differences between students’ and employees’
privacy concerns within the different interview formats.
First, the findings support assumptions that people perceive machines to be more
consistent (Miller et al., 2018). Participants in the current study appeared to believe that
applicants in highly automated interviews might be treated more consistently than in
videoconference interviews. Therefore, variables related to consistency might be interesting
for future research on highly automated selection procedures. For instance, objectivity of the
outcomes could be another variable that applicants favor in highly automated selection, as the
human influence on selection decisions is minimized which could increase distributive justice
perceptions (i.e., feelings that outcomes are fair; Gilliland, 1993).
However, participants thought that the organization using the highly automated
interviews was less attractive because they perceived less social presence. This supports the
assumption that the highly automated interview offered less social bandwidth and
interactivity as defined by Potosky (2008), replicates findings by Langer et al. (2019), and is
in line with previous findings showing that higher levels of automation reduce interpersonal
justice perceptions (Langer et al., 2017). Consequently, evidence from different studies
indicates that technology-enhanced interviews are less accepted precisely because of the
feature that makes them more efficient and potentially more consistent: the minimization of
human influence. Hiring managers thus need to weigh efficiency against applicant reactions
when choosing between interview approaches.
Furthermore, students and employees differed regarding privacy concerns within the
interview formats. Privacy concerns rise when people are uncertain about what happens to
their data (Smith et al., 2011). Accordingly, students seemed to be more uncertain about the
use of their data in highly automated interviews, whereas employees seemed more uncertain
in videoconference interviews. Even though this finding should be interpreted cautiously
because of potential alpha-error inflation, it suggests that work experience may play a role in
privacy concerns across different versions of technology-enhanced interviews and calls for
future research. It is also important to mention that there were no differences between
students and employees for any other outcome variable, which might be good news for
interpreting previous applicant reactions studies (findings likely also generalize to more
mature samples) and for future research in this field (no exaggerated need to worry when
planning to use student samples).
Finally, this study investigated whether the match between perceptions of an organization
and perceptions of its selection procedures affects applicant reactions (cf. Gatewood et al.,
1993). It appears that participants disliked the highly automated interview regardless of
whether they imagined applying to an innovative or an established organization. Therefore,
the assumption that the organizational context is a higher-level contextual effect on applicant
reactions, one that might be more widely applicable than the task-level effects found by M.K.
Lee (2018) and Langer et al. (2019), must currently be dismissed.
Limitations
First, participants did not personally take part in a videoconference or an automated
interview. Therefore, findings could differ for live interviews (but see Langer et al., 2019).
Nevertheless, watching a video should be more immersive than merely imagining being in an
interview situation, which is another common approach in studies using vignettes
(Atzmüller and Steiner, 2010). Second, it is not yet clear what role the virtual character
played in the results of the current study. Following assumptions from human-computer
interaction research (e.g., K.M. Lee and Nass, 2003), omitting the virtual character would
have decreased the social presence of the highly automated interaction even further. This
calls for further research on various forms of highly automated interviews. Third,
participants might have perceived videoconference interviews as “innovative” selection tools
which could explain the non-significant findings regarding the interaction between the
organizational description and the interview approach. However, if the match between the
organization and its selection procedures had led to better applicant reactions, perceiving
both interview approaches as innovative should have produced a main effect for established
vs. innovative organizations such that innovative organizations were perceived more
favorably; this was not the case.
Main practical implications
Organizations seem to be well-advised to check their existing selection approaches
regarding applicant reactions. Even if automated management systems enhance efficiency,
organizations should pay attention to possible detrimental effects on their applicant pool.
This is especially true in times of a tight labor market, where every applicant is a potential
competitive advantage: applicants might withdraw their application if they are dissatisfied
with the way the interview process is handled, and they might advise friends against applying
to an organization that uses automated interviews (Langer et al., 2019; Uggerslev et al.,
2012). Considering the results of the current study, not even innovative organizations should
rely on their image to buffer negative reactions to technology-enhanced interview
approaches. Finally, organizations should consider which interview tools they use for a given
applicant pool, as the current results imply that applicants with different levels of work
experience may react differently to different forms of technology-enhanced interviews.
Future research
It is still unclear how applicants behave when they are personally confronted with
highly automated interviews. For instance, applicants might be less motivated, use less
impression management, or provide qualitatively different answers, which
might affect the validity of highly automated interviews (Blacksmith et al., 2016).
Additionally, research regarding contextual influences on reactions to automated tools is
growing and there are still many open questions. For instance, the European Union
introduced the General Data Protection Regulation (GDPR) in 2018, and other countries are
working on comparable regulations regarding data security, gathering, and evaluation. It is
unclear how such regulations affect people’s trust in organizations using automated tools and
algorithms. Potentially, privacy concerns regarding automated tools are affected by such
regulations and would therefore differ in, for instance, the US or China. Similarly, it is
unclear how the use of
highly automated tools will affect management practice, law-making and societies as a
whole. Even within the rather narrow field of technology-enhanced interviews many legal,
moral, and ethical questions arise (cf. Zerilli et al., 2018). For instance, how can applicants be
sure that only valid and unbiased information is considered in automatic processes? How can
hiring managers explain their decisions to applicants who were rejected by automated tools?
How can people be enabled to understand automated tools?
Conclusion
As practice continues to develop innovative selection procedures, it is crucial that this
drive does not outpace scientific understanding of these methods. This study showed that
potential applicants perceive highly automated interviews, and the organizations using them,
rather negatively, and it raised questions about contextual and individual influences on
applicant reactions. Yet it is merely a small step in catching up with the
ongoing automation of management processes. Hopefully, this study encourages more
scholars to research the emerging intersection between management and computer science.
References
Atzmüller, C. and Steiner, P.M. (2010), “Experimental vignette studies in survey research”,
Methodology, Vol. 6, pp. 128–138. doi:10.1027/1614-2241/a000014
Bauer, T.N., Truxillo, D.M., Sanchez, R.J., Craig, J.M., Ferrara, P., and Campion, M.A.
(2001), “Applicant reactions to selection: Development of the Selection Procedural
Justice Scale (SPJS)”, Personnel Psychology, Vol. 54, pp. 387–419.
doi:10.1111/j.1744-6570.2001.tb00097.x
Bauer, T.N., Truxillo, D.M., Tucker, J.S., Weathers, V., Bertolino, M., Erdogan, B., and
Campion, M.A. (2006), “Selection in the information age: The impact of privacy
concerns and computer experience on applicant reactions”, Journal of Management,
Vol. 32, pp. 601–621. doi:10.1177/0149206306289829
Blacksmith, N., Willford, J.C., and Behrend, T.S. (2016), “Technology in the employment
interview: A meta-analysis”, Personnel Assessment and Decisions, Vol. 2, 2.
doi:10.25035/pad.2016.002
Chapman, D.S., Uggerslev, K.L., Carroll, S.A., Piasentin, K.A., and Jones, D.A. (2005),
“Applicant attraction to organizations and job choice: A meta-analytic review of the
correlates of recruiting outcomes”, Journal of Applied Psychology, Vol. 90, pp. 928–
944. doi:10.1037/0021-9010.90.5.928
Chapman, D.S., Uggerslev, K.L., and Webster, J. (2003), “Applicant reactions to face-to-face
and technology-mediated interviews: A field investigation”, Journal of Applied
Psychology, Vol. 88, pp. 944–953. doi:10.1037/0021-9010.88.5.944
Chapman, D.S. and Webster, J. (2003), “The use of technologies in the recruiting, screening,
and selection processes for job candidates”, International Journal of Selection and
Assessment, Vol. 11, pp. 113–120. doi:10.1111/1468-2389.00234
Frauendorfer, D., Schmid Mast, M., Nguyen, L., and Gatica-Perez, D. (2014), “Nonverbal
social sensing in action: Unobtrusive recording and extracting of nonverbal behavior
in social interactions illustrated with a research example”, Journal of Nonverbal
Behavior, Vol. 38, pp. 231–245. doi:10.1007/s10919-014-0173-5
Gatewood, R.D., Gowan, M.A., and Lautenschlager, G.J. (1993), “Corporate image,
recruitment, and initial job choice decisions”, Academy of Management Journal, Vol.
36, pp. 414–427. doi:10.2307/256530
Gilliland, S.W. (1993) “The perceived fairness of selection systems: An organizational justice
perspective”, Academy of Management Review, Vol. 18, pp. 694–734.
doi:10.2307/258595
Hayes, A.F. (2013), Introduction to mediation, moderation, and conditional process analysis:
A regression-based approach, Guilford Press, New York, NY.
Highhouse, S., Lievens, F., and Sinar, E.F. (2003), “Measuring attraction to organizations”,
Educational and Psychological Measurement, Vol. 63, pp. 986–1001.
doi:10.1177/0013164403258403
HireVue (2018), HireVue OnDemand, available at:
https://www.hirevue.com/products/video-interviewing/ondemand (accessed 8 July 2018).
Kätsyri, J., Förger, K., Mäkäräinen, M., and Takala, T. (2015), “A review of empirical
evidence on different uncanny valley hypotheses: Support for perceptual mismatch as
one road to the valley of eeriness”, Frontiers in Psychology, Vol. 6, 390.
doi:10.3389/fpsyg.2015.00390
Langer, M., König, C.J., and Fitili, A. (2018), “Information as a double-edged sword: The
role of computer experience and information on applicant reactions towards novel
technologies for personnel selection”, Computers in Human Behavior, Vol. 81, pp.
19–30. doi:10.1016/j.chb.2017.11.036
Langer, M., König, C.J., and Krause, K. (2017), “Examining digital interviews for personnel
selection: Applicant reactions and interviewer ratings”, International Journal of
Selection and Assessment, Vol. 25, pp. 371–382. doi:10.1111/ijsa.12191
Langer, M., König, C.J., and Papathanasiou, M. (2019), “Highly automated job interviews:
Acceptance under the influence of stakes”, International Journal of Selection and
Assessment, Vol. 27, pp. 217–234. doi:10.1111/ijsa.12246
Lee, K.M., and Nass, C. (2003), “Designing social presence of social actors in human
computer interaction”, in: Proceedings of the CHI 2003 Conference on Human
Factors in Computing Systems, Fort Lauderdale, FL, pp. 289–296.
doi:10.1145/642611.642662
Lee, M.K. (2018), “Understanding perception of algorithmic decisions: Fairness, trust, and
emotion in response to algorithmic management”, Big Data & Society, Vol. 5,
205395171875668. doi:10.1177/2053951718756684
Lievens, F. and Highhouse, S. (2003), “The relation of instrumental and symbolic attributes
to a company’s attractiveness as an employer”, Personnel Psychology, Vol. 56, pp.
75–102. doi:10.1111/j.1744-6570.2003.tb00144.x
Miller, F.A., Katz, J.H., and Gans, R. (2018), “The OD imperative to add inclusion to the
algorithms of artificial intelligence”, OD Practitioner, Vol. 50 (1), pp. 6–12.
Mori, M., MacDorman, K., and Kageki, N. (2012), “The uncanny valley”, IEEE Robotics &
Automation Magazine, Vol. 19, pp. 98–100. doi:10.1109/MRA.2012.2192811
Naim, I., Tanveer, M.I., Gildea, D., and Hoque, M.E. (2015), “Automated analysis and
prediction of job interview performance: The role of what you say and how you say
it”, in: 11th IEEE International Conference and Workshops on Automatic Face and
Gesture Recognition. Ljubljana, Slovenia, pp. 1–14. doi:10.1109/fg.2015.7163127
Parasuraman, R., Sheridan, T.B., and Wickens, C.D. (2000), “A model for types and levels
of human interaction with automation”, IEEE Transactions on Systems, Man, and
Cybernetics - Part A: Systems and Humans, Vol. 30, pp. 286–297.
doi:10.1109/3468.844354
Pingitore, R., Dugoni, B.L., Tindale, R.S., and Spring, B. (1994), “Bias against overweight
job applicants in a simulated employment interview”, Journal of Applied Psychology,
Vol. 79, pp. 909–917. doi:10.1037/0021-9010.79.6.909
Potosky, D., (2008), “A conceptual framework for the role of the administration medium in
the personnel assessment process”, Academy of Management Review, Vol. 33, pp.
629–648. doi:10.5465/amr.2008.32465704
Precire (2018), Precire Technologies, available at https://www.precire.com/de/start/ (accessed
8 June 2018).
Sears, G., Zhang, H., Wiesner, W.D., Hackett, R.W., and Yuan, Y. (2013), “A comparative
assessment of videoconference and face-to-face employment interviews”,
Management Decision, Vol. 51, pp. 1733–1752. doi:10.1108/MD-09-2012-0642
Slaughter, J.E. and Greguras, G.J. (2009), “Initial attraction to organizations: The influence
of trait inferences”, International Journal of Selection and Assessment, Vol. 17, pp.
1–18. doi:10.1111/j.1468-2389.2009.00447.x
Slaughter, J.E., Zickar, M.J., Highhouse, S., and Mohr, D.C. (2004), “Personality trait
inferences about organizations: Development of a measure and assessment of
construct validity”, Journal of Applied Psychology, Vol. 89, pp. 85–103.
doi:10.1037/0021-9010.89.1.85
Smith, H.J., Dinev, T., and Xu, H. (2011), “Information privacy research: An
interdisciplinary review”, Management Information Systems Quarterly, Vol 35, pp.
989–1015. doi:10.2307/41409970
Torres, E.N. and Mejia, C. (2017), “Asynchronous video interviews in the hospitality
industry: Considerations for virtual employee selection”, International Journal of
Hospitality Management, Vol. 61, pp. 4–13. doi:10.1016/j.ijhm.2016.10.012
Uggerslev, K.L., Fassina, N.E., and Kraichy, D. (2012), “Recruiting through the stages: A
meta-analytic test of predictors of applicant attraction at different stages of the
recruiting process”, Personnel Psychology, Vol. 65, pp. 597–660. doi:10.1111/j.1744-
6570.2012.01254.x
Walter, N., Ortbach, K., and Niehaves, B. (2015), “Designing electronic feedback: Analyzing
the effects of social presence on perceived feedback usefulness”, International
Journal of Human-Computer Studies, Vol. 76, pp. 1–11.
doi:10.1016/j.ijhcs.2014.12.001
Warszta, T. (2012), “Application of Gilliland’s model of applicants’ reactions to the field of
web-based selection”, Unpublished dissertation, Christian-Albrechts Universität Kiel,
Germany.
Zerilli, J., Knott, A., Maclaurin, J., and Gavaghan, C. (2018), “Transparency in algorithmic
and human decision-making: Is there a double standard?”, Philosophy & Technology,
advance online publication. doi:10.1007/s13347-018-0330-6
Table 1.
Information Presented to the Participants in the Different Organizational Descriptions
Fuchs&Schulz Automotive is one of the oldest (an aspiring) and established (innovative)
automotive suppliers in Germany. The established (progressive) organization, with subsidiaries
in China, Japan, the US, Switzerland, and the Netherlands, has expanded constantly (rapidly)
since its foundation in 1954 (2001). As a family business in the third generation (former
start-up) with more than 1,300 employees and its headquarters in Frankfurt, F&S reached a
revenue of 125.2 million Euro in 2016.
F&S offers a classical (innovative) and established (future-oriented) supplier concept and
substantial knowledge in the areas of logistics, sustainability and service. This way F&S
consolidated its position on the market as an internationally successful (a global player) and
experienced (a driver of innovation) company.
People are the focus of F&S: as customers, partners, and employees. Trust and security
(innovation and creativity) form the groundwork of the traditional (future-oriented)
organizational culture. This crystallizes in stable, long-term (dynamic and fruitful) relations
with the customers.
Since its foundation, F&S has lived tradition (creativity). In this way, we build a bridge
between reliable consultancy and proximity to customers. Through extensive (interdisciplinary)
project experience, we ensure that we can provide you with consultants who know and
understand the target market exactly (penetrate the target market through constant progress).
Note. Information translated from German. Italic text pieces reflect the manipulation for the
traditional organization (not in italics in the original material for the participants). Text pieces in
brackets reflect the manipulation for the innovative organization.
Table 2.
Correlations and Cronbach’s Alpha for the Study Variables.
Scale                               1.      2.      3.      4.      5.      6.      7.      8.
1. Social Presence                  .91
2. Privacy Concerns                -.19*    .78
3. Consistency                     -.13     .04     .88
4. Job Relatedness                  .36**  -.12     .00     .89
5. Organizational Attractiveness    .60**  -.20*   -.09    -.33**   .95
6. Students vs. Employees           .00     .02    -.04    -.13    -.06     -
7. Interview                       -.23**   .02    -.20*   -.10    -.19*    .12     -
8. Organization                    -.09     .04    -.04     .08    -.11    -.13    -.05     -
Note. Coding of students vs. employees: -1 = students, 1 = employees. Coding of interview: -1 =
videoconference, 1 = highly automated. Coding of organization: -1 = established organization, 1 =
innovative organization. N = 148. Numbers along the diagonal represent the Cronbach’s alpha of
the scales.
*p < .05, **p < .01.
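For readers unfamiliar with the reliabilities on the diagonal of Table 2, the following sketch shows how Cronbach's alpha is computed from a respondents-by-items score matrix. This is an illustration with simulated data only, not the study's scales or analysis script; all variable names are ours.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Simulated 5-item scale driven by one common factor (illustrative only)
rng = np.random.default_rng(0)
factor = rng.normal(size=(200, 1))
items = factor + 0.8 * rng.normal(size=(200, 5))
print(round(cronbach_alpha(items), 2))
```

With these simulated loadings, alpha lands near the values reported in Table 2 (roughly .85 to .90); real scales require checking unidimensionality before interpreting alpha.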
Table 3.
Means, Standard Deviations, and ANOVA Results (including partial η²) for the Dependent Variables.
                             VC-ES         VC-IN         AI-ES         AI-IN         VC vs. AI        ES vs. IN        Interaction
Variable                     M (SD)        M (SD)        M (SD)        M (SD)        F(1, 144)  η²p   F(1, 144)  η²p   F(1, 144)  η²p
Social Presence              3.07 (0.72)   2.99 (0.88)   2.77 (0.92)   2.48 (0.93)   7.96**     .05   1.15       .01   0.56       .00
Privacy Concerns             3.12 (0.51)   3.09 (0.63)   3.07 (0.71)   3.19 (0.57)   0.03       .00   0.21       .00   0.49       .00
Consistency                  3.08 (0.73)   3.12 (0.87)   3.51 (0.94)   3.37 (0.67)   6.32*      .04   0.11       .00   0.44       .00
Job Relatedness              2.49 (0.91)   2.43 (0.79)   2.15 (0.76)   2.45 (0.78)   1.38       .01   0.82       .00   1.81       .01
Organizational Attraction    3.32 (0.53)   3.22 (0.66)   3.11 (0.74)   2.87 (0.79)   5.91*      .04   2.38       .02   0.40       .00
Note. VC = videoconference interviews condition, AI = highly automated interviews condition, ES = established organization condition,
IN = innovative organization condition. nVC-ES = 34, nVC-IN = 41, nAI-ES = 37, nAI-IN = 36.
*p < .05, **p < .01.
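A 2 x 2 between-subjects ANOVA of the kind reported in Table 3 (interview format x organization type, with partial η²) can be sketched via effect-coded regression. This is a minimal illustration with simulated data, not the study's data or analysis code; the condition sizes and effect magnitudes below are invented.

```python
import numpy as np

def two_way_anova(y, a, b):
    """2x2 between-subjects ANOVA via effect-coded regression (a, b in {-1, 1}).
    Returns {effect: (F, partial eta squared)}, with df1 = 1 and df2 = n - 4."""
    n = y.size
    X_full = np.column_stack([np.ones(n), a, b, a * b])
    beta = np.linalg.lstsq(X_full, y, rcond=None)[0]
    ss_err_full = np.sum((y - X_full @ beta) ** 2)
    df_err = n - X_full.shape[1]
    results = {}
    for name, drop in (("A", 1), ("B", 2), ("AxB", 3)):
        # Refit without the effect's column; the rise in error SS is its effect SS
        X_red = np.delete(X_full, drop, axis=1)
        beta_red = np.linalg.lstsq(X_red, y, rcond=None)[0]
        ss_err_red = np.sum((y - X_red @ beta_red) ** 2)
        ss_effect = ss_err_red - ss_err_full
        F = ss_effect / (ss_err_full / df_err)
        results[name] = (F, ss_effect / (ss_effect + ss_err_full))
    return results

rng = np.random.default_rng(1)
a = rng.choice([-1.0, 1.0], size=148)   # interview: -1 = VC, 1 = AI
b = rng.choice([-1.0, 1.0], size=148)   # organization: -1 = ES, 1 = IN
y = 3.0 - 0.15 * a + rng.normal(scale=0.8, size=148)  # outcome, e.g. social presence
res = two_way_anova(y, a, b)
```

Because the design may be unbalanced (as in the study, n = 34 to 41 per cell), the model-comparison approach above corresponds to Type III sums of squares.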
Table 4.
Regression Results for the Mediation via the Hypothesized Mediators between the Videoconference vs. Highly Automated Interview Condition and
Organizational Attractiveness
Model                                                     R²     Coefficient   SE     p      95% Confidence Interval
Single models
  VC vs. AI → Social Presence                             .05    -.20          0.07   <.01   [-.34, -.06]
  VC vs. AI → Privacy Concerns                            .00     .01          0.05    .85   [-.09, .11]
  VC vs. AI → Consistency                                 .04     .16          0.07   <.05   [.04, .30]
  VC vs. AI → Job Relatedness                             .01    -.08          0.07    .24   [-.21, .05]
  VC vs. AI → Organizational Attractiveness               .04    -.14          0.06   <.05   [-.25, -.02]
Complete model                                            .39    -             -      <.01   -
  Social Presence → Organizational Attractiveness                 .42          0.06   <.01   [.30, .53]
  Privacy Concerns → Organizational Attractiveness               -.10          0.08    .19   [-.26, .05]
  Consistency → Organizational Attractiveness                    -.01          0.06    .89   [-.12, .11]
  Job Relatedness → Organizational Attractiveness                 .11          0.60    .09   [-.02, .23]
  VC vs. AI → Organizational Attractiveness                      -.04          0.05    .39   [-.14, .05]
Note. AI = highly automated condition, VC = videoconference condition. Coding of the variable VC vs. AI: -1 = videoconference interview
condition, 1 = highly automated interview condition. The 95% confidence interval for the effects is obtained by the bias-corrected bootstrap with
10,000 resamples.
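The bootstrapped indirect effect behind Tables 4 and 5 can be illustrated with a brief sketch. This is not the study's analysis script (the study followed Hayes, 2013, with a bias-corrected bootstrap); it uses simulated data, a simpler percentile bootstrap, and fewer resamples than the paper's 10,000, purely to show how a confidence interval for an indirect effect a x b is obtained.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated stand-ins for the study variables (NOT the real data):
# x: interview condition (-1 = videoconference, 1 = highly automated)
# m: a mediator (e.g., social presence), y: organizational attractiveness
n = 148
x = rng.choice([-1.0, 1.0], size=n)
m = -0.20 * x + rng.normal(size=n)              # a-path
y = 0.42 * m - 0.04 * x + rng.normal(size=n)    # b-path plus direct effect

def ols_slope(pred, crit):
    """Slope of crit regressed on pred (with intercept), via least squares."""
    X = np.column_stack([np.ones_like(pred), pred])
    return np.linalg.lstsq(X, crit, rcond=None)[0][1]

def indirect_effect(x, m, y):
    """Simple-mediation indirect effect a*b, with b taken from y ~ x + m."""
    a = ols_slope(x, m)
    X = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]
    return a * b

# Percentile bootstrap of the indirect effect (2,000 resamples for speed)
boot = np.empty(2_000)
for i in range(boot.size):
    idx = rng.integers(0, n, size=n)
    boot[i] = indirect_effect(x[idx], m[idx], y[idx])

ci = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect_effect(x, m, y):.3f}, "
      f"95% CI = [{ci[0]:.3f}, {ci[1]:.3f}]")
```

An indirect effect is considered significant when its bootstrap confidence interval excludes zero, which is the criterion applied to the social presence path in Table 5.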
Table 5.
Results for the Indirect Effects of the Videoconference vs. Highly Automated Interview Condition through the Hypothesized Mediators on
Organizational Attractiveness
Model                                                              IEmed   SEBoot   95% Confidence Interval
Complete indirect effect                                           -.13    0.05     [-.24, -.03]
VC vs. AI → Social Presence → Organizational Attractiveness        -.12    0.05     [-.22, -.04]
VC vs. AI → Privacy Concerns → Organizational Attractiveness        .00    0.01     [-.03, .01]
VC vs. AI → Consistency → Organizational Attractiveness             .00    0.02     [-.03, .03]
VC vs. AI → Job Relatedness → Organizational Attractiveness        -.01    0.01     [-.05, .01]
Note. VC = videoconference interview condition, AI = highly automated interview condition. Coding of the variable VC vs. AI: -1 = video
conference condition, 1 = highly automated condition. The 95% confidence interval for the effects is obtained by the bias-corrected bootstrap
with 10,000 resamples. IEmed = completely standardized indirect effect of the mediation. SEBoot = standard error of the bootstrapped effect
sizes.