Deontic Justice and Organizational Neuroscience

Russell S. Cropanzano (Leeds School of Business, University of Colorado at Boulder, Boulder, CO 80309, USA; russell.cropanzano@colorado.edu) · Sebastiano Massaro, corresponding author (Warwick Business School - Behavioural Science, University of Warwick, Coventry CV4 7AL, UK; sebastiano.massaro@wbs.ac.uk) · William J. Becker (Department of Management, Texas Christian University, TCU Box 298530, Ft. Worth, TX 76129, USA; w.becker@tcu.edu)

Received: 25 June 2014 / Accepted: 28 January 2016
© Springer Science+Business Media Dordrecht 2016
Journal of Business Ethics, DOI 10.1007/s10551-016-3056-3
Abstract According to deontic justice theory, individuals
often feel principled moral obligations to uphold norms of
justice. That is, standards of justice can be valued for their
own sake, even apart from serving self-interested goals.
While a growing body of evidence in business ethics
supports the notion of deontic justice, skepticism remains.
This hesitation results, at least in part, from the absence of
a coherent framework for explaining how individuals
produce and experience deontic justice. To address this
need, we argue that a compelling, yet still missing, step is
to gain further understanding into the underlying neural
and psychological mechanisms of deontic justice. Here, we
advance a theoretical model that disentangles three key
processes of deontic justice: The use of justice rules to
assess events, cognitive empathy, and affective empathy.
We review the neural systems supporting these processes and discuss broader implications of our model for business ethics scholarship.
Keywords Affect and cognition · Deonance · Deontic justice · Empathy · Organizational justice · Workplace fairness · Organizational neuroscience
Abbreviations
aCC Anterior cingulate cortex
DMN Default Mode Network
EEG Electroencephalography
fMRI Functional Magnetic Resonance Imaging
IFG Inferior frontal gyrus
OFC Orbitofrontal cortex
pCC Posterior cingulate cortex
PET Positron Emission Tomography
PFC Prefrontal cortex
PMC Posteromedial cortex
qEEG Quantitative electroencephalography
rTPJ Right temporoparietal junction
TMS Transcranial Magnetic Stimulation
TPJ Temporoparietal junction
vmPFC Ventromedial prefrontal cortex
Organizational justice is important to workers. When they
feel fairly treated, employees tend to report less stress and
better health (Cropanzano and Wright 2011), as well as
more positive attitudes toward their jobs (Cohen-Charash
and Spector 2001). Employers also benefit through higher
worker job performance (Colquitt et al. 2001), more
organizational citizenship behaviors (Fassina et al. 2008),
and lower turnover intentions (Aryee and Chay 2001). In
this paper, we seek to describe the underlying psycholog-
ical processes—and supporting neural systems—that
employees use when evaluating whether an event was fair
or unfair. In particular, we focus on deontic justice, the
view that justice is of value for its own sake. Toward this end,
we present and discuss a theoretical model that disentangles
the psychological mechanisms involved in the formulation of
these fairness judgments and review these processes through
an analysis of relevant neural correlates. Overall, we argue
that individuals evaluate fairness by applying normative
criteria called justice rules (Cropanzano et al. 2015; Scott
et al. 2008). As we shall see, these fairness judgments are
moderated by both affective and cognitive processes. How-
ever, people are most likely to make the effort to apply justice rules when they experience both cognitive empathy and
affective empathy toward another person.
To refine and extend our understanding of these complex
mechanisms, we will review the principal brain regions
involved in both justice rules and in these two types of
empathic processes. In order to accomplish these objectives,
we first examine the available research on organizational justice, paying particular attention to the various reasons why
workers care about fairness. We then focus in more detail on
deontic justice, which is the principal concern of this paper.
Afterward, we discuss the three main components of our
model—application of justice rules, cognitive empathy, and
affective empathy. Throughout, we consider the implications
of our model for research and practice.
Organizational Justice
While research strongly suggests that justice matters to
employees, scholars have proposed multiple explanations
of why this is so. There has been some degree of conver-
gence on the idea that justice matters to employees for
multiple reasons (Cropanzano et al. 2001); yet, tradition-
ally, three main theoretical frameworks have been pro-
posed: The instrumental approach, the relational approach,
and the deontic approach (e.g., Colquitt and Greenberg
2001; Folger and Salvador 2008).
Overview of the Three Motives for Justice
The instrumental model, which is historically the oldest,
maintains that individuals prefer justice because it provides
them long-term control over valued outcomes (for discus-
sions, see Greenberg 1990; Tyler 1997, 2006). As one
might expect, empirical evidence supports this view:
Employees are egocentrically biased to view decisions that
favor them as more fair (Cropanzano and Moliner 2013)
and their views on justice are positively related to outcome
favorability (e.g., Ambrose et al. 1991).
A subsequent view, the relational framework, refers to a
set of theories—the group-value model, the relational
model, and the group engagement model—that focus on
and emphasize the relationship between the individual and
his or her workgroup (Blader and Tyler 2015). Together,
these three models posit that justice, and especially pro-
cedural justice, is central because fair treatment signals that
an individual is respected and regarded within a significant
social group that he or she values (Lind and Tyler 1988).
This concept is consistent with, and supported by, empir-
ical tests within organizational settings (e.g., Tyler and
Blader 2000; Tyler et al. 1997).
While research strongly supports the existence of
instrumental and relational concerns, these do not appear to
be the only reasons why workers care about organizational
justice. A third approach, the deontic model of justice
(Folger and Glerum 2015; Folger and Salvador 2008),
argues that employees often maintain ethical standards or
moral principles (Blader and Tyler 2001), sometimes called
‘justice rules,’ that guide the moral treatment of others. As
a result of these normative criteria, they adopt a moral duty (deon = duty) to uphold their principles (Folger 2001,
2011; Hannah et al. 2014). In this way, justice is valued for
its own sake, not simply because of the personal benefits
that it may bring to a person (Folger et al. 2005; for
additional empirical evidence, see O’Reilly et al. 2016;
Skarlicki and Rupp 2010; Skarlicki et al. 2008).
While deonance does not seem to be the only justice
motive (cf. Folger et al. 2013), it does appear to be a
notable one. For example, reactions to justice are partially
influenced by personality dispositions indicative of trait
morality (Colquitt et al. 2006). Likewise, research on 'altruistic punishment' (Fehr and Gächter 2002) shows that individuals will sacrifice economic benefits in order to punish someone who violates social norms (Fehr and Fischbacher 2004; Fehr and Gächter 2002; Fehr et al. 2002).
Interestingly, the same act, even if it is equally hurtful, will
be punished more harshly if it violates a group norm and
less harshly if it does not (DaGloria and DeRidder 1977,
1979; DeRidder 1985).
Research among management scholars is consistent with
this notion (cf. O’Reilly and Aquino 2011). Turillo et al.
(2002) found that individuals will punish an unjust cow-
orker, even when the victim is a stranger (for similar
findings, see the studies reported by O’Reilly et al. 2016).
Likewise, managers who are high in moral identity are
more likely to punish transgressors than managers low on
this measure (Skarlicki and Rupp 2010), though this may
depend somewhat on the dimension of moral identity under
examination.
Similarly, when people feel that they have been harmed
by an immoral action, evidence suggests that they prefer a
resolution outcome that validates their normative beliefs
(Reb et al. 2006; for related findings, see Skarlicki et al.
2008). Evidence of this kind suggests that deontic justice
provides a critical account of why employees care about
justice: They do so, in part, because they possess moral
standards and a duty to uphold them (Folger and Glerum
2015; Folger and Salvador 2008; Hannah et al. 2014).
Despite evidence that organizational justice can be
motivated by something other than self-interest, this notion
has been met with some skepticism (e.g., Greenberg 2001;
Colquitt and Greenberg 2001). In a sort of conceptual ‘path
dependence,’ these concerns appear to be rooted in the
history of ethics scholarship. Hatfield and colleagues
(Hatfield et al. 1978, pp. 128–129) maintained that ‘the
majority of scientists […] interpret apparent altruism in
cost-benefit terms, assuming that individuals […] perform
those acts that are rewarded […] and […] avoid those acts
that are not. Either self-congratulation or external reward,
then, must support apparently altruistic behavior.’ Like-
wise, Gillespie and Greenberg (2005, p. 205) assert that
‘the only ultimate goal(s) of individuals [is/]are self-di-
rected’ (italics added). On this view, the concern is not
limited to justice. Rather, people appear to be incapable of
motivated behavior that is not self-interested (for a review
and critique, see Cropanzano et al. 2007).
Deontic Justice and the Question of ‘How?’
A close look at the literature suggests that scholars have yet
to fully investigate and explain the intrapersonal mecha-
nisms responsible for deontic justice. Fortunately, scientists
from several disciplines have begun to focus attention on
the underpinnings of human justice. In particular, cognitive
neuroscience has recently offered remarkable insights on
the neural basis of moral behaviors (e.g., Greene et al.
2001; Moll et al. 2005); at the same time, business ethics
has begun to explore the value of neuroscience methods
and findings to advance theory on organizational justice
(Beugré 2009; Dulebohn et al. 2009; Massaro and Becker
2015; Robertson et al. 2007; Salvador and Folger 2009).
Despite such vibrant and compelling interest, relatively
little is clearly known regarding the processes by which
one’s brain can be ‘recruited’ for another person’s justice.
We call this the ‘how?’ question—How can deontic justice
be realized in an individual’s brain? And what are the
related implications for business ethics?
Deontic Justice and Organizational Neuroscience:
An Overview
We shed light on these matters by integrating a set of psy-
chological processes, and supporting neural systems, into a
unique theoretical framework that aims to advance our current
understanding of deontic justice. In other words, we aim to
elucidate core mechanisms by which a worker comes to care
about the moral principles that have been applied (or violated)
for another individual through the lens of organizational
neuroscience (Becker et al. 2011). Thus, our contribution aims
to go beyond an ordinary exploration of what we might learn
individually from the dedicated organizational, neuroscience,
or psychology literatures. Rather, we aim to cross-fertilize
among these disciplines in order to create an integrated model
of deontic justice for business ethics.
In this paper, we explicate and defend a single proposi-
tion: Deontic justice, and in particular the ability to take into
account the ethical quality with which others are treated, is
grounded in a set of three largely integrated processes: The
use of justice rules to assess events, cognitive empathy, and
affective empathy. We acknowledge that this is a bold
proposition, not yet commonly held in the business disci-
plines. For this reason, we devote considerable attention to
supporting our foremost argument—that workers take jus-
tice seriously even when they are not selfishly benefited—by
introducing neuroscience evidence. As we shall see, our
brain is a complex organ consisting of many structures
working synchronously and simultaneously to produce
thoughts, feelings, decisions, and actions, including the
ability to morally relate to others. As such, there is not a single brain center for justice; rather, justice is a whole-brain
affair that relies on the integration of cognitive and affective
neural systems and psychological processes (Casebeer
2003; Tancredi 2005; Yoder and Decety 2014).
Overview of the Present Model
Our primary aim is to provide a much needed, but largely
neglected, theoretical rationale for how a set of synergistic
processes provide an explanatory account for deontic jus-
tice. Our reasoning is substantiated by merging evidence
from several literatures. The resulting theoretical model, depicted in Fig. 1, underscores the conceptual challenges.
This figure is only intended to provide a cursory sum-
mary and guide to our model of deontic justice. We have
not yet provided an account of specific neural systems
supporting our model. Rather, here we describe how indi-
viduals make justice judgments (Cropanzano et al. 2015).
Notice that people are often motivated to make sense out of
a salient but harmful event. Moreover, the degree of
empathy frequently impacts how these events are evaluated
(Scott et al. 2008). As mentioned above, a key insight of
deontic justice theory is that an unfortunate or harmful
event is evaluated with respect to some ‘normative criteria’
(Cugueró-Escofet and Fortin 2014, p. 2) or 'justice rules'
(Hollensbe et al. 2008, p. 1099) or ‘moral intuitions’
(Greene and Haidt 2002, p. 517). When a transgressor’s
behavior toward another violates these rules, the observer
or witness believes that the victim has been treated unfairly
(Cropanzano et al. 2015; Rupp and Paddock 2010). As we
shall see, justice rules are emotionally weighted and they
can be distinguished from simple social conventions, which
are instead situational and somewhat arbitrary (e.g., Sme-
tana et al. 1993).
Yet, a key challenge for deontic justice theory lies in
explicating the extent to which a justice rule will be applied
and the extent to which it will not. Not all events in business organizations are evaluated with respect to moral principles (Folger and Cropanzano 1998, 2001). For
example, it is not uncommon for people to ignore
mistreatment received by others, especially when individ-
uals are part of different social groups (Greene 2013). This
allows for callous disregard in relation to ‘out group’
individuals (Hein et al. 2010). Thus, if businesspeople
could increase their circle of regard, more people would
receive fair treatment (cf., Clayton and Opotow 2003).
By introducing neural systems associated with these
psychological processes, we advance knowledge of how a
worker’s disinterest in the needs of others may result from
what we could call a ‘breakdown’ in his or her empathic
system (i.e., the capacity to understand what another person
is experiencing from within the other person’s frame of
reference) (Greene 2013). Supported by this and other
evidence, we argue that justice rules in the workplace are
more likely to be applied when the third-party observer or
witness has a sense of empathy for the victim (e.g., Blader
and Tyler 2001; Patient and Skarlicki 2005). In particular,
we suggest that two interrelated types of empathy are rel-
evant: Cognitive empathy and affective empathy.
Cognitive empathy involves knowing the contents of
other people’s feelings through deliberate thought. That is,
one must understand what others are thinking and feeling in
order to know when they are distressed (for an overview on
empathy, see e.g., Walter 2012). However, ‘knowing’ per se
is insufficient. When a worker observes a coworker in dis-
tress, he or she may experience affective empathy without
any deliberate thought or intention. Affective empathy
involves automatic sharing in the emotional experiences of
others (e.g., Walter 2012). Cognitive and affective empathy
reinforce one another (Zaki and Ochsner 2012).
Closing Thought and Looking Ahead
In the sections that follow we further unravel the model
presented in Fig. 1. We begin with a review of the evidence
related to justice rules in the next section. Subsequently, we
turn our attention to the two forms of empathy, beginning
with cognitive and moving later to affective. While we
treat cognitive and affective empathy separately in order to
facilitate our discussion of the brain regions reviewed, we
emphasize that both these and the related psychological
processes are highly interconnected.
Recognizing and Applying Justice Rules
Research on deontic justice suggests that individuals make
their moral decisions by observing or becoming aware of a triggering event (e.g., a potential mistreatment by a work supervisor) and evaluating this event with respect to some
justice rule. A ‘justice rule’ is a ‘self-based standard, or
expectation, derived from individuals’ socialized or inter-
nalized values, regarding the moral obligations of indi-
viduals in a specific context’ (Lau and Wong 2009, p. 281).
This rule is used as a sort of measuring device to evaluate
the moral appropriateness of the event (Scott et al. 2008).
Because this type of assessment is based on the use of
moral principles, it has also been called ‘principlism’ (e.g.,
Batson 1999, p. 303; Blader and Tyler 2001, p. 235). These
judgments are not ‘context free,’ and various elements of
Fig. 1 Theoretical model of deontic justice. Notice the three critical elements: Application of justice rules, cognitive empathy, and affective empathy. Note also that empathy has both a moderating and a main effect.
the decision environment can impact the evaluations (for
empirical evidence, see Nicklin et al. 2011; for a review,
see Cropanzano and Moliner 2013). While context effects
are important, they are somewhat beyond the scope of the
present article, which aims to understand the underlying
intrapersonal processes of deontic justice.
Historically, justice rules have been organized into three
families (Cropanzano et al. 2015): Distributive justice,
which pertains to the outcome allocation; procedural jus-
tice, which relates to the decision-making process; and
interactional justice, which concerns the interpersonal
treatment received from another person. Some researchers
have found it useful to further divide interactional justice
into two sub-dimensions (e.g., Colquitt et al. 2001). In this
approach, interpersonal justice relates to the dignity and
respect that one receives, whereas informational justice
pertains to keeping people informed, providing explana-
tions, and so forth. Each of these types of justice has its
own set of justice rules (cf. Colquitt and Rodell 2015).
While research is still ongoing (e.g., Hollensbe et al. 2008), we consider Colquitt's (2001) taxonomy useful for understanding what 'justice rule' means. According to this
categorization, there are at least three just ways to dis-
tribute outcomes. These are equity (to each according to
contributions), equality (to each the same), and need
(Deutsch 1975, 1985). Likewise, a just procedure should be
bias-free, consistently applied, accurate, correctable, rep-
resentative of all, and ethical (Leventhal 1980). Colquitt
(2001) adds that interpersonally just treatment is polite,
dignified, respectful, and contains no inappropriate
remarks, while informationally just communication is
candid, thorough, timely, and tailored to individual needs.
While future research may yield additional standards,
Colquitt’s (2001) work provides a good sense of what is
intended by a ‘justice rule.’ Overall, within deontic justice
theory, justice rules are essentially seen as a type of moral
norm (Folger 2001,2011; Folger and Salvador 2008). As
with other moral norms, these do not depend on the opinion
of an authority figure (i.e., tend not to change due to third-
party norm enforcement), and their violations warrant
punishment (Smetana 1981, 1984, 1985, 1989).
Development of Moral Norms
Social and developmental psychology research has offered
important insights to further understand this concept by
suggesting that human beings recognize and apply moral
standards from a very early age. For instance, in a study
conducted by Hamlin and Wynn (2011), infants watched a puppet show in which the characters behaved either helpfully or unhelpfully toward other puppets. Infants as young as 5 months preferred the kindly puppets to the disobliging ones (for a review, see Bloom 2013; Hamlin et al. 2010).
These childhood preferences reflect an ‘innate’ distinc-
tion that persists into adulthood—that between moral
norms and social conventions (for a discussion on innate-
ness and morality, see Haidt and Joseph 2007; Suhler and
Churchland 2011). A social convention is a rule of
behavior that, while making community life potentially
more efficient, is not seen as correct for its own sake
(Nucci and Nucci 1982). To at least some degree, children
as young as three or four can distinguish social conventions
(e.g., eating with fingers, standing during nap-time) from
moral transgressions (e.g., pushing, stealing, hitting), and
by age five they are quite good at telling the two apart
(Smetana et al. 1993).
Moral Norms and Neuroscience
We will now turn our attention to neuroscience evidence
that elucidates how our brain applies moral norms. We
begin with a focus on the prefrontal cortex (PFC) and
review studies of moral decision-making that highlight its
involvement in two different types of moral violations. We
then cover additional brain regions, such as the insulae,
which are involved in emotional responses and in the
application of justice rules.
The Trolley Problem, the Prefrontal Cortex, and Two
Types of Moral Violations
Greene et al. (2001) conducted a functional Magnetic
Resonance Imaging (fMRI) study inspired by moral
dilemmas that are often used in ethics research. One of the
most remarkable dilemmas that influenced this research is
the trolley problem (Thomson 1986). Typically, researchers present participants with a scenario in which a runaway trolley is headed toward five people who are tied to the track and unable to move. If the trolley proceeds on its path, it will kill them. The only way to save them is to activate a switch that diverts the trolley onto a side track, where it will kill one person instead. When people are asked whether they ought to press the switch to save five people at the expense of one, most say yes. Yet, when people are presented with a different scenario, their decisions are often the opposite. In this case, the participants are told that they are standing next to a stranger on a footbridge above the track. The only way to stop the trolley and save the five people is to push this stranger off the bridge onto the track, causing his or her death. Although the ultimate outcome, from a rational perspective, is the same in both scenarios, people are less willing to save the five others by pushing the stranger.
Research on the trolley problem suggests that business
ethics should attend more closely to a distinction between
two different types of moral breaches—personal and
impersonal. According to Greene and Haidt (2002), vio-
lations of moral norms are especially salient when the
misbehavior is personal. These are behaviors that do rel-
atively serious harm to a specific individual and that do so
through the direct agency of the transgressor. In contrast,
impersonal violations lack the aforementioned criteria.
Moreover, Greene and colleagues (Greene et al. 2001,
2004) showed that personal transgressions are processed in
areas of the brain pertaining to emotions, while those that
are impersonal are processed in brain regions that pertain
more strongly to cognition. Specifically, brain regions
associated to emotions, like medial frontal gyrus, posterior
cingulate gyrus, and angular gyrus, were more activated in
the personal condition rather than in impersonal and even
non-moral scenarios; parietal lobes, among other regions
associated to cognition, were instead significantly less
active in the personal condition than in the other paradigms
(Greene et al. 2001).
Interestingly, Greene et al.’s (2001) findings closely
parallel long-standing observations by organizational justice researchers that interpersonal justice violations tend to
generate greater outrage than do other types of injustices
(Folger and Cropanzano 1998; Skarlicki and Folger 2004).
Individuals with strong moral identities are especially
likely to retaliate when observing interpersonal injustice
(Skarlicki and Rupp 2010; O’Reilly et al. 2016). Likewise,
interpersonal injustice, as opposed to other types, is more
strongly related to workplace deviance (Colquitt et al.
2001; Judge et al. 2006). Research based on the trolley
problem suggests that the importance of interpersonal
justice may be ‘hard wired’ into our brain functioning.
This research (Greene et al. 2001) also revealed that
reaction times were longer when participants judged per-
sonal violations as morally appropriate, as compared to
when subjects judged them to be wrong, suggesting a key
role for ‘executive control’ brain regions. Thus, in a fol-
low-up study, Greene et al. (2004) focused on brain activity
in the subjects’ prefrontal cortex (PFC), while deliberating
over similar scenarios. The PFC is the anterior part of the
frontal lobes of the brain—an area crucial for integrative,
executive, and goal-directed functions (Fuster 2001).
Research has often focused on the subdivisions of the PFC,
in particular the ventromedial or vmPFC (Damasio 1996),
orbitofrontal (Rolls 1996), and dorsolateral (Goldman-Ra-
kic 1987) (for a more extensive review, see Christoff and
Gabrieli 2000). While these regions have a vast network of
connections and intertwined functions, speaking generally,
the dorsolateral PFC is more critical for cognitive control
(e.g., attention), while the orbitofrontal cortex and vmPFC are largely associated with emotional processes and affective
decision-making (Damasio et al. 1990; Gray et al. 2002;
Rolls and Grabenhorst 2008).
In the trolley problem just described (Greene et al.
2001), researchers found increased ventromedial and de-
creased dorsolateral PFC activity in response to personal
(i.e., bridge scenario) as opposed to impersonal (i.e.,
switching the trolley’s path) moral choices; the difference
between these two situations also related to the salience of
the victim. Moreover, in Greene et al. (2004) the dorso-
lateral PFC showed increased activity for difficult, as compared to easy, personal dilemmas. Further, Luo et al. (2006) showed increased vmPFC activity in response to more severe moral transgressions compared to less severe ones (see Blair 2007). What is more, lesions to the PFC
create impairments for these types of moral decisions,
supporting its essential role in applying moral rules
(Ciaramelli et al. 2007; Koenigs et al. 2007).
Additional Brain Regions Involved and Justice Rules
Moll et al. (2002) provided additional insights into the
relationship between human moral rules and emotions. They
asked subjects to read short statements and judge them as
being either right or wrong. In this way, Moll and his colleagues forced participants to take a stand irrespective of the content of the sentences. Some of the statements described emotional situations provoking moral responses, others were emotionally negative scenarios with no moral substance, and others were non-emotional situations. These researchers found that moral judgments associated with negative emotions prompted activation in the
antero-medial orbitofrontal cortex (OFC)—whose activation
correlates with subjective emotional experiences. In support
of this finding, injury to this part of the OFC impairs emo-
tional behavior (for a review, see Rolls and Grabenhorst
2008). Non-moral judgments associated with unpleasant
emotions induced activation of the lateral OFC and the
amygdalae. The amygdalae have traditionally been associ-
ated with processing emotionally arousing stimuli and with
sharing others’ emotions (Adolphs et al. 1994). In the study
of Moll and colleagues, somewhat surprisingly, the amyg-
dala did not show increased activation in the moral judgment
condition. The researchers suggested that this is because the
medial OFC ‘controls’ the amygdala’s activity (see
also Baxter et al. 2000). Thus, the OFC could be critical for
integrating our moral knowledge with emotions that rein-
force moral actions.
Another brain region particularly salient to ‘justice
rules’ is the anterior part of the insulae. This region is
highly engaged in our emotional life, in particular when we
experience disgust (Krolak-Salmon et al. 2003) and when
we observe other people feeling disgust (Wicker et al.
2003). This basic emotion, which seems to have benefited
our ancestors by helping them avoid unhealthy experi-
ences, might have been co-opted by evolution to serve a
moral purpose (Haidt 2003, 2006). In fact, it is common for
people to respond to certain moral violations, especially
those involving a perception of impure contamination, with
a sense of disgust (e.g., Schnall et al. 2008, 2009; Skarlicki et al. 2013). Given this indication, one would expect involvement of the insulae when people make moral assessments of unfair events. Indeed, this seems to occur.
Sanfey et al. (2003) offered support to this claim by
employing the Ultimatum Game in an fMRI study. In this
game, a proposer receives an amount of money from the researchers and suggests how to divide the sum with another player. The
receiver chooses to either accept or reject the proposal. If the
second player accepts, then the money is divided accord-
ingly. Otherwise, neither party gets any money. According to
a rational choice perspective (e.g., Tversky and Kahneman
1986), the second player should accept any offer given, since
this would always provide more money than what they
originally had; however, in practice, divisions seen as unfair
are often rejected. In Sanfey and colleagues' (2003) experiment, offers judged as unfair elicited activity both in participants' 'cognitive' brain areas (e.g., the dorsolateral PFC) and in 'emotional' areas, such as the anterior insulae (in particular the right insula). Of interest here, this region showed higher activity for rejected unfair offers than for fair offers, suggesting, once again, a key role for emotional involvement in judgments of what is fair and what is not.
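To make the behavioral contrast concrete, the brief sketch below is our illustration rather than material from Sanfey et al. (2003); the decision rules and the 30 % rejection threshold are hypothetical simplifications. It compares the rational choice prediction with a fairness-sensitive responder who rejects low offers at a personal cost:

    # Illustrative Ultimatum Game responder rules (hypothetical parameters,
    # not taken from Sanfey et al. 2003).

    def rational_responder(offer, total):
        # Rational choice prediction: accept any positive offer, since
        # something is always better than nothing.
        return offer > 0

    def fairness_sensitive_responder(offer, total, min_fair_share=0.3):
        # Reject offers below a minimal fair share, even at a personal cost.
        return offer / total >= min_fair_share

    total = 10.0
    for offer in (1.0, 2.0, 4.0, 5.0):
        print(offer,
              rational_responder(offer, total),
              fairness_sensitive_responder(offer, total))

The offers on which the two rules diverge (here, the 1.0 and 2.0 splits) correspond to the unfair divisions that participants tend to reject, and it is these rejected unfair offers that were associated with heightened anterior insula activity.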
Finally, neuroscience research focusing on empathy and
on Theory of Mind has also investigated how people make
judgments about others’ actions (see e.g., Van Overwalle
and Baetens 2009). For one, Singer and colleagues (2006)
investigated how brain empathic responses are modulated
by the affective link between individuals. These research-
ers measured the brain activity of people observing confederates receiving pain, where the confederates had previously played either fairly or unfairly in an economic game. Singer et al. (2006) found that
subjects showed activation in fronto-insular and anterior
cingulate cortices toward fair players who were allegedly
being hurt. However, these responses were significantly
reduced when people observed an unfair confederate
receiving pain. In contrast, for unfair confederates, the participants showed increased activation in brain areas associated with reward. This evidence resonates with
the idea of altruistic punishment, which, as we shall see
later in more detail, is consistent with our theoretical model
of deontic justice.
Conclusions and Research Needs
Overall, the studies described above provide a good
example, not only of how recent neuroscience research has
investigated neural correlates of justice paradigms, but also
of how these insights advance business ethics by revealing
that neural systems involved in both emotional and cog-
nitive processes are relevant in the appraisal of moral rules.
Moreover, this evidence suggests that ethical behavior
involves an alignment between cognition and affect.
Interestingly, the idea that ethics involves the right thoughts for the right reasons has long been emphasized by virtue ethicists (e.g., Annas 2011). This suggests that affect
should play a larger role in business ethics, an idea that is
consistent with theories of deontic justice (Folger et al.
2005).
Business ethicists should also take into account the
implications of the trolley problem because there are dif-
ferent patterns of neural activation sustaining two distinct
types of moral breaches—personal and impersonal (Greene
et al. 2001). This has implications for a number of ethical
problems. For example, consider the problem of white
collar crime. Though white collar crime is far more costly
to society than so-called ‘street crime,’ citizens persist in
viewing white collar offenses as less problematic (Frie-
drichs 2010). In particular, the impersonal nature of white
collar corruption may make it seem less troublesome than
face-to-face misconduct. In turn, this could pose a challenge for ethical training and even law enforcement.
Closing Thoughts on Justice Rules
Organizational and psychological research suggests that justice rules are a type of moral norm: They are often deeply held universal standards, which emerge early in life (Smetana 1985, 1989). However, consideration of the
experiments exploring the neural substrates of moral rules
allows us to disentangle their meaning because it suggests
that justice rules involve key brain regions that are relevant
both to discriminating between personal and impersonal
scenarios and to cognitive and affective experiences
(Greene 2013). The latter evidence is particularly important for deontic justice theory for another reason relevant to organizational life. Indeed, while justice rules can be
seen as internalized standards for assessing the fairness of
events, workers (and humans more generally) often fail to
make use of them (Folger and Cropanzano 1998; Cropan-
zano et al. 2015). It is important to understand why this is
the case.
As we shall see, we argue that this relationship is
moderated by an individual’s empathy for others. When
people feel empathy for others, they are more likely to treat
them justly and respond to their unjust treatment by others
(Blader and Tyler 2001; Hoffman 1994; Lerner and
Goldberg 1999). If workers are not able to appreciate,
internalize, and share others’ feelings, it becomes difficult
to judge whether they are being treated justly and thereby
to act accordingly. What is more, empathy relies on dissociable cognitive and emotional neural systems, which can moderate and also directly promote deontic justice.
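Although this moderation is stated verbally, its structure can be pictured as a simple moderated relationship. The sketch below is our illustrative formalization, not an equation from the paper; the linear functional form and the weights are hypothetical, chosen only for exposition:

    # Illustrative formalization of the claim that empathy has both a main
    # effect and a moderating effect on deontic reactions to rule violations.
    # Functional form and weights are hypothetical, for exposition only.

    def deontic_reaction(rule_violation, empathy,
                         b_violation=1.0, b_empathy=0.5, b_interaction=1.5):
        # Inputs are assumed to be scaled 0..1: the severity of the perceived
        # justice-rule violation and the observer's empathy for the victim.
        return (b_violation * rule_violation
                + b_empathy * empathy
                + b_interaction * rule_violation * empathy)

    # A severe violation witnessed with little empathy provokes a muted
    # reaction, consistent with the 'breakdown' in the empathic system
    # described earlier.
    print(deontic_reaction(rule_violation=0.9, empathy=0.1))
    print(deontic_reaction(rule_violation=0.9, empathy=0.9))

On this reading, the same violation produces a much stronger deontic reaction when the observer empathizes with the victim, which is the pattern the model in Fig. 1 is meant to capture.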
Deontic Justice as an Empathic Process
Empathy, as the term is used here, refers to the set of pro-
cesses that allows a person to share the psychological
experience, both thoughts and feelings, of another individual
(Batson 1999; Hoffman 1994). A rich body of research
suggests that, when individuals feel empathy for others, they
become more altruistic and cooperative toward them (e.g.,
Batson 2006; Batson and Ahmad 2001). Hoffman (2000,
p. 3) has this sort of evidence in mind when he remarks that
empathy is ‘the spark of human concern for others, the glue
that makes social life possible.’ Additional evidence sup-
porting the role of empathy in human morality has been
explored by social psychologists (Batson 2009; Eisenberg
and Fabes 1990; Tyler et al. 1997; Wispé 1986), philosophers (Churchland 2011; D'Arms 1998), and natural scientists (Decety and Lamm 2006; de Waal 2008; Preston and de
Waal 2002). Likewise, Aderman et al. (1974) found that
individuals are less prone to blame victims for their own
moral adversity when they empathize with them.
While this work is impressive, research suggesting that
empathy makes us behave more justly toward other people
is even more relevant to business ethics and deontic justice
theory. Patient and Skarlicki (2010) investigated the impact
of empathy on justice judgments in the workplace, arguing
that workers care more about justice when they share the
other person’s emotions. In their first experiment, managers
completed a role-playing scenario taking the perspective of
someone who was downsizing an employee. Those who
scored higher in empathy reported that they would behave
more justly than did those who scored lower. In their
second study with undergraduate participants, the authors
also found that empathy induction increased the justice
with which subjects behaved.
Empathy: Some General Remarks
In psychological research, the term empathy has been
defined in various ways, some of which are not entirely
consistent with one another (Batson 1995). Some scholars,
for example, appear to treat empathy as a specific emo-
tional state, which involves mindfulness of, and respon-
siveness to, another individual’s concerns (cf., Bagozzi and
Moore 1994; Batson et al. 1995). However, it is more
common to view empathy as a ‘vicarious emotion,’ which
is associated with particular types of motivated behavior
(Batson et al. 1987, p. 19). For example, people who feel
empathy are more likely to assist a distressed individual,
even if they are provided with an opportunity to exit the
situation (Batson 1995). Other definitions of empathy are
somewhat more rational, referring to understanding another
person’s thoughts and feelings, rather than the sharing of
their affect (Cohen and Strayer 1996).
These are important conceptual matters, but it is beyond
our scope to thoroughly discuss all of empathy’s definitions
here (for a comprehensive review, see Batson 2009). For
the purposes of our model, and without gainsaying other
approaches, we are primarily concerned with empathy as
the sharing of other people’s feelings (Batson 1995),
including the comprehension of their psychological con-
tents (Cohen and Strayer 1996). In this way, we also
incorporate the recognized distinction between cognitive
empathy and affective empathy (e.g., Ang and Goh 2010;
Hogan 1969). The former refers to understanding the
contents of another person’s thinking and feeling (Pe-
cukonis 1990). The latter refers to sharing the affective
experiences of another individual (Hoffman 1994, 2000).
Consistent with this distinction, the neuroscience liter-
ature provides further evidence for disentangling these two
types of empathy (e.g., Shamay-Tsoory 2011; Walter 2012;
Zaki and Ochsner 2012). According to recent research with
imaging and lesion studies (i.e., studies that enable the
association of impaired brain areas to specific functions;
Massaro 2015), there are two dissociable neural systems
for empathy: One cognitive and the other emotional
(Shamay-Tsoory et al. 2009). The two are highly related
and both are important. A fully empathic experience
involves (at least) components of affective sharing, cog-
nitive self-awareness, and self-other distinction (Baron-
Cohen and Wheelwright 2004; Blair 2005).
Remarkably, business ethics has not generally made this
distinction explicit. For example, references to Batson’s
(2009) definition of empathy appear to be more closely
associated with cognitive empathy; Patient and Skarlicki’s
work (2010) instead investigates affective empathy. To
address this important concern for deontic justice theory,
here we refer to cognitive empathy as the deliberate psy-
chological process of recognizing and understanding
another person's thoughts and feelings. On the other hand, affective empathy captures the similarity of feelings
between one person and another, the so-called ‘experience
sharing,’ which is more automatic and non-reflective
(Walter 2012; Zaki and Ochsner 2012). We argue here that
both types of empathy are critical for deontic justice.
Cognitive Empathy: Understanding the Victim’s
Psychological Experience
As discussed earlier, deontic justice hinges on the way a
person has been disadvantaged due to a violation of a social
norm (Folger and Cropanzano 1998, 2001). In many situ-
ations, we gain such information through our evaluation of
what the victim is thinking and feeling. Cognitive empathy
provides the mechanism by which we evaluate another
person’s psychological point of view (Frith and Singer
2008). Notice that the adjective ‘cognitive’ may be a bit
misleading here, since it refers to how the observer under-
stands the contents of another individual’s mind and feel-
ings, not the accuracy of that understanding. Accordingly,
our representation of other persons’ cognitions and emotions
allow us to make inferences regarding their reactions to
events. Relative to deontic justice, if we witness another
person who is angry or frustrated by unfair treatment, cog-
nitive empathy provides our appraisal of their feelings and
the appropriateness of those feelings. However, it does not
extend to sharing those feelings: Cognitive empathy, per se,
maintains a sort of emotional distance between the victim
and the third-party observer (e.g., Walter 2012).
As we shall show, current neuroscience research has begun
to reveal key brain regions involved in cognitive empathy: the
temporoparietal junction (TPJ), the posteromedial cortex
(PMC), the prefrontal cortex (PFC), and the cingulate cortex.
Next, we review evidence for each of these regions.
Temporoparietal Junction (TPJ)
The TPJ is that area of the brain where the parietal lobe
meets the temporal lobe. Several studies indicate that the
right TPJ (rTPJ) in particular is involved in our represen-
tations of other people’s cognitions and meta-cognition
(Decety and Lamm 2007; Saxe 2006) and that the
responses in this area peak just at the time when someone’s
thoughts are described (Saxe et al. 2009; Young et al.
2007). For instance, Saxe and Wexler (2005) asked
research participants to consider their feelings when eval-
uating two different types of information. First, they con-
sidered socially relevant information, which was presented
as a scenario involving another person. Second, they con-
sidered a description of what that person ‘wanted’ or ‘be-
lieved.’ Their results showed that the rTPJ response was
low when subjects were reading descriptions of the social
background and rose when the psychological state of the
protagonist was described. Moreover, the response in the
rTPJ was higher when the protagonist’s background and
the psychological state were incongruent, as compared to
when they were consistent (Saxe and Wexler 2005).
While rTPJ activation seems to be aligned with the
cognitive empathic idea that ‘people’s feelings have to be
predicted from their own subjective desires’ (Terwogt and
Rieffe 2003, p. 74), recent research indicates that fairness
is also strongly related to activation of the rTPJ. Specifi-
cally, the hemodynamic response (i.e., a parameter
employed to measure brain activation in fMRI research) in
rTPJ shows a differentiation between morally good and bad
actions before such response arises in other regions, such
as the dorsolateral PFC (Yoder and Decety 2014). This
supports the overall insight that rTPJ likely plays an
anticipatory role in the cognitive empathic processes
involved in deontic justice.
Posteromedial Cortex (PMC)
Another relevant brain area involved in cognitive empathy
is the posteromedial cortex. This is an architectonically
discrete region, which has often been understudied because of its anatomical location (Cavanna and Trimble 2006). Interestingly, this complex area has only recently been identified as the most active brain region during the so-called 'resting state' (Cauda et al. 2010), that is, the state in which
brain activity is measured in the absence of a task or
experimental stimuli (Mastrovito 2013). Positron Emission
Tomography (PET) studies, which are able to couple
functional analyses with metabolic ones (for a method-
ological overview, see Massaro 2015), showed that PMC
consumes about 40 % more glucose than the hemispheric
mean, providing support for this view (Raichle et al. 2001).
While the resting state typically represented a control
condition in early fMRI experiments, this view changed
drastically with the discovery of functionally relevant infor-
mation from resting-state activity (Biswal et al. 1995).
Specifically, the Default Mode Network (DMN), a network of regions active during the resting state, has been identified using an array of recording techniques (Shulman et al. 1997; Mazoyer et al. 2001; Raichle et al. 2001). The DMN includes areas of
the PMC, the inferior parietal cortices, as well as the dorsal
and ventral areas of the medial frontal cortex (Uddin et al.
2009). A core characteristic of the DMN is that it consistently
exhibits increased activity at rest, and decreased activity
during task performance. The reliability of this observation, together with the findings on the brain's metabolism, suggests that deactivation may be a way for the brain to sustain self-oriented psychological processes (Fransson 2005).
Of particular interest for our aims is the flexibility of the
DMN. Indeed, the literature has revealed that tasks that
activate the DMN share core processes, but differ across
content and goals. For instance, Greene et al. (2001)
observed that certain forms of moral judgment—the personal
cognitive moral dilemmas—activate brain regions involved
in the default network. Hence, deciphering moral dilemmas
seems to be a situation where people cognitively empathize
by a psychological understanding of events occurring to
others (for related discussion, see Moll et al. 2005). The
DMN as a whole likely plays an important role in forming a
cognitive judgment of how fairly others are treated.
Prefrontal Cortex (PFC)
As discussed earlier, the prefrontal cortex is important for
justice. Because it provides executive and goal-directed
functions (Fuster 2001), this region is also heavily
involved in cognitive empathy (Shamay-Tsoory et al. 2003).
Both the dorsolateral and ventromedial regions of the PFC
have been related to morally just behavior (Carrington and
Bailey 2009; Moll et al. 2005). The vmPFC is particularly relevant to empathic processes according to evidence from neuroscience experiments involving pathological subjects.
Shamay-Tsoory and Aharon-Peretz (2007) studied individ-
uals with lesions in the ventromedial PFC and found that the
ability to cognitively represent other people’s emotions was
impaired. Yet, brain damage did not have the same disrup-
tive effects when an individual was thinking about another
person’s cognitions. Speaking more generally, damage to the
vmPFC tends to impair decision-making by disrupting
emotional processing (Bechara et al. 1997, 2000). These
neurological subjects cognitively understand the situations
they face, but lose the ‘affective signal’ that helps healthy
brains to make ‘good’ choices (Damasio et al. 1990). These
findings suggest that the vmPFC mediates the process of
affective, though not cognitive, empathy.
Moreover, Greene et al. (2004) showed engagement of
the dorsolateral PFC in moral cognitive control, and
damage to the orbitofrontal PFC has been associated with
misinterpretation of social situations and socially inappro-
priate behavior (Rolls 1996), supporting the PFC’s over-
arching empathic role in deontic justice evaluations. This
feature of the PFC is supported further by a recent study
investigating brain activity during empathy for social
exclusion, which showed how individuals who have more
empathy for others experiencing negative social treatment
will make greater efforts to help and support the victims in
these situations (Masten et al. 2011).
Cingulate Cortex or Cingulum
Another important region for cognitive empathy is the
cingulate cortex. This area integrates inputs from different
sources and influences activity in other brain regions by
modulating motor, endocrine, and visceral responses (Bush
et al. 2000). It is subdivided into three regions: anterior,
posterior, and medial. While the anterior cingulate cortex
(aCC), a large region around the rostrum of the corpus
callosum, is generally involved in emotional awareness
(Devinsky et al. 1995), a recent meta-analysis of neu-
roimaging research on empathy found that the left dorsal
anterior mid-cingulate cortex is specifically pertinent to
cognitive empathy (Fan et al. 2011). Similarly, the poste-
rior cingulate cortex (pCC) appears to be engaged when
individuals infer others’ feelings. For example, Maddock
et al. (2003) found in an fMRI study that the pCC showed increased activation when research participants considered, and hence understood, words related to emotion.
A Complex Outlook
While the regions described above are ‘key players’ in
cognitive empathy, other brain areas also appear to be relevant. For instance, the superior temporal sulcus has
shown activation peaks in tasks evaluating our under-
standing of the intentions and goals of other people’s
actions (Lee et al. 2014; Pelphrey et al. 2004). People are
concerned with intentionality attributions (Lyons et al.
2006) when assigning moral blame to others. Individuals
are less likely to be held responsible for a potential injus-
tice if their alleged transgression was made with no
intention to harm (Hewitt 1975; Karniol 1978; Miller and McCann 1979; Umphress et al. 2013).
Thus, it is important to note that other brain areas and
psychological processes may be crucial for our overall
deontic justice experience, including outcome prediction,
associative learning, and flexible evaluation of contingen-
cies (Moll and de Oliveira-Souza 2007; Rolls 1996).
However, while the interaction of these complex cognitive
abilities with justice is surely a novel and fascinating area
of research, evidence to support a more multifaceted framework in this respect is only beginning to appear (Moll and de Oliveira-Souza 2007) and thus falls beyond the scope of the current work.
Affective Empathy: Sharing the Victim’s
Psychological Experience
While cognitive empathy refers to the understanding of
other people’s responses to injustice, this does not neces-
sarily mean that we internalize these emotions in our moral
behavior. In order to truly partake in another person’s plight,
it is therefore essential for us to share their feelings.
Affective or emotional empathy promotes such a response (Davis 1994) and is often believed to be a largely involuntary, vicarious response to affective cues from another person (Decety and Jackson 2006; Hoffman 1994). As such, it
is an affective state, elicited by the emotive non-verbal cues
of the other(s), oriented toward such person(s), and similar
(or isomorphic) to his or her state (Walter 2012).[1] Moreover,
it includes some sort of meta-knowledge about both the self
and the other. As explained below, four brain areas appear to
be particularly relevant to affective empathy in relation to
deontic justice: the anterior insulae, the amygdalae, the
somatosensory cortices, and the inferior frontal gyrus (IFG).
[1] Interestingly, scholars have debated whether or not affective
empathy involves emotional contagion. Some researchers have
argued that emotional contagion is a distinct construct because it
indicates the lack of awareness as to whether the source of the
experienced state is the self or another person (e.g., Fan et al. 2011;
Walter 2012). Other scholars have instead supported the view that
affective empathy holds characteristics similar to those of emotional
contagion (e.g., Zaki and Ochsner 2012). Such debates may result
from the different definitions that researchers give to empathy and its
forms (Batson 2009). We cannot resolve this issue here, but it
illustrates the sort of interesting research questions still remaining.
Anterior Insula
In each brain hemisphere, the insula is located at the interface
of the frontal, temporal, and parietal lobes; it is densely con-
nected with several regions including the dorsolateral PFC,
amygdala, and cingulate cortex (Augustine 1996; Mesulam and Mufson 1982). Neuroscience research has widely shown
that several emotions—including anger, disgust, fear, sadness,
and also happiness, a positive emotion (Phan et al. 2002)—are associated with activation of the insular cortex, supporting the
understanding that this region, and in particular the right
insula, has a key role in affective empathy (Bernhardt and
Singer 2012; Fan et al. 2011; Singer 2006).
Moreover, connectivity data support the idea that the
insula plays an important integrative role in affect: Patterns
of connectivity in resting-state functional neuroimaging
studies suggest a key function of its anterior part in com-
bining interoceptive and affective information (Critchley
et al. 2004). These models propose that the anterior insula
enables a subjective affective experience and global
‘feeling state’ (Cauda et al. 2011; Craig 2009). This is
consistent with studies showing its role in sensitivity to moral justice, such as social exclusion (Masten et al. 2011;
Robertson et al. 2007).
Amygdala
In each brain hemisphere, the amygdala is located near the
temporal pole and has traditionally been associated with
fear, among other emotions (Fanselow and Gale 2003;
Phillips and LeDoux 1992). Fear, in our case, is relevant in
relation to normative conformity because the fear of retri-
bution can promote compliance toward others (Pfaff 2007).
Adolphs et al. (1996) found that damage to the amygdala
made it more difficult for individuals to correctly feel fear
and other negative emotions, like anger and sadness. Inter-
estingly, however, the ability to feel happiness was not
harmed (Adolphs et al. 1994). As Gazzaniga (2008)points
out, if damage to the amygdala makes us less able to feel an
emotion, then we are correspondingly less able to share it
with others. Indeed, the amygdala has also been found to be
central in fairness and emotionally weighted moral decision-
making (Blair 2007; Greene and Haidt 2002; Moll et al. 2002). Along these lines, there is increasing evidence from
both neuroimaging and genetics studies that impairment of
the amygdala may be involved in the etiology of antisocial
(and even criminal) behavior (DeLisi et al. 2009).
Somatosensory Cortices
The human somatosensory cortices may also play an
important role in our emotionally shared responses to deontic
justice. These insights emerge from studies on pain. Indeed,
our ability to experience another’s pain is a key characteristic
of affective empathy (Singer et al. 2004). Singer et al. (2004)
assessed brain activity while research subjects underwent a
painful stimulus, and compared this activity to that elicited
when the same subjects observed their beloved partners (who
were present during the experiment) receiving similar pain
stimuli. While the first-hand experience of pain resulted in
the activation of the subjects’ somatosensory cortices, these
regions showed no significant activation in response to the
observation of the partners experiencing pain. The authors
concluded that the aCC and bilateral anterior insula were
affective mediators of empathy for pain, while the
somatosensory cortices were not. In another fMRI study
(Morrison et al. 2004), participants were presented with images of hands and feet in painful or neutral situations and asked to
envisage the level of pain that these conditions would pro-
duce. Once again, no signal change was detected in the
somatosensory cortices, while there were significant activa-
tions in the cingulate cortex and in the insula.
However, at odds with these findings is a Transcranial
Magnetic Stimulation (TMS) study in which individuals observed needles penetrating the hands or feet of a human model, or penetrating objects. Avenanti et al. (2005) reported that the
observation of pain does involve sensorimotor representa-
tion. Specifically, the results showed empathic inference
about the sensory qualities of others’ pain together with an
embodiment in the research subjects’ motor systems. This
evidence is noteworthy because the TMS method is more
sensitive to detection of subtle changes in cortical activity
than fMRI techniques (Decety and Lamm 2006). Thus, this
evidence suggests that observing another individual in a
painful situation may yield ‘empathic responses’ in the
somatosensory cortices. Moreover, overlaps between first-
hand experiences of pain and perception of pain in others
seem to reveal some degree of correspondence between self
and others’ experiences (Decety and Lamm 2006).
There is more to add to this. The brain network involved
in the perception of pain in others is also implicated in
disgust and in situations involving risk. These are occur-
rences that spark visceral and somatosensory responses.
Similarly, activation in the somatosensory cortex is not
necessarily exclusive to the emotional appraisal of pain.
Hence, it seems likely that neural responses in these areas
are coupled with broader behavioral mechanisms, such as
aversion and retraction (e.g., Decety and Lamm 2006;
Singer and Lamm 2009), which are also typical of deontic
justice. In support of this view, aversive representations
similar to those observed in anticipation of, and response
to, negative outcomes trigger activation in the somatosen-
sory cortices (Bechara and Damasio 2005; Knutson and
Greer 2008; Shenhav and Greene 2010). Overall, these
cortical areas may play an important role in our sharing of
others’ feelings following a ‘painful’ and unfair situation.
Inferior Frontal Gyrus (IFG)
Finally, one other brain region deserves our attention when
discussing affective empathy. Neuroimaging studies of
empathizing with people suffering serious threat or harm
(Nummenmaa et al. 2008) reported the involvement of the
Inferior Frontal Gyrus (IFG) in affective empathy (Sha-
may-Tsoory 2011). In particular, research has shown evi-
dence for the existence of mirror neurons in the human
IFG (Kilner et al. 2009). Mirror neurons are a class of
visuomotor neurons, originally discovered in an area of the
monkey premotor cortex, that are electrophysiologically
responsive when an individual observes another individual
performing a particular action and then does a similar
behavior (for a thorough review, see Rizzolatti and
Craighero 2004).
Despite the intense debate on this topic (for a primer, see
Keysers 2009), a growing number of neuroimaging studies
show that the IFG has activation peaks when a person sees
another person experiencing an emotion. This supports the
idea that this area could also be a principal neural site for
empathy. Specifically, IFG activation has been reported in negative emotional responses and decision-making under justice dilemmas (Majdandžić et al. 2012), suggesting its relevance for deontic justice.
Cognitive and Affective Empathy Working Together
In normal healthy adults, cognitive and affective empathy,
whose main supporting neural systems are summarized in
Table 1, tend to work in concert (Pessoa 2014).
Both are engaged in moral behavior (Zaki and Ochsner
2012), with each playing important and complementary
roles. In short, cognitive empathy allows us to appreciate
other people’s minds, including both their thoughts and
their emotions, while affective empathy allows us to share
their emotional experiences, softening the boundaries that
separate individuals (Pfaff 2007). We can illustrate this
idea further by considering evidence from clinical and
neuropsychological research. Koenigs et al. (2007), for instance, found that individuals with damage to their vmPFC, which prevented normal processing of affective information, tended to make moral decisions in a 'cold,' logical way, which these authors termed 'utilitarian.'
Similarly, psychopaths have reasonably sound abilities to make cognitive inferences (Brüne and Brüne-Cohrs 2006), but appear to be emotionally indifferent to the suffering of others (Baron-Cohen 2011). Thus, in psychopathy only affective empathy is impaired, while cog-
nitive empathy is maintained and possibly heightened
(Blair 2005; Shamay-Tsoory et al. 2010). By contrast, autistic individuals seem to show deficits in both cognitive and affective empathy (Baron-Cohen and Wheelwright 2004). Notice that moral comportment is impaired in
both psychopathy and autism, though the resulting patterns
of behavior are different for each condition. In any case,
moral thinking can be impeded if either type of empathy is
diminished (Moll et al. 2005). This suggests that the two
types of empathy help people to behave justly. This is
important for deontic justice in organizations because a
wide variety of organizational contexts (culture, training,
leadership, etc.) can influence the manifestation of each
type of empathy.
That said, when making fairness judgments about others, cognitive and affective empathy work closely together in a 'bi-directional' fashion (see Fig. 1). Neuroscience research suggests that a dual-path ('top-down' and 'bottom-up') system may be relevant to the processing of human morality and empathy (Table 2) (Walter 2012; Zhan et al. 2013).
This remark resonates with other neuroscience and
psychological perspectives on dual processing systems. For
instance, much debate has focused on how emotions arise
(Lazarus 1984; Zajonc 1984): via low-level processes that provide quick, bottom-up affective evaluations of stimuli, or via high-level, top-down cognitive appraisal processes that draw upon stored knowledge. Speaking
generally, dual process mechanisms are believed to be
recurrent ways of processing information in the brain.
Similarly, cognitive empathy may act as the 'high road,' or 'top-down,' processing of deontic justice, while affective empathy acts as its 'low road,' or 'bottom-up,' processing. The 'low road' essentially means that basic features of another individual that signal strong affective states or suffering (e.g., facial expressions, body movements, or obvious injuries) trigger an automatic response. Conversely, the 'high road' to empathy relies on higher cognitive mechanisms, such as reasoning about logical and contextual relations, that cascade in a top-down process.
Table 1 Summary of main neural systems involved in cognitive and affective empathy

Cognitive empathy: temporoparietal junction, posteromedial cortex, prefrontal cortex, cingulate cortex
Affective empathy: anterior insulae, amygdalae, somatosensory cortices, inferior frontal gyrus

Specifically, we argue that cognitive and affective empathy are mutually reinforcing mechanisms, which can differently elicit evaluations of justice rules. Thus, 'high road' processing implies that workers consciously evaluate the mistreatment of others; by doing so, they will consequently engage affective empathy. Conversely, when empathy takes the 'low road,' employees may first experience affective empathy. This intuitive and automatic process can then powerfully engage their cognitive empathy.
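To make this bi-directional account more concrete, the sketch below offers a minimal, purely illustrative toy model of the two routes. The cue variables, the reinforcement weight, and the 0-1 scaling are our own assumptions introduced for exposition; they are not parameters proposed by the neuroscience literature reviewed here.

```python
# A minimal, illustrative sketch of the 'high road' / 'low road' coupling described above.
# The cue variables, the 0.5 reinforcement weight, and the 0-1 scaling are assumptions
# made purely for exposition; they are not estimates from the literature reviewed here.

def process_observed_mistreatment(salient_distress_cues: bool,
                                  deliberate_appraisal: bool) -> dict:
    """Toy dual-route processing: engaging one route also partially recruits the other."""
    affective = 1.0 if salient_distress_cues else 0.0   # 'low road': fast, automatic
    cognitive = 1.0 if deliberate_appraisal else 0.0    # 'high road': slow, reflective

    # Bi-directional reinforcement: each route boosts the other once engaged
    affective = min(1.0, affective + 0.5 * cognitive)
    cognitive = min(1.0, cognitive + 0.5 * affective)
    return {"affective_empathy": affective, "cognitive_empathy": cognitive}

# 'Low road' entry point: salient distress cues alone end up engaging both routes
print(process_observed_mistreatment(salient_distress_cues=True, deliberate_appraisal=False))
```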
Conclusions and Research Needs
While the two forms of empathy are experimentally dissociable, we should stress that they normally work in concert. Moreover, neuroscience insights are gradually converging on a merged perspective holding that both top-down and bottom-up processes are involved in, and important for,
our information processing (for affective processing, see
Ochsner et al. 2009; for justice processing, see Moll et al.
2005). In this way, knowledge of sequences of social
actions or events would be ‘filtered’ in brain regions like
the PFC and the cingulate cortex, in turn enabling emer-
gence of just behavior (Moll et al. 2005; Zhan et al. 2013).
Folger and Salvador (2008) and Folger and Glerum (2015)
have argued that researchers have paid insufficient atten-
tion to the role of intuitive, bottom-up ethical judgments.
These researchers appear to be making an important
argument that is consistent with the neuroscience evidence
we have reviewed. This would seem to be an important
direction for future business ethics research. In addition,
the distinction between cognitive and affective empathy is
important for deontic justice research in organizations
because a wide variety of organizational contexts (culture,
training, leadership, etc.) can influence the manifestation of
each type of empathy. Future research can draw on insights
and methods from neuroscience to investigate the unique
effects further.
Empathy’s Direct Effects on Behavior: Too Much
of a Good Thing?
We have thus far emphasized empathy as a complex
moderator of the relationship between an event and the
application of a justice rule. As we have discussed,
empathizing with others tends to make workers behave
more fairly (Patient and Skarlicki 2005). However, there is
more to the matter. When others are in distress, we are also
more likely to render empathic altruistic assistance (Batson
1995, 2006; Masten et al. 2011). We have represented this effect in Fig. 1 by the inclusion of a direct link from
empathy to fairness. We explore this possibility below.
Empathy and the Potential for Preferential
Treatment
To better illustrate this case let us consider the work of
Batson et al. (1995). In their initial experiment, these
scholars had undergraduate research participants work with
two other students. In the critical conditions, research
subjects were read a note from one of the other individuals
in the experiment. The note came from a ‘Participant C.’
The experimenter instructed some participants to 'imagine how this [Participant C] student feels,' whereas others were told to 'take an objective perspective.' The former instructions promoted empathy, while the latter did not. The subject
was then given the opportunity to assign the other stu-
dents—either Participant C or Participant B—to a task with
positive consequences or else to a task with negative
consequences. When empathy had been induced for Par-
ticipant C, then the subject was more likely to assign that
individual to the positive task, at the expense of leaving the
negative task for the other student. These findings were
replicated in a second experiment. Thus, empathy directly
caused decision-makers to show unfair preferential treat-
ment. That is, while empathy (both cognitive and affective)
will often motivate us to behave altruistically (Batson
1995), it will not motivate us to treat everyone consistently.
Empathy-Induced Preferential Treatment
and Justice
Thus far, our account of empathy-induced preferential
treatment would suggest that employees who feel empathic
concern for another person would ignore issues of justice.
However, that does not seem to be the case—at least not
from the perspective of the individual enacting the
behavior. In an important contribution, Blader and Roth-
man (2014) present four studies which replicate the rela-
tionship between empathic concern and preferential
treatment. However, these authors offer two additional
insights.
Table 2 Possible role of the PFC in the 'top-down' and 'bottom-up' processing of justice and empathy

Top-down: The frontal cortex 'represents' moral goals that control the information flow in other cortical and subcortical areas when an automatic response needs to be overcome (see, e.g., Miller and Cohen 2001)
Bottom-up: The vmPFC stores links between subcortical 'somatic markers' and action knowledge in posterior brain areas, which would explain problems with decision-making after brain lesions (see, e.g., Bechara et al. 2000)
First, they found that if a decision-maker was held
accountable by a third party, then he or she would not tend
to show preferential behavior when empathy was high.
Second, Blader and Rothman (2014) measured fairness
perceptions. They found that fairness mediated the effects
of empathy. That is, high-empathy participants believed that
they made fair choices, both when they exhibited prefer-
ential treatment (in the low accountability conditions) and
when they did not (in the high accountability conditions).
These findings are very important to our model (see also
Blader and Tyler 2001). Individuals who empathize believe
that they are behaving fairly, even when they show pref-
erential treatment. Or, one might say, even when they
violate rules of even-handed justice (e.g., consistent treat-
ment, equity). While this could be a significant problem for
organizations, Blader and Rothman (2014) offer a solution.
The disinterested perspective of an informed third party
may serve to counterbalance the biased viewpoint of an
empathic individual. Organizations may wish to design
interventions that include active third parties who monitor decisions where there might be conflicts of interest created
by empathy (Bazerman and Tenbrunsel 2011).
Discussion
Historically, most theories of justice in business ethics have
argued that workers want to be fairly treated because it
benefits them, either through long-term instrumental con-
trol or else through enhancing their social status (Cropan-
zano et al. 2003; Folger and Butz 2004). In contrast to these
earlier models, deontic justice emphasizes the notion of
‘oughts’ (Folger and Glerum 2015). Other people should be
treated in a way that they deserve, in accordance with
standards of fairness (Folger 2001, 2011). While evidence
favoring a deontic model of justice has been steadily
increasing (e.g., Colquitt et al. 2006; Folger et al. 2013;
Reb et al. 2006; Turillo et al. 2002), skepticism remains
(e.g., Colquitt and Greenberg 2001; Gillespie and Green-
berg 2005; Greenberg 2001).
We argue that this skepticism could be better addressed,
at least in part, by articulating a model of deontic justice,
supported by current neuroscience evidence, to explain
how a worker can transcend self-interest by being con-
cerned with the plight and needs of others (Folger and
Salvador 2008). To address this theoretical need we have
focused on three interrelated psychological processes and
reviewed supporting neural systems:
• Justice rules: People sometimes interpret events in
ethical terms. As such, individuals distinguish between
practical, but somewhat arbitrary social conventions
(e.g., drive on the left side of the road in the United
Kingdom, but the right side in the United States), and
moral principles (e.g., 'Thou shalt not kill'). When
compared to social conventions, justice rules (a) tend
not to be based on social authority, and (b) their
violations warrant punishment (Smetana 1985, 1989).
• Cognitive empathy: This implies that individuals understand and recognize the contents of others' minds. That is, they 'cognitively comprehend' what others are thinking and feeling when those others are victims of injustice.
•Affective empathy ‘Knowing’ is not the same as
‘caring.’ Individuals also emotionally connect and
share the affective state of potential victims of injustice.
When they experience the pain that other people feel
due to unfair treatment, it becomes more substantive
and important to them.
According to our model, we argue that deontic thinking
involves using a justice rule to make sense out of an
unfortunate situation for others. If the rule was violated,
then the worker is apt to conclude that unfairness occurred.
This rule is most likely to be applied when one experiences
cognitive and affective empathy toward the victim.
At times, however, when empathy is strong, an individual may bypass the justice rule and preferentially intervene to help a colleague.
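As a purely illustrative aid, the sketch below expresses this verbal model as a toy decision function. The 0-1 empathy scale, the thresholds, and the averaging of the two empathy components are our own assumptions for exposition, not quantities specified by deontic justice theory.

```python
# A toy formalization of the verbal model above, for illustration only.
# The 0-1 empathy scale, thresholds, and averaging rule are assumptions for exposition;
# deontic justice theory does not specify these quantities.

def deontic_response(rule_violated: bool,
                     cognitive_empathy: float,
                     affective_empathy: float,
                     bypass_threshold: float = 0.9) -> str:
    """Return a coarse outcome for an observed event, given empathy scores in [0, 1]."""
    empathy = (cognitive_empathy + affective_empathy) / 2

    # Very strong empathy may bypass rule-based evaluation and trigger preferential helping
    if empathy >= bypass_threshold:
        return "preferential helping (justice rule bypassed)"

    # Otherwise the justice rule is applied, and empathy shapes how strongly it registers
    if rule_violated and empathy >= 0.5:
        return "unfairness judged; deontic reaction likely"
    if rule_violated:
        return "violation noticed, but weak empathic engagement"
    return "no injustice perceived"

print(deontic_response(rule_violated=True, cognitive_empathy=0.7, affective_empathy=0.8))
```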
Theoretical and Practical Implications
In this paper we have outlined a novel theoretical model for
business ethics suggesting that deontic justice—our con-
cern with other people’s just treatment—arises from a set
of psychological processes which, to speak loosely, are
‘hard wired’ into our brains (Tancredi 2005). This also
suggests that justice—or its absence—could impact
employees on a more fundamental level than is often rec-
ognized. As alluded to earlier, workers who feel that they
have been unjustly treated experience poorer health
(Cropanzano and Wright 2011) and are likely to seek
revenge even when doing so is personally costly (Fehr and Gächter 2000). This is not surprising, given the robust
processing and salience of fairness-related information.
A related insight for deontic justice theory is that,
while human beings are concerned with their self-interest,
they are also concerned with their moral principles (Fol-
ger 2001, 2011). The research evidence we have reviewed
here supports these contentions. Justice, as well as ethics
more generally, is a central concern in human existence
(Cropanzano et al. 2007). For this reason, justice matters
beyond particular individuals. Third parties often care
about how others are treated even when they are not
directly impacted (Skarlicki and Kulik 2005). Given this,
the pernicious effects of injustice are likely to be spread
rapidly through an organization, as some employees
become displeased with the treatment and experiences of
their coworkers (Skarlicki et al. 2015). Indeed, justice in
its deontic form appears to be relevant beyond focal
employee-manager interactions: It may push employees to
exhibit negative behaviors and attitudes toward the
organization if they witness their peers encountering
moral adversity (Skarlicki et al. 1998). Hence, proactive
organizations should account for deontic justice as a part
of the organization’s overall culture, rather than through a
piecemeal series of one-on-one interventions (Monin et al.
2013). To achieve this goal, organizations could begin by
designing management systems that conform to justice
rules (Fortin and Fellenz 2008), such as providing voice
and respectful interpersonal treatment (Cugueró-Escofet and Rosanas 2013). In this way, valuable behaviors, such as organizational citizenship or whistleblowing, will be more likely to occur (Umphress et al. 2010).
As explained throughout, incorporating neuroscience
evidence into accounts of organizational justice strongly
suggests that ethical decisions are often influenced by intu-
ition and affect, as well as by moral reasoning (e.g., Greene
2013). This observation has a somewhat different emphasis
than most traditional accounts of business ethics, which
focus on moral reasoning as opposed to more intuitive and
affective processes (cf. Salvador and Folger 2009). The
present model suggests that these two views of business
ethics should be re-balanced, because incorporating an
organizational neuroscience perspective reveals that implicit
empathic processes play a significant role in shaping the
observed behaviors in response to the unfair treatment of
others (Becker et al. 2011). For example, Masten et al.
(2011) showed that witnessing social mistreatment of others
required both forms of empathy to produce action. What is
more, the present model explains that empathy should be
considered a dual construct rather than a single concept, showing the relevance of our model not just for ethical thinking but also for organizational research at large. Previous
research has sometimes considered empathy as either one
type (e.g., affective sharing) or another (e.g., cognitive
understanding). Supported by neuroscience research, we
maintain that this 'either/or' approach should be replaced with a 'both/and' one. That is, there are two types of empathy
and both are important.
Thus, from a practical perspective, our conceptualiza-
tion may explain why training and other interventions to
influence organizational justice, which are often based on
rational ‘cold’ approaches, sometimes produce disap-
pointing results (Ludwig and Longenecker 1993). Among
other possibilities, our present model suggests that ethical
conduct at work could be enhanced by training people in
empathy (Pecukonis 1990). For instance, among health
care professionals, there have been promising findings with
these types of programs (e.g., LaMonica et al. 1976; Riess
et al. 2012), and we would recommend that they be con-
sidered more broadly. This also reinforces the value of a compassionate organizational culture (Barsade and O'Neill
2014; Karakas and Sarigollu 2013).
Along these lines, neuroscience research itself provides
practical information on how deontic justice can be 'manipulated' in the brain. For instance, Knoch et al. (2006) employed TMS, a non-invasive method that can inhibit small regions of the brain through low-frequency stimulation (O'Shea and Walsh 2007), while subjects played the Ultimatum Game. They found that one third of the participants whose right dorsolateral PFC was inhibited accepted all offers, even those that were clearly unfair. Similarly, the 'manipulation' of
socio-moral behavior has been explored with neuro-phar-
macological approaches, like those involving intranasal
administration of oxytocin (Kosfeld et al. 2005). Oxytocin-treated subjects showed increased trust while performing a trust game (i.e., they assigned more money to trustees) compared to a control group. While these approaches have not yet been employed in the workplace, their potential use will require a priori, shared ethical guidelines ensuring real benefits for workers.
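For readers unfamiliar with the Ultimatum Game, the sketch below simulates a responder whose decisions depend on a simple fairness threshold. The stake, the offers, and the threshold values are illustrative assumptions; setting the threshold to zero only loosely mimics the pattern of accepting even clearly unfair offers, and is not a model of the neural manipulation itself.

```python
# A minimal, illustrative Ultimatum Game responder; not a model of the TMS or oxytocin
# studies themselves. The stake, offers, and threshold values are assumptions chosen
# only to show how weakening fairness concerns changes acceptance of unfair offers.

def responder_accepts(offer: float, stake: float, fairness_threshold: float) -> bool:
    """Accept if the proposed share of the stake meets the responder's fairness threshold."""
    return (offer / stake) >= fairness_threshold

stake = 10.0
offers = [5.0, 3.0, 2.0, 1.0]  # from an even split down to a clearly unfair offer

for label, threshold in [("intact fairness concern", 0.3),
                         ("weakened fairness concern", 0.0)]:
    decisions = {offer: responder_accepts(offer, stake, threshold) for offer in offers}
    print(label, decisions)
```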
Finally, we readily recognize that each of the different
neuroscience methods we have mentioned in our paper will
hold specific informative power in future testing of our
model. For example, whereas neuro-pharmacological approaches may promote empathy and fairness, fMRI can further inform us about the neural substrates activated during justice-related experimental tasks or games. Yet, we hope that the
cross-disciplinary nature of our framework will not only
promote novel investigations on the highlighted brain
regions of interest, but will also encourage the integration
of this neuroscience information with more traditional
business research methods as a means to comprehensively
advance future research on deontic justice.
Limitations and Further Research Avenues
We should remark that in this work we have sought to
advance current theory on deontic justice by proposing an
interdisciplinary framework, rather than suggesting
replacement of existing accounts of organizational justice
exclusively with neuroscience research. Indeed, despite the
promises of neuroscience to advance organizational justice
research, we must point out a number of related limitations
and cautions. Neuroscience research in these areas is still in
its relative infancy (Zhan et al. 2013). As such, business
ethics scholars need to be wary of placing too much weight
on any single study or on a unique methodological
approach. Several methodological avenues will likely
contribute to advance the understanding of the neural
substrates and psychological processes involved in deontic
justice. For instance, Electroencephalography (EEG) has
the potential to add ecological validity to deontic justice
research. For example, Stikic et al. (2013) have already assessed
engagement and leadership at both the individual and team
levels in a social responsibility scenario.
Other research has also shown that EEG investigations
can inform decision-making strategies (Jacobs et al. 2006)
and help to disentangle affective and cognitive processes
(Knyazev and Slobodskaya 2003; Pfurtscheller and Da
Silva 1999). Moreover, coherence analysis—a measure of
the degree to which EEG signals at two distinct scalp
locations are linearly related to one another—is often
associated with studies on individual traits (Harmon-Jones
et al. 2010), thus suggesting promising avenues to appre-
ciate individual differences in deontic justice.
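To illustrate what such a coherence analysis computes, the sketch below estimates magnitude-squared coherence between two synthetic 'scalp channels' using SciPy. The sampling rate, channel labels, and signals are illustrative assumptions rather than details of any study cited here.

```python
# A minimal sketch of EEG coherence analysis: magnitude-squared coherence between two
# scalp channels. The sampling rate, channel labels, and synthetic signals below are
# illustrative assumptions, not data or parameters from the studies cited in the text.
import numpy as np
from scipy.signal import coherence

fs = 256                       # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)   # 60 s of synthetic data

# Two synthetic 'channels' sharing a 10 Hz (alpha-band) component plus independent noise
shared_alpha = np.sin(2 * np.pi * 10 * t)
ch_frontal = shared_alpha + 0.8 * np.random.randn(t.size)
ch_parietal = shared_alpha + 0.8 * np.random.randn(t.size)

# Coherence near 1 at a frequency indicates a strong linear relationship between the
# two signals at that frequency; values near 0 indicate independence.
freqs, coh = coherence(ch_frontal, ch_parietal, fs=fs, nperseg=2 * fs)

alpha_band = (freqs >= 8) & (freqs <= 12)
print(f"Mean alpha-band (8-12 Hz) coherence: {coh[alpha_band].mean():.2f}")
```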
Overall, the employment of interdisciplinary approaches
will provide a viable opportunity for researchers to move
the field forward. In parallel, we also recommend that
organizational neuroscience should attempt to offer further
meta-analytical evidence and ensure reproducibility of
existing findings. Only then will nuanced theoretical
business propositions and ecologically comprehensive
paradigms match a multidisciplinary effort to further refine
theoretical frameworks, such as the one presented here.
In conclusion, our model suggests that there are at least
two broad paradigms for future research. The first of these
concerns issues regarding individual differences. It is a
point of everyday experience to recognize that some people
care more about morality than others (Shao et al. 2008).
Our review suggests that these differences can be reflected
in neural differences among people (Baron-Cohen 2011;
Kiehl 2006). Moreover, as deontic justice is heavily mod-
erated by both cognitive and affective constructs, individ-
uals’ impairments in any link of this chain will reduce
deontic justice in predictable ways. However, neuroscience
evidence suggests that situational context and conditions
are also important. For example, the cognitive empathy
system may ‘fail’ due to strong out-group attributions
(Haney et al. 1973), the framing of an ethical problem
(Bazerman and Tenbrunsel 2011), or the type of violation
(Greene et al. 2001). Future theoretical and empirical
extensions of our model should explicitly take these cues
into account. In this regard, our present framework will
serve as a helpful opening roadmap.
Finally, while future research may yield novel findings
in this area, we suggest that organizations should always
attend closely to the work environment, designing it so that
organizational justice, as a whole, is not inadvertently
restrained by structural or procedural flaws. In this regard,
our model might also help inform future interventions. For
example, as much of the information that influences ethical
behavior is also processed outside of workers’ cognitive
awareness (i.e., affective empathy), organizations should
build cues into the environment that reinforce empathy
while avoiding signals that unfair behavior is accept-
able (Bazerman and Tenbrunsel 2011).
Conclusions
Our moral lapses (Batson 2006; Batson et al. 1997) and
self-deceptions (Batson et al. 1999) notwithstanding, deontic
justice suggests that justice is important for its own sake,
even when it does not directly serve our self-interest. This
appears to involve at least three psychological mechanisms:
Cognitive empathy, affective empathy, and our ability to
evaluate and apply moral rules. As seen, these processes are
associated with neural systems working together to form and
direct an internalized sense of deontic justice.
References
Aderman, D., Brehm, S. S., & Katz, L. B. (1974). Empathic
observation of an innocent victim: The just world revisited.
Journal of Personality and Social Psychology, 29, 342–347.
Adolphs, R., Damasio, H., Tranel, D., & Damasio, A. R. (1996).
Cortical systems for the recognition of emotion in facial
expressions. Journal of Neuroscience, 16, 7678–7687.
Adolphs, R., Tranel, D., Damasio, H., & Damasio, A. R. (1994).
Impaired recognition of emotion in facial expressions following
bilateral damage to the human amygdala. Nature, 372, 669–672.
Ambrose, M. L., Harland, L. K., & Kulik, C. T. (1991). Influence of
social comparisons on perceptions of organizational fairness.
Journal of Applied Psychology, 76, 239–246.
Ang, R. P., & Goh, D. H. (2010). Cyberbullying among adolescents:
The role of affective and cognitive empathy, and gender. Child
Psychiatry and Human Development, 41, 387–397.
Annas, J. (2011). Intelligent virtue. Oxford: Oxford University Press.
Aryee, S., & Chay, Y. W. (2001). Workplace justice, citizenship
behavior, and turnover intentions in a union context: Examining
the mediating role of perceived union support and union
instrumentality. Journal of Applied Psychology, 86, 154–160.
Augustine, J. R. (1996). Circuitry and functional aspects of the insular
lobe in primates including humans. Brain Research Reviews, 22,
229–244.
Avenanti, A., Bueti, D., Galati, G., & Aglioti, S. M. (2005).
Transcranial magnetic stimulation highlights the sensorimotor
side of empathy for pain. Nature Neuroscience, 8, 955–960.
Bagozzi, R. P., & Moore, D. J. (1994). Public service advertisements:
Emotions and empathy guide prosocial behavior. Journal of
Marketing, 58, 56–70.
Baron-Cohen, S. (2011). The science of evil: On empathy and the
origins of cruelty. New York: Basic Books.
Baron-Cohen, S., & Wheelwright, S. (2004). The empathy quotient:
an investigation of adults with Asperger syndrome or high
functioning autism, and normal sex differences. Journal of
Autism and Developmental Disorders, 34, 163–175.
Barsade, S. G., & O’Neill, O. A. (2014). What’s love got to do with
it? A longitudinal study of the culture of companionate love and
employee and client outcomes in the longterm care setting.
Administrative Science Quarterly, 59, 551–598.
Batson, C. D. (1995). Pro-social motivation: Why do we help others?
In A. Tesser (Ed.), Advanced social psychology (pp. 332–381).
Boston: McGraw-Hill.
Batson, C. D. (1999). Altruism and pro-social behavior. In D.
T. Gilbert, S. Fiske, & G. Lindzey (Eds.), The handbook of social
psychology (4th ed., Vol. 2, pp. 282–316). New York: McGraw-
Hill.
Batson, C. D. (2006). ‘‘Not all self-interest after all’’: Economics of
empathy-induced altruism. In D. De Cremer, M. Zeelenberg, &
J. K. Murnighan (Eds.), Social psychology and economics (pp.
281–299). Mahwah, NJ: Erlbaum.
Batson, C. D. (2009). These things called empathy: Eight related but
distinct phenomena. In J. Decety & I. William (Eds.), The social
neuroscience of empathy (pp. 3–15). Cambridge, MA: MIT
Press.
Batson, C. D., & Ahmad, N. (2001). Empathy-induced altruism in a
prisoner’s dilemma II: What if the target of empathy has
defected? European Journal of Social Psychology, 31, 25–36.
Batson, C. D., Batson, J. G., Todd, R. M., Brummett, B. H., Shaw, L.
L., & Aldeguer, C. M. R. (1995a). Empathy and the collective
good: Caring for one of the others in a social dilemma. Journal
of Personality and Social Psychology, 68, 619–631.
Batson, C. D., Fultz, J., & Schoenrade, P. A. (1987). Distress and
empathy: Two qualitatively distinct vicarious emotions with
different motivational consequences. Journal of Personality, 55,
19–39.
Batson, C. D., Klein, T. R., Highberger, K., & Shaw, L. L. (1995b).
Immorality from empathy-induced altruism: When compassion
and justice conflict. Journal of Personality and Social Psychol-
ogy, 68, 1042–1054.
Batson, C. D., Kobrynowicy, D., Dinnerstein, J. L., Kampf, H. C., &
Wilson, A. D. (1997). In a very different voice: Unmasking
moral hypocrisy. Journal of Personality and Social Psychology,
72, 1335–1348.
Batson, C. D., Tompson, E. R., Seuferling, G., Whitney, E., &
Strongman, J. A. (1999). Moral hypocrisy: Appealing moral to
oneself without being so. Journal of Personality and Social
Psychology, 77, 525–536.
Baxter, M. G., Parker, A., Lindner, C. C., Izquierdo, A. D., & Murray,
E. A. (2000). Control of response selection by reinforcer value
requires interaction of amygdala and orbital prefrontal cor-
tex. Journal of Neuroscience, 20, 4311–4319.
Bazerman, M. H., & Tenbrunsel, A. E. (2011). Blind spots: Why we
fail to do what’s right and what to do about it. Princeton, NJ:
Princeton University Press.
Bechara, A., & Damasio, A. R. (2005). The somatic marker
hypothesis: A neural theory of economic decision. Games and
Economic Behavior, 52, 336–372.
Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997).
Deciding advantageously before knowing the advantageous
strategy. Science, 275, 1293–1295.
Bechara, A., Tranel, D., & Damasio, H. (2000). Characterization of
the decision-making deficit of patients with ventromedial
prefrontal cortex lesions. Brain, 123, 2189–2202.
Becker, W. J., Cropanzano, R., & Sanfey, A. G. (2011). Organiza-
tional Neuroscience: Taking Organizational Theory Inside the
Neural Black Box. Journal of Management, 37, 933–961.
Bernhardt, B. C., & Singer, T. (2012). The neural basis of empathy.
Annual Review of Neuroscience, 35, 1–23.
Beugré, C. D. (2009). Exploring the neural basis of fairness: A model
of neuro-organizational justice. Organizational Behavior and
Human Decision Processes, 110, 129–139.
Biswal, B., Zerrin Yetkin, F., Haughton, V. M., & Hyde, J. S. (1995).
Functional connectivity in the motor cortex of resting human
brain using echo-planar MRI. Magnetic Resonance in Medicine,
34, 537–541.
Blader, S. L., & Rothman, N. B. (2014). Paving the road to
preferential treatment with good intentions: Empathy, account-
ability, and fairness. Journal of Experimental Social Psychology,
50, 65–81.
Blader, S. L., & Tyler, T. R. (2001). Justice and empathy: What
motivates people to help others? In M. Ross & D. T. Miller
(Eds.), The justice motive in everyday life (pp. 226–250). New
York: Cambridge University Press.
Blader, S. L., & Tyler, T. R. (2015). Relational models of procedural
justice. In R. S. Cropanzano & M. Ambrose (Eds.), The Oxford
handbook of justice in work organizations (pp. 351–370).
Oxford: Oxford University Press.
Blair, R. J. R. (2005). Responding to the emotions of others:
Dissociating forms of empathy through the study of typical and
psychiatric populations. Consciousness and Cognition, 14,
698–718.
Blair, R. J. R. (2007). The amygdala and ventromedial prefrontal
cortex in morality and psychopathy. Trends in Cognitive
Sciences, 11, 387–392.
Bloom, P. (2013). Just babies: The origin of good and evil. New
York: Crowne Publishing.
Brüne, M., & Brüne-Cohrs, U. (2006). Theory of mind—evolution,
ontogeny, brain mechanisms and psychopathology. Neuro-
science and Biobehavioral Reviews, 30, 437–455.
Bush, G., Luu, P., & Posner, M. I. (2000). Cognitive and emotional
influences in anterior cingulate cortex. Trends in Cognitive
Sciences, 4, 215–222.
Carrington, S. J., & Bailey, A. J. (2009). Are there theory of mind
regions in the brain? A review of the neuroimaging literature.
Human Brain Mapping, 30, 2313–2335.
Casebeer, W. D. (2003). Moral cognition and its neural constituents.
Nature Reviews Neuroscience, 4, 840–847.
Cauda, F., D’Agata, F., Sacco, K., Duca, S., Geminiani, G., &
Vercelli, A. (2011). Functional connectivity of the insula in the
resting brain. Neuroimage, 55, 8–23.
Cauda, F., Geminiani, G., D’Agata, F., Sacco, K., Duca, S., Bagshaw,
A. P., & Cavanna, A. E. (2010). Functional connectivity of the
posteromedial cortex. PLoS ONE, 5, e13107.
Cavanna, A. E., & Trimble, M. R. (2006). The precuneus: a review of
its functional anatomy and behavioural correlates. Brain, 129,
564–583.
Christoff, K., & Gabrieli, J. D. (2000). The frontopolar cortex and
human cognition: Evidence for a rostrocaudal hierarchical
organization within the human prefrontal cortex. Psychobiology,
28, 168–186.
Churchland, P. S. (2011). Braintrust: What neuroscience tells us
about morality. Princeton, NJ: Princeton University Press.
Ciaramelli, E., Muccioli, M., Làdavas, E., & di Pellegrino, G. (2007).
Selective deficit in personal moral judgment following damage
to ventromedial prefrontal cortex. Social Cognitive & Affective
Neuroscience, 9, 84–92.
Clayton, S., & Opotow, S. (2003). Justice and identity: Changing
perspectives on what is fair. Personality and Social Psychology
Review, 7, 298–310.
Cohen, D., & Strayer, J. (1996). Empathy in conduct-disordered and
comparison youth. Developmental Psychology, 32, 988–998.
Cohen-Charash, Y., & Spector, P. E. (2001). The role of justice in
organizations: A meta-analysis. Organizational Behavior and
Human Decision Processes, 86, 278–321.
Colquitt, J. A. (2001). On the dimensionality of organizational justice:
A construct validation of a measure. Journal of Applied
Psychology, 86, 386–400.
Colquitt, J. A., Conlon, D. E., Wesson, M. J., Porter, C. O., & Ng, K.
Y. (2001). Justice at the millennium: A meta-analytic review of
25 years of organizational justice research. Journal of Applied
Psychology, 86, 425–445.
Colquitt, J. A., & Greenberg, J. (2001). Doing justice to organiza-
tional justice: Forming and applying fairness judgments. In S.
Gilliland, D. Steiner, & D. Skarlicki (Eds.), Theoretical and
cultural perspectives on organizational justice (pp. 217–242).
Greenwich, CT: JAI.
Colquitt, J. A., & Rodell, J. B. (2015). Measuring justice and fairness.
In R. Cropanzano & M. A. Ambrose (Eds.), Oxford handbook of
justice in work organizations (pp. 187–202). Oxford: Oxford
University Press.
Colquitt, J. A., Scott, B. A., Judge, T. A., & Shaw, J. C. (2006).
Justice and personality: Using integrative theories to derive
moderators of justice effects. Organizational Behavior and
Human Decision Processes, 100, 110–127.
Craig, A. D. (2009). How do you feel—now? The anterior insula and
human awareness. Nature Reviews Neuroscience, 10, 59–70.
Critchley, H. D., Wiens, S., Rotshtein, P., Öhman, A., & Dolan, R. J.
(2004). Neural systems supporting interoceptive awareness.
Nature Neuroscience, 7, 189–195.
Cropanzano, R., Byrne, Z. S., Bobocel, D. R., & Rupp, D. R. (2001).
Moral virtues, fairness heuristics, social entities, and other
denizens of organizational justice. Journal of Vocational
Behavior, 58, 164–209.
Cropanzano, R., Fortin, M., & Kirk, J. (2015). How do we know when
we are treated fairly? Justice rules and fairness judgments. In J.
R. B. Halbeslesben, A. Wheeler, & M. R. Buckley (Eds.),
Research in personnel and human resources management (Vol.
33, pp. 279–350). Amsterdam: Elsevier.
Cropanzano, R., Goldman, B., & Folger, R. (2003). Deontic justice:
The role of moral principles in workplace fairness. Journal of
Organizational Behavior, 24, 1019–1024.
Cropanzano, R., & Moliner, C. (2013). Hazards of justice: Egocentric
bias, moral judgments, and revenge-seeking. In S. M. Elias (Ed.),
Deviant and criminal behavior in the workplace (pp. 155–177).
New York: New York University Press.
Cropanzano, R., Stein, J., & Goldman, B. M. (2007). Self-interest. In
E. H. Kessler & J. R. Bailey (Eds.), Handbook of organizational
and managerial wisdom (pp. 181–221). Los Angeles, CA: Sage
Publications.
Cropanzano, R., & Wright, T. A. (2011). The impact of organizational
justice on occupational health. In J. C. Quick & L. E. Tetrick
(Eds.), Handbook of occupational health psychology (pp.
205–219). Washington, DC: American Psychological Association.
Cugueró-Escofet, N., & Fortin, M. (2014). One justice or two? A
model of reconciliation of normative justice theories and
empirical research on organizational justice. Journal of Business
Ethics, 124, 435–451.
Cugueró-Escofet, N., & Rosanas, J. M. (2013). The just design and
use of management control systems as requirements for goal
congruence. Management Accounting Review, 24, 23–40.
DaGloria, J., & DeRidder, R. (1977). Aggression in dyadic interac-
tion. European Journal of Social Psychology, 7, 189–219.
DaGloria, J., & DeRidder, R. (1979). Sex differences in aggression:
Are current notions misleading? European Journal of Social
Psychology, 9, 49–66.
Damasio, A. R. (1996). The somatic marker hypothesis and the
possible functions of the prefrontal cortex. Philosophical
Transactions of the Royal Society of London. Series B, 351,
1413–1420.
Damasio, A. R., Tranel, D., & Damasio, H. (1990). Individuals with
sociopathic behavior caused by frontal damage fail to respond
autonomically to social stimuli. Behavioral Brain Research, 41,
81–94.
D’Arms, J. (1998). Empathy and evaluative inquiry. Chicago-Kent
Law Review, 74, 4.
Davis, M. H. (1994). Empathy: A social psychological approach.
Boulder, CO: Westview Press.
De Waal, F. B. (2008). Putting the altruism back into altruism: The
evolution of empathy. Annual Review of Psychology, 59, 279–300.
Decety, J., & Jackson, P. L. (2006). A social-neuroscience perspective
on empathy. Current Directions in Psychological Science, 15,
54–58.
Decety, J., & Lamm, C. (2006). Human empathy through the lens of
social neuroscience. The Scientific World Journal, 6, 1146–1163.
Decety, J., & Lamm, C. (2007). The role of the right temporoparietal
junction in social interaction: how low-level computational
processes contribute to meta-cognition. The Neuroscientist, 13,
580–593.
DeLisi, M., Umphress, Z. R., & Vaughn, M. G. (2009). The
criminology of the amygdala. Criminal Justice and Behavior, 36,
1241–1252.
DeRidder, R. (1985). Normative considerations in the labeling of
harmful behavior as aggressive. Journal of Social Psychology,
125, 659–666.
Deutsch, M. (1975). Equity, equality, and need: What determines
which value will be used as the basis of distributive justice?
Journal of Social Issues, 31, 137–149.
Deutsch, M. (1985). Distributive justice. New Haven, CT: Yale
University Press.
Devinsky, O., Morrell, M. J., & Vogt, B. A. (1995). Contributions of
anterior cingulate cortex to behaviour. Brain, 118, 279–306.
Dulebohn, J. H., Conlon, D. E., Sarinopoulos, I., Davison, R. B., &
McNamara, G. (2009). The biological bases of unfairness:
Neuroimaging evidence for the distinctiveness of procedural and
distributive justice. Organizational Behavior and Human Deci-
sion Processes, 110, 140–151.
Eisenberg, N., & Fabes, R. A. (1990). Empathy: Conceptualization,
measurement, and relation to prosocial behavior. Motivation and
Emotion, 14, 131–149.
Fan, Y., Duncan, N. W., de Greck, M., & Northoff, G. (2011). Is there
a core neural network in empathy? An fMRI based quantitative
meta-analysis. Neuroscience and Biobehavioral Reviews, 35,
903–911.
Fanselow, M. S., & Gale, G. D. (2003). The amygdala, fear, and
memory. Annals of the New York Academy of Sciences, 985,
125–134.
Fassina, N. E., Jones, D. A., & Uggerslev, K. L. (2008). Meta-analytic
tests of relationships between organizational justice and citizen-
ship behavior: Testing agent-system and shared-variance mod-
els. Journal of Organizational Behavior, 29, 805–828.
Fehr, E., & Fischbacher, U. (2004). Third party punishment and social
norms. Evolution and Human Behavior, 25, 63–87.
Fehr, E., Fischbacher, U., & Gächter, S. (2002). Strong reciprocity,
human cooperation and the enforcement of social norms. Human
Nature, 13, 1–25.
Fehr, E., & Gächter, S. (2000). Cooperation and punishment in public
goods experiments. American Economic Review, 90, 980–994.
Fehr, E., & Gächter, S. (2002). Altruistic punishment in humans.
Nature, 415, 137–140.
Folger, R. (2001). Fairness as deonance. In S. W. Gilliland, D.
D. Steiner, & D. P. Skarlicki (Eds.), Research in social issues in
management: Theoretical and cultural perspectives on organi-
zational justice (Vol. 1, pp. 3–31). Charlotte: Information Age.
Folger, R. (2011). Deonance: Behavioral ethics and moral obliga-
tion. In D. DeCremer & A. E. Tenbrunsel (Eds.), Series in
organization and management: Behavioral business ethics:
shaping an emerging field (pp. 123–142). New York: Routledge.
Folger, R., & Butz, R. (2004). Relational models, ‘‘deonance’’, and
moral antipathy toward the powerfully unjust. In N. Haslam
(Ed.), Relational models theory: A contemporary overview (pp.
217–242). Mahwah, NJ: Lawrence Erlbaum.
Folger, R., & Cropanzano, R. (1998). Organizational justice and
human resource management. Beverly Hills, CA: Sage.
Folger, R., & Cropanzano, R. (2001). Fairness theory: Justice as
accountability. In J. Greenberg & R. Cropanzano (Eds.),
Advances in organizational justice (pp. 1–55). Stanford, CA:
Stanford University Press.
Folger, R., Cropanzano, R., & Goldman, B. (2005). Justice, account-
ability, and moral sentiment: The deontic response to ‘‘foul
play’’ at work. In J. Greenberg & J. Colquitt (Eds.), Handbook of
organizational justice (pp. 215–245). Mahwah, NJ: Erlbaum.
Folger, R., Ganegoda, D. B., Rice, D. B., Taylor, R., & Wo, D. X.
(2013). Bounded autonomy and behavioral ethics: Deonance and
reactance as competing motives. Human Relations, 66, 905–924.
Folger, R., & Glerum, D. R. (2015). Justice and deonance: ‘‘You
ought to be fair’’. In R. Cropanzano & M. A. Ambrose (Eds.),
Oxford handbook of justice in work organizations (pp. 331–350).
Oxford: Oxford University Press.
Folger, R., & Salvador, R. (2008). Is management theory too ‘‘self-
ish’’? Journal of Management, 34, 1127–1151.
Fortin, M., & Fellenz, M. R. (2008). Hypocrisies of fairness: Towards
a more reflexive ethical base in organizational justice research
and practice. Journal of Business Ethics, 78, 415–433.
Fransson, P. (2005). Spontaneous low-frequency BOLD signal
fluctuations: an fMRI investigation of the resting-state default
mode of brain function hypothesis. Human Brain Mapping, 26,
15–29.
Friedrichs, D. O. (2010). Trusted criminals: White collar crime in
contemporary society. Belmont, CA: Wadsworth.
Frith, C. D., & Singer, T. (2008). The role of social cognition in
decision making. Philosophical Transactions of the Royal
Society B: Biological Sciences, 363, 3875–3886.
Fuster, J. M. (2001). The prefrontal cortex—An update: Time is of the
essence. Neuron, 30, 319–333.
Gazzaniga, M. S. (2008). Human: The science behind what makes us
unique. New York: Ecco.
Gillespie, J. Z., & Greenberg, J. (2005). Are the goals of organiza-
tional justice self-interested? In J. Greenberg & J. A. Colquitt
(Eds.), Handbook of organizational justice (pp. 179–213).
Mahwah, NJ: Lawrence Erlbaum.
Goldman-Rakic, P. S. (1987). Circuitry of primate prefrontal cortex
and regulation of behavior by representational memory. In F.
Plum & V. B. Mountcastle (Eds.), Handbook of physiology:
Section 1. The nervous system: Vol. 5. Higher functions of the
brain (pp. 373–417). Bethesda, MD: American Physiological
Society.
Gray, J. R., Braver, T. S., & Raichle, M. E. (2002). Integration of
emotion and cognition in the lateral prefrontal cortex. Proceed-
ings of the National Academy of Sciences, 99, 4115–4120.
Greenberg, J. (1990). Organizational justice: Yesterday, today, and
tomorrow. Journal of Management, 16, 399–432.
Greenberg, J. (2001). Setting the justice agenda: Seven unanswered
questions about ‘‘what, why, and how.’’. Journal of Vocational
Behavior, 58, 210–219.
Greene, J. D. (2013). Moral tribes: Emotion, reason, and the gap
between us and them. New York: Penguin.
Greene, J. D., & Haidt, J. (2002). How (and where) does moral
judgment work? Trends in Cognitive Science, 6, 517–523.
Greene, J. D., Nystrom, L. E., Engell, A. D., Darley, J. M., & Cohen,
J. D. (2004). The neural bases of cognitive conflict and control in
moral judgment. Neuron, 44, 389–400.
Greene, J. D., Sommerville, R. B., Nystrom, L. E., Darley, J. M., &
Cohen, J. D. (2001). An fMRI investigation of emotional
engagement in moral judgment. Science, 293, 2105–2108.
Haidt, J. (2003). The moral emotions. In R. J. Davidson, K.
R. Scherer, & H. H. Goldsmith (Eds.), Handbook of affective
sciences (pp. 852–870). Oxford: Oxford University Press.
Haidt, J. (2006). The happiness hypothesis: Finding modern trust in
ancient wisdom. New York: Basic Books.
Haidt, J., & Joseph, C. (2007). The moral mind: How five sets of
innate intuitions guide the development of many culture-specific
virtues, and perhaps even modules. In P. Carruthers, S. Lau-
rence, & S. Stich (Eds.), The innate mind (Vol. 3, pp. 367–392).
New York: Oxford University Press.
Hamlin, J. K., & Wynn, K. (2011). Five- and 9-month olds prefer pro-
social to antisocial others. Cognitive Development, 26, 30–39.
Hamlin, J. K., Wynn, K., & Bloom, P. (2010). 3-month olds show a
negativity bias in social evaluation. Developmental Science, 13,
923–939.
Haney, C., Banks, W. C., & Zimbardo, P. G. (1973). Study of
prisoners and guards in a simulated prison. Naval Research
Reviews, 9, 1–17.
Hannah, S. T., Jennings, P. L., Bluhm, D., Peng, A. C., &
Schaubroeck, J. M. (2014). Duty orientation: Theoretical devel-
opment and preliminary construct testing. Organizational
Behavior and Human Decision Processes, 123, 220–238.
Harmon-Jones, E., Gable, P. A., & Peterson, C. K. (2010). The role of
asymmetric frontal cortical activity in emotion-related phenom-
ena: A review and update. Biological Psychology, 84, 451–462.
Hatfield, E., Walster, G. W., & Piliavvin, J. A. (1978). Equity theory
and helping relationships. In L. Wispe (Ed.), Altruism, sympathy,
and helping: Psychological and sociological perspectives (pp.
115–139). New York: Academic Press.
Hein, G., Silani, G., Preuschoff, K., Batson, C. D., & Singer, T.
(2010). Neural responses to ingroup and outgroup members’
suffering predict individual differences in costly helping.
Neuron, 68, 149–160.
Hewitt, L. S. (1975). The effects of provocation, intentions and
consequences on children’s moral judgments. Child Develop-
ment, 46, 540–544.
Hoffman, M. L. (1994). The contribution of empathy to justice and
moral judgment. In B. Puka (Ed.), Reaching out: Caring,
altruism, and prosocial behavior (Vol. 7, pp. 161–195). New
York: Garland.
Hoffman, M. L. (2000). Empathy and moral development. Cam-
bridge: Cambridge University Press.
Hogan, R. (1969). Development of an empathy scale. Journal of
Consulting and Clinical Psychology, 33, 307–316.
Hollensbe, E. C., Khazanchi, S., & Masterson, S. S. (2008). How do I
assess if my supervisor and organization are fair? Identifying the
rules underlying the entity-based justice perceptions. Academy of
Management Journal, 51, 1099–1116.
Jacobs, J., Hwang, G., Curran, T., & Kahana, M. J. (2006). EEG
oscillations and recognition memory: theta correlates of memory
retrieval and decision making. Neuroimage, 32, 978–987.
Judge, T. A., Scott, B. A., & Ilies, R. (2006). Hostility, job attitudes,
and workplace deviance: Test of a multilevel model. Journal of
Applied Psychology, 91, 126–138.
Karakas, F., & Sarigollu, E. (2013). The role of leadership in creating
virtuous and compassionate organizations: Narratives of benev-
olent leadership in an Anatolian tiger. Journal of Business
Ethics, 113(4), 663–678.
Karniol, R. (1978). Children’s use of intention cues in evaluating
behavior. Psychological Bulletin, 85, 76–85.
Keysers, C. (2009). Mirror neurons. Current Biology, 19, R971–
R973.
Kiehl, K. A. (2006). A cognitive neuroscience perspective on
psychopathy: evidence for paralimbic system dysfunction.
Psychiatry Research, 142, 107–128.
Kilner, J. M., Neal, A., Weiskopf, N., Friston, K. J., & Frith, C. D.
(2009). Evidence of mirror neurons in human inferior frontal
gyrus. Journal of Neuroscience, 29, 10153–10159.
Knoch, D., Pascual-Leone, A., Meyer, K., Treyer, V., & Fehr, E.
(2006). Diminishing reciprocal fairness by disrupting the right
prefrontal cortex. Science, 314(5800), 829–832.
Knutson, B., & Greer, S. M. (2008). Anticipatory affect: neural
correlates and consequences for choice. Philosophical Transac-
tions of the Royal Society B: Biological Sciences, 363, 3771–3786.
Knyazev, G. G., & Slobodskaya, H. R. (2003). Personality trait of
behavioral inhibition is associated with oscillatory systems
reciprocal relationships. International Journal of Psychophysi-
ology, 48, 247–261.
Koenigs, M., Young, L., Adolphs, R., Tranel, D., Cushman, F.,
Hauser, M., & Damasio, A. (2007). Damage to the prefrontal
cortex increases utilitarian moral judgments. Nature, 446,
908–911.
Kosfeld, M., Heinrichs, M., Zak, P. J., Fischbacher, U., & Fehr, E.
(2005). Oxytocin increases trust in humans. Nature, 435(7042),
673–676.
Krolak-Salmon, P., Henaff, M. A., Isnard, J., Tallon-Baudry, C.,
Guenot, M., Vighetto, A., et al. (2003). An attention modulated
response to disgust in human ventral anterior insula. Annual
Review of Neuroscience, 53, 446–453.
Lamm, C., Batson, C. D., & Decety, J. (2007). The neural substrate of
human empathy: effects of perspective-taking and cognitive
appraisal. Journal of Cognitive Neuroscience, 19, 42–58.
LaMonica, E. L., Carew, D. K., Winder, A. E., Haase, A. M. B., &
Blanchard, K. (1976). Empathy training as a major thrust of a
staff development program. Nursing Research, 25, 403–434.
Lau, V., & Wong, Y. (2009). Direct and multiplicative effects of
ethical dispositions and ethical climates on personal justice
norms: A virtue ethics perspective. Journal of Business Ethics,
90, 279–294.
Lazarus, R. S. (1984). On the primacy of cognition. American
Psychologist, 39, 124–129.
Lee, S. M., Gao, T., & McCarthy, G. (2014). Attributing intentions to
random motion engages the posterior superior temporal sulcus.
Social Cognitive and Affective Neuroscience, 9, 81–87.
Lerner, M. J., & Goldberg, J. H. (1999). When do decent people
blame victims? In S. Chaiken & Y. Trope (Eds.), Dual process
theories in social psychology (pp. 627–640). New York:
Guilford.
Leventhal, G. S. (1980). What should be done with equity theory?:
New approaches to the study of fairness in social relationships.
In K. J. Gergen, M. S. Greenberg, & R. H. Willis (Eds.), Social
exchange: Advances in theory and practice (pp. 27–55). New
York: Plenum Press.
Lind, E. A., & Tyler, T. R. (1988). The social psychology of
procedural justice. New York: Plenum Press.
Ludwig, D. C., & Longenecker, C. O. (1993). The Bathsheba
syndrome: The ethical failure of successful leaders. Journal of
Business Ethics, 12, 265–273.
Luo, Q., Nakic, M., Wheatley, T., Richell, R., Martin, A., & Blair, R.
J. R. (2006). The neural basis of implicit moral attitude—an IAT
study using event-related fMRI. Neuroimage, 30, 1449–1457.
Lyons, D. E., Santos, L. R., & Keil, F. C. (2006). Reflections of other
minds: how primate social cognition can inform the function of
mirror neurons. Current Opinion in Neurobiology, 16, 230–234.
Maddock, R. J., Garrett, A. S., & Bounocore, M. H. (2003). Posterior
cingulate cortex activation by emotional words: fMRI evidence
from a valance decision task. Human Brain Mapping, 18, 30–41.
Majdandžić, J., Bauer, H., Windischberger, C., Moser, E., Engl, E., &
Lamm, C. (2012). The human factor: Behavioral and neural
correlates of humanized perception in moral decision making.
PLoS ONE, 7, e47698.
Massaro, S. (2015). Neuroscientific Methods Applications in Strategic
Management. In G. Dagnino & C. Cinci (Eds.), Strategic
management: A research method handbook (pp. 253–282). New
York: Routledge.
Massaro, S., & Becker, W. J. (2015). Organizational Justice through
the Window of Neuroscience. In D. A. Waldman, & P.
A. Balthazar (Eds.), Organizational Neuroscience (Monographs
in Leadership and Management, Volume 7) (pp. 257–276).
Bradford, UK: Emerald Group Publishing Limited.
Masten, C. L., Morelli, S. A., & Eisenberger, N. I. (2011). An fMRI
investigation of empathy for ‘social pain’ and subsequent
prosocial behavior. Neuroimage, 55, 381–388.
Mastrovito, D. (2013). Interactions between resting-state and task-
evoked brain activity suggest a different approach to fMRI
analysis. The Journal of Neuroscience, 33, 12912–12914.
Mazoyer, B., Zago, L., Mellet, E., Bricogne, S., Etard, O., Houde, O.,
& Tzourio-Mazoyer, N. (2001). Cortical networks for working
memory and executive functions sustain the conscious resting
state in man. Brain Research Bulletin, 54, 287–298.
Mesulam, M., & Mufson, E. J. (1982). Insula of the old world
monkey. Architectonics in the insulo-orbito-temporal component
of the paralimbic brain. Journal of Comparative Neurology, 212,
1–22.
Miller, E. K., & Cohen, J. D. (2001). An integrative theory of
prefrontal cortex function. Annual Review of Neuroscience, 24,
167–202.
Miller, D. T., & McCann, C. D. (1979). Children’s reactions to the
perpetrators and victims of injustices. Child Development, 50,
861–868.
Moll, J., & de Oliveira-Souza, R. (2007). Moral judgments, emotions
and the utilitarian brain. Trends in cognitive sciences, 11,
319–321.
Moll, J., de Oliveira-Souza, R., Bramati, I. E., & Grafman, J. (2002).
Functional networks in emotional moral and nonmoral social
judgments. Neuroimage, 16, 696–703.
Moll, J., Zahn, R., de Oliveira-Souza, R., Krueger, F., & Grafman, J.
(2005). The neural basis of human moral cognition. Nature
Reviews Neuroscience, 6, 799–809.
Monin, P., Noorderhaven, N., Vaara, E., & Kroon, D. (2013). Giving
sense to and making sense of justice in post-merger integration.
Academy of Management Journal, 56, 256–284.
Morrison, I., Lloyd, D., di Pellegrino, G., & Roberts, N. (2004).
Vicarious responses to pain in anterior cingulate cortex: Is
empathy a multisensory issue? Cognitive and Affective Behav-
ioral Neuroscience, 4, 270–278.
Nicklin, J. M., Greenbaum, R., McNail, L. A., Folger, R., & Williams,
J. K. (2011). The importance of contextual variables when
judging fairness: An examination of counterfactual thoughts and
fairness theory. Organizational Behavior and Human Decision
Processes, 114, 127–141.
Nucci, L. P., & Nucci, M. S. (1982). Children’s responses to moral
and social-conventional transgressions in free-play settings.
Child Development, 53, 1337–1342.
Nummenmaa, L., Hirvonen, J., Parkkola, R., & Hietanen, J. K.
(2008). Is emotional contagion special? An fMRI study on neural
systems for affective and cognitive empathy. Neuroimage, 43,
571–580.
O’Reilly, J., & Aquino, K. (2011). A model of third parties’ morally
motivated responses to mistreatment in organizations. Academy
of Management Review, 36, 526–543.
O’Reilly, J., Aquino, K., & Skarlicki, D. (2016). The lives of others:
Third parties’ responses to others’ injustice. Journal of Applied
Psychology, 101, 171–189.
Ochsner, K. N., Ray, R. R., Hughes, B., McRae, K., Cooper, J. C.,
Weber, J., … & Gross, J. J. (2009). Bottom-up and top-down processes in emotion generation: Common and distinct neural mechanisms. Psychological Science, 20, 1322–1331.
O’Shea, J., & Walsh, V. (2007). Transcranial magnetic stimulation.
Current Biology, 17(6), R196–R199.
Patient, D. L., & Skarlicki, D. P. (2005). Why managers don’t always
do the right thing when delivering bad news: The roles of
empathy, self-esteem, and moral development in interactional
fairness. In S. W. Gilliland, D. D. Steiner, D. P. Skarlicki, & K.
van den Bos (Eds.), What motivates fairness in organizations?
Greenwich, CT: JAI Press.
Patient, D. L., & Skarlicki, D. P. (2010). Increasing interpersonal and
informational justice when communicating negative news: The
role of the manager’s empathic concern and moral development.
Journal of Management, 36, 555–578.
Pecukonis, E. V. (1990). A cognitive/affective empathy training program as a function of ego development. Adolescence, 25, 59–76.
Pelphrey, K. A., Morris, J. P., & McCarthy, G. (2004). Grasping the
intentions of others: The perceived intentionality of an action
influences activity in the superior temporal sulcus during social
perception. Journal of Cognitive Neuroscience, 16, 1706–1716.
Pessoa, L. (2014). Précis of the cognitive-emotional brain. Behavioral
and Brain Sciences, 18, 1–66.
Pfaff, D. W. (2007). The neuroscience of fair play: Why we (usually)
follow the golden rule. New York: Dana Press.
Pfurtscheller, G., & Da Silva, F. L. (1999). Event-related EEG/MEG
synchronization and desynchronization: basic principles. Clini-
cal Neurophysiology, 110, 1842–1857.
Phan, K. L., Wager, T., Taylor, S. F., & Liberzon, I. (2002).
Functional neuroanatomy of emotion: a meta-analysis of emo-
tion activation studies in PET and fMRI. Neuroimage, 16,
331–348.
Phillips, R. G., & LeDoux, J. E. (1992). Differential contribution of
amygdala and hippocampus to cued and contextual fear condi-
tioning. Behavioral Neuroscience, 106, 274.
Preston, S. D., & De Waal, F. (2002). Empathy: Its ultimate and
proximate bases. Behavioral and Brain Sciences, 25, 1–20.
Raichle, M. E., MacLeod, A. M., Snyder, A. Z., Powers, W. J.,
Gusnard, D. A., & Shulman, G. L. (2001). A default mode of
brain function. Proceedings of the National Academy of
Sciences, 98, 676–682.
Reb, J., Goldman, B. M., Kray, L. J., & Cropanzano, R. (2006).
Different wrongs, different remedies? Reactions to organiza-
tional remedies after procedural and interactional injustice.
Personnel Psychology, 59, 31–64.
Riess, H., Kelly, J. M., Bailey, R. W., Dunn, E. J., & Phillips, M.
(2012). Empathy training for resident physicians: A randomized
controlled trial of a neuroscience-informed curriculum. Journal
of General Internal Medicine, 27, 1280–1286.
Rizzolatti, G., & Craighero, L. (2004). The mirror-neuron system.
Annual Review of Neuroscience, 27, 169–192.
Robertson, D., Snarey, J., Ousley, O., Harenski, K., Bowman, F. D.,
Gilkey, R., & Kilts, C. (2007). The neural processing of moral
sensitivity to issues of justice and care. Neuropsychologia, 45,
755–766.
Rolls, E. T. (1996). The orbitofrontal cortex. Philosophical Transac-
tions of the Royal Society of London. Series B, 351, 1433–1443.
Rolls, E. T., & Grabenhorst, F. (2008). The orbitofrontal cortex and
beyond: from affect to decision-making. Progress in Neurobi-
ology, 86, 216–244.
Rupp, D. E., & Paddock, E. L. (2010). From justice events to justice
climate: A multilevel temporal model of information aggregation
and judgment. Research on Managing Groups and Teams, 13,
239–267.
Salvador, R., & Folger, R. G. (2009). Business ethics and the brain.
Business Ethics Quarterly, 19, 1–31.
Sanfey, A. G., Rilling, J. K., Aronson, J. A., Nystrom, L. E., & Cohen,
J. D. (2003). The neural basis of economic decision-making in
the ultimatum game. Science, 300, 1755–1758.
Saxe, R. (2006). Uniquely human social cognition. Current Opinion
in Neurobiology, 16, 235–239.
Saxe, R., & Wexler, A. (2005). Making sense of another mind: The
role of the right temporo-parietal junction. Neuropsychologia,
43, 1391–1399.
Saxe, R., Whitfield-Gabrieli, S., Scholz, J., & Pelphrey, K. A. (2009).
Brain regions for perceiving and reasoning about other people in school-aged children. Child Development, 80, 1197–1209.
Schnall, S., Benton, J., & Harvey, S. (2008). With a clean conscience:
Cleanliness reduces the severity of moral judgments. Psycho-
logical Science, 19, 1219–1222.
Schnall, S., Haidt, J., Clore, G. L., & Jordan, A. H. (2009). Disgust as
embodied moral judgment. Personality and Social Psychology
Bulletin, 34, 1096–1109.
Scott, B. A., Colquitt, J. A., & Paddock, E. L. (2008). An actor-
focused model of justice rule adherence and violation: The role
of managerial motives and discretion. Journal of Applied
Psychology, 94, 756–769.
Shamay-Tsoory, S. G. (2011). The neural bases for empathy. The
Neuroscientist, 17, 18–24.
Shamay-Tsoory, S. G., & Aharon-Peretz, J. (2007). Dissociable
prefrontal networks for cognitive and affective theory of mind: A
lesion study. Neuropsychologia, 45, 3054–3067.
Shamay-Tsoory, S. G., Aharon-Peretz, J., & Perry, D. (2009). Two
systems for empathy: a double dissociation between emotional
and cognitive empathy in inferior frontal gyrus versus ventro-
medial prefrontal lesions. Brain, 132, 617–627.
Shamay-Tsoory, S. G., Harari, H., Aharon-Peretz, J., & Levkovitz, Y.
(2010). The role of the orbitofrontal cortex in affective theory of
mind deficits in criminal offenders with psychopathic tendencies.
Cortex, 46, 668–677.
Shamay-Tsoory, S. G., Tomer, R., Berger, B. D., & Aharon-Peretz, J.
(2003). Characterization of empathy deficits following prefrontal
brain damage: the role of the right ventromedial prefrontal
cortex. Journal of Cognitive Neuroscience, 15, 324–337.
Shao, R., Aquino, K., & Freeman, D. (2008). Beyond moral
reasoning: A review of moral identity research and its implica-
tions for business ethics. Business Ethics Quarterly, 18,
513–540.
Shenhav, A., & Greene, J. D. (2010). Moral judgments recruit
domain-general valuation mechanisms to integrate representa-
tions of probability and magnitude. Neuron, 67, 667–677.
Shulman, G. L., Fiez, J. A., Corbetta, M., Buckner, R. L., Miezin, F.
M., Raichle, M. E., & Petersen, S. E. (1997). Common blood
flow changes across visual tasks: II. Decreases in cerebral cortex.
Journal of Cognitive Neuroscience, 9, 648–663.
Singer, T. (2006). The neuronal basis and ontogeny of empathy and
mind reading: Review of literature and implications for future
research. Neuroscience and Biobehavioral Reviews, 30, 855–863.
Singer, T., & Lamm, C. (2009). The social neuroscience of empathy.
Annals of the New York Academy of Sciences, 1156, 81–96.
Singer, T., Seymour, B., O’Doherty, J., Kaube, H., Dolan, R. J., &
Frith, C. D. (2004). Empathy for pain involves the affective but
not sensory components of pain. Science, 303, 1157–1162.
Singer, T., Seymour, B., O’Doherty, J. P., Stephan, K. E., Dolan, R.
J., & Frith, C. D. (2006). Empathic neural responses are
modulated by the perceived fairness of others. Nature, 439,
466–469.
Skarlicki, D. P., Ellard, J. H., & Kelln, B. R. C. (1998). Third party
perceptions of a layoff: Procedural, derogation, and retributive
aspects of justice. Journal of Applied Psychology, 83, 119–127.
Skarlicki, D. P., & Folger, R. (2004). Broadening our understanding
of organizational retaliatory behaviors. In R. W. Griffin & A.
M. O’Leary-Kelly (Eds.), The dark side of organizational
behavior (pp. 373–402). New York: Jossey-Bass.
Skarlicki, D. P., Hoegg, J., Aquino, K., & Nadisic, T. (2013). Does
injustice affect your taste and smell? The mediating role of moral
disgust. Journal of Experimental Social Psychology, 49,
852–859.
Skarlicki, D. P., & Kulik, C. (2005). Third party reactions to
employee mistreatment: A justice perspective. In B. Staw & R.
Kramer (Eds.), Research in organizational behavior (Vol. 26,
pp. 183–230). Greenwich, CT: JAI Press.
Skarlicki, D. P., O’Reilly, J., & Kulik, C. T. (2015). The third party
perspective on (in)justice. In R. Cropanzano & M. A. Ambrose
(Eds.), Oxford handbook of justice in work organizations (pp.
235–255). Oxford: Oxford University Press.
Skarlicki, D. P., & Rupp, D. E. (2010). Dual processing and
organizational justice: The role of rational versus experiential
processing in third-party reactions to workplace mistreatment.
Journal of Applied Psychology, 95, 944–952.
Skarlicki, D. P., van Jaarsveld, D. D., & Walker, D. D. (2008).
Getting even for customer mistreatment: The role of moral
identity in the relationship between customer interpersonal
justice and employee sabotage. Journal of Applied Psychology,
93, 1335–1347.
Smetana, J. G. (1981). Preschool children’s conceptions of moral and
social rules. Child Development, 52, 1333–1336.
Smetana, J. G. (1984). Toddlers’ social interactions regarding moral
and conventional transgressions. Child Development, 55,
1767–1776.
Smetana, J. G. (1985). Preschool children’s conceptions of trans-
gressions: The effects of varying moral and conventional
domain-related attributes. Developmental Psychology, 21,
18–29.
Smetana, J. G. (1989). Toddlers’ social interactions in the context of
moral and conventional transgressions in the home. Develop-
mental Psychology, 25, 499–508.
Smetana, J. G., Schlagman, N., & Adams, P. W. (1993). Preschool
children’s judgments about hypothetical and actual transgres-
sions. Child Development, 64, 202–214.
Stikic, M., Berka, C., Waldman, D., Balthazard, P., Pless, N., &
Maak, T. (2013). Neurophysiological estimation of team
psychological metrics. In Foundations of augmented cognition
(pp. 209–218). Berlin: Springer.
Suhler, C. L., & Churchland, P. (2011). Can innate, modular
‘‘foundations’’ explain morality? Challenges for Haidt’s moral
foundations theory. Journal of Cognitive Neuroscience, 23,
2103–2116.
Tancredi, L. (2005). Hardwired behavior: What neuroscience reveals
about morality. Cambridge: Cambridge University Press.
Terwogt, M. M., & Rieffe, C. (2003). Stereotyped beliefs about
desirability: implications for characterizing the child’s theory of
mind. New Ideas in Psychology, 21, 69–84.
Thomson, J. J. (1986). Rights, restitution and risk (pp. 94–116).
Cambridge, MA: Harvard University Press.
Turillo, C. J., Folger, R., Lavelle, J. J., Umphress, E., & Gee, J.
(2002). Is virtue its own reward? Self-sacrificial decisions for the
sake of fairness. Organizational Behavior and Human Decision
Processes, 89, 839–865.
Tversky, A., & Kahneman, D. (1986). Rational choice and the
framing of decisions. Journal of Business, 59, S251–S278.
Tyler, T. R. (1997). The psychology of legitimacy: A relational
perspective on voluntary deference to authorities. Personality
and Social Psychology Review, 1, 323–345.
Tyler, T. R. (2006). Why people obey the law. Princeton, NJ:
Princeton University Press.
Tyler, T. R., & Blader, S. (2000). Cooperation in groups: Procedural
justice, social identity, and behavioral engagement. Philadel-
phia, PA: Psychology Press.
Tyler, T. R., Boeckmann, R. J., Smith, H. J., & Huo, Y. J. (1997).
Social justice in a diverse society. Boulder, CO: Westview.
Uddin, L. Q., Clare Kelly, A. M., Biswal, B. B., Xavier Castellanos,
F., & Milham, M. P. (2009). Functional connectivity of default
mode network components: correlation, anticorrelation, and
causality. Human Brain Mapping, 30, 625–637.
Umphress, E. E., Campbell, J. T., & Bingham, J. B. (2010). Paved
with good intentions: Unethical behavior conducted to benefit
the organization, coworkers, and customers. In M. Schminke
(Ed.), Managerial ethics: Managing the psychology of morality
(pp. 127–152). New York: Routledge.
Umphress, E. E., Simmons, A. L., Folger, R., Ren, R., & Bobocel, R.
(2013). Observer reactions to interpersonal injustice: The role of
perpetrator intent and victim perception. Journal of Organiza-
tional Behavior, 34, 327–349.
Van Overwalle, F., & Baetens, K. (2009). Understanding others’
actions and goals by mirror and mentalizing systems: a meta-
analysis. Neuroimage, 48, 564–584.
Walter, H. (2012). Social cognitive neuroscience of empathy:
Concepts, circuits, and genes. Emotion Review, 4, 9–17.
Wicker, B., Keysers, C., Plailly, J., Royet, J.-P., Gallese, V., &
Rizzolatti, G. (2003). Both of us disgusted in my insula: The
common neural basis of seeing and feeling disgust. Neuron, 40,
655–664.
Wispé, L. (1986). The distinction between sympathy and empathy: To
call forth a concept, a word is needed. Journal of Personality and
Social Psychology, 50, 314–321.
Yoder, K. J., & Decety, J. (2014). The good, the bad, and the just: Justice sensitivity predicts neural response during moral evaluation of actions performed by others. Journal of Neuroscience, 34, 4161–4166.
Young, L., Cushman, F., Hauser, M., & Saxe, R. (2007). The neural
basis of the interaction between theory of mind and moral
judgment. Proceedings of the National Academy of Sciences,
104, 8235–8240.
Zajonc, R. B. (1984). On the primacy of affect. American Psychol-
ogist, 39, 117–123.
Zaki, J., & Ochsner, K. N. (2012). The neuroscience of empathy: Progress,
pitfalls, and promise. Nature Neuroscience, 15, 675–680.
Zahn, R., de Oliveira-Souza, R., & Moll, J. (2013). Moral emotions.
In J. Armony & P. Vuilleumier (Eds.), The Cambridge handbook
of human affective neuroscience. Cambridge: Cambridge Univer-
sity Press.