Treating ethics as a design problem

a publication of the behavioral science & policy association 73
Nicholas Epley & David Tannenbaum
Creating policies that encourage ethical behavior requires an accurate
understanding of what drives such behavior. We first describe three
common myths about the psychological causes of ethical behavior that
can lead policymakers to overlook constructive interventions. These
myths suggest that ethical behavior stems from a person’s beliefs;
changing behavior therefore requires changing beliefs. Behavioral
science, however, indicates that the immediate context (such as an
organization’s norms and accepted procedures) exerts a surprisingly
powerful influence on behavior. To be effective, policies must treat ethics
as a design problem; that is, policymakers should create contexts that
promote ethical actions. We then discuss three psychological processes
that affect ethical activity—attention, construal, and
describe how understanding them can help policymakers in the public
and private sectors design environments that promote ethical behavior.
Epley, N., & Tannenbaum, D. (2017). Treating ethics as a design problem. Behavioral Science & Policy, 3(2), 73–84.
74 behavioral science & policy | volume 3 issue 2 2017
Effective policy design involves shaping
human behavior. In the public sector,
policymakers try to encourage some
behaviors and discourage others using tools
such as taxes, subsidies, mandates, bans, and
information campaigns. In the private sector,
policymakers try to shape behavior with tools
such as hiring, firing, compensation, and
operations. Policymaking therefore involves
psychology—specifically, policymakers’ beliefs
about which levers are most effective for
changing behavior. Well-intended policies can
be ineffective when based on erroneous beliefs
about human behavior.
Examples of failed policies based on flawed
assumptions are commonplace. In 2009, for
instance, the Transportation Security Adminis-
tration trained more than 3,000 employees to
read subtle verbal and nonverbal cues, assuming
that lies would “leak out” in brief interactions. In
fact, psychologists find very few reliable cues
to detecting deception during ongoing inter-
actions, and this TSA program produced a 99%
false alarm rate when evaluated by the Government Accountability Office.1 And in 2001, the
U.S. government distributed $38 billion in tax
rebates as part of an economic stimulus plan,
based on the belief that people would spend
more money when they had more to spend.2,3
In fact, consumer spending is guided by a host
of subjective evaluations about the source and
meaning of money. In this case, people overwhelmingly saved these rebates, creating little or no short-term stimulus,3 possibly because people interpreted the rebates as returned income rather than a windfall.4
Unfortunately, when it comes to considering
ethical behavior, policymakers routinely hold
imperfect assumptions. Common intuition
presumes that people’s deeply held moral
beliefs and principles guide their behavior,
whereas behavioral science indicates that ethical
behavior also stems from momentary thoughts,
flexible interpretations, and the surrounding
social context. Common intuition treats the
challenge of influencing ethical behavior as a
problem of altering beliefs, whereas behavioral
science indicates that it should also be treated
as a design problem.
In this article, we describe three common myths
about morality that can lead policymakers to
design ineffective interventions for enhancing
ethical behavior. We then discuss three basic
psychological processes that policymakers
in the public and private sectors can leverage
when designing behavioral interventions (see
Table 1). Understanding these processes can
help policymakers create environments that
encourage ethical behavior.
Of course, the very definition of ethical behavior
can lead to disagreements and impasses
before anyone even gets to a discussion about
improving ethics. Here, we use the term to refer
to actions that affect others’ well-being. Ethical
behavior contains some degree of prosociality,
such as treating others with fairness, respect,
care, or concern for their welfare. In contrast,
unethical behavior contains some degree of
antisociality, including treating others unfairly,
disrespectfully, or in a harmful way. The inherent
complexity of social behavior—which involves
multiple people or groups in diverse contexts—
is largely why the causes of ethical behavior can
be so easily misunderstood in everyday life.
Three Myths About Morality
Common sense is based on everyday obser-
vation and guided by simplifying heuristics.
These heuristics generally yield some degree
of accuracy in judgment but are also prone
to systematic mistakes. Comparing widely
accepted common sense with the empirical
record allows behavioral scientists to identify
systematic errors and propose interventions for
countering them.
Myth 1: Ethics Are a Property of People
All human behavior is produced by an enor-
mously complex string of causes, but common
sense often focuses on a single source: the
person engaging in the activity.5 This narrow
focus can lead to a simplified belief that uneth-
ical behavior is caused by unethical people
with unethical personalities—rogue traders,
charlatans, or psychopaths—rather than by the
broader context in which that behavior occurs.
Core Findings

What is the issue?
Policymakers commonly believe that they must first change people’s beliefs in order to encourage them to adopt ethical behavior. Beyond trying to change beliefs, policymakers should also treat ethics as an environmental problem and design solutions that leverage three key psychological processes: attention, construal, and motivation.

How can you act?
Selected recommendations
1) Designing compensation strategies with prosocial goals in mind, such as tying an individual team member’s bonus to group performance
2) Counteracting cognitive limitations by engaging cognitive repair practices such as reminders, checklists, and visible statements

Who should take the lead?
Leaders and policymakers in organizational design and human resources, behavioral science researchers, organizational
Perhaps the best-known example of this error
comes from Stanley Milgram’s experiments
on obedience to authority.6 Participants in Milgram’s experiments were instructed to
administer increasingly severe electric shocks
to another person, even to the point where
participants thought the shocks might have
been lethal (in fact, the “victim” was an actor
who never received any shocks). When Milgram
described this procedure to three different
samples of people, not one person predicted
that they would personally deliver the most
intense electric shock possible to another
person. In actuality, 65% of participants did.
What makes Milgram’s research so interesting is
the mistaken intuition that only psychopaths or
very deviant personalities would be capable of
such obvious cruelty.
This myth implies that people tend to over-
estimate the stability of unethical behavior.
Consistent with this possibility, survey respon-
dents in one study dramatically overestimated
recidivism rates—the likelihood that a past criminal would reoffend—both over time and across different crimes.7 The likelihood of reoffending
actually drops dramatically over time, but partic-
ipants believed that it stays relatively constant.
Participants’ responses followed a rule of “once
a criminal, always a criminal,” a view consistent
with the myth that ethical behavior is a stable
property of individuals.8 Likewise, employers
who require credit checks as a precondition
for employment do so because they think past
defaults predict a broader tendency to engage
in a wide variety of unethical behaviors (such as
workplace deviance). In fact, empirical inves-
tigations have found that credit scores are,
at best, weakly associated with performance
appraisal ratings or termination decisions.9,10
Although largely unrecognized by the public,
the lack of correspondence between past and
future ethical behavior is not a new insight for
behavioral science. A classic study in which
psychologists evaluated thousands of high
school and middle school students in the
1920s found very little consistency in honesty
from one situation to another.11 People tend to
believe that ethical behavior reflects a consis-
tent moral character, but actual ethical behavior
varies substantially across contexts.
A focus on unethical individuals leads to poli-
cies that attempt to identify, detain, and deter
those individuals (for example, “rogue traders”).
This approach is unlikely to succeed when-
ever unethical behavior is systemic in nature
(for example, it occurs within a “rogue culture”
or “rogue industry”). Improving ethics often
requires altering the type of situation a person
is in, not simply altering the type of people in a
given situation.
Table 1. Myths about morality
Belief in the myths below can diminish a policymaker’s ability to maximize ethical behavior.

Myth: Ethics are a property of people
Unethical behavior is largely due to unethical individuals rather than the broader context in which behavior operates.
Policy implication: Can lead policymakers to overestimate the stability of ethical behavior and endorse policies to identify, detain, and deter unethical individuals (for example, “rogue traders”). Such policies are unlikely to succeed whenever unethical behavior is systemic in nature (encouraged by a “rogue” culture or industry).

Myth: Intentions guide ethical actions
Good intentions lead to ethical acts, and unethical intentions lead to unethical acts. Consequently, one should infer that unethical behavior stems from unethical intentions.
Policy implication: Can encourage policymakers to view safeguards as unnecessary for people with good intentions, impeding implementation of sensible policies to curb unethical behavior. At times, good intentions can result in unethical behavior.

Myth: Ethical reasoning drives ethical behavior
Ethical behavior is guided by deliberative reasoning based on ethical principles.
Policy implication: Can induce policymakers to overestimate the effectiveness of ethics training programs (standard in many organizations) and underestimate the importance of contextual changes for altering behavior.
Myth 2: Intentions Guide Ethical Actions
A more focused version of Myth 1 is the
common-sense assumption that actions are
caused by corresponding intentions: bad acts
stem from bad intentions, and good acts follow
from good intentions.12 Although intentions are
correlated with a person’s actions, the relationship is far more complicated than intuitions suggest.
There are at least two consequences of over-
simplifying the relationship between actions
and intentions. First, people tend to overesti-
mate the power of their own good intentions
and, as a result, overestimate their propensity for
engaging in ethical behavior.13,14 People predict
that they will bravely confront instances of
racism, sexism, and physical abuse more often than is realistic; the bravery that people actually display in the midst of those situations falls short of such predictions.15–17 In one experiment,
for instance, 68% of women asked to anticipate
how they would respond to inappropriate job
interview questions posed by a male interviewer
(such as “Do you have a boyfriend?”) said they
would refuse to answer the questions, yet none
of the women did so when actually placed in
that situation.17
Second, good intentions can lead to unintended
unethical consequences simply because ancil-
lary outcomes are overlooked.18 People who
help a friend get a job with their employer,
for example, may fail to realize that this act of
ingroup favoritism also harms those outside
their social network.19 Harm can therefore be
done while intending to help.
Overestimating the power of good intentions
can impede sensible policies to curb unethical
behavior by causing people to dismiss institutional safeguards as unnecessary. For instance,
surveys of doctors and financial planners find
that both groups think that conflict-of-interest
policies are necessary for other professions but
not for their own group.20 When people think
that they and their colleagues have good inten-
tions and that people in their profession can be
trusted to do what is right, they may unwisely
view ethical safeguards as onerous and useless.
Myth 3: Ethical Reasoning
Drives Ethical Behavior
Conventional wisdom suggests that ethical
reasoning causes ethical action, but behavioral
scientists routinely find that ethical reasoning
also follows from behavior—serving to justify,
rationalize, or explain behavior after it has
occurred.21,22 People generate sensible expla-
nations for choices they did not make,23 invent
post hoc arguments to justify prior choices,24
and evaluate evidence they want to believe
using a lower evidentiary standard than they
apply to evidence they do not want to believe.25
To the extent that policymakers exaggerate the
causal power of ethical reasoning, they will also
likely overestimate the power of ethics training
programs (standard in many organizations)
to change behavior. Indeed, a survey of over
10,000 representative employees from six large
American companies found that the success of
ethics or compliance programs was driven more
by social norms within the organization than by
the content of these training programs.26
Collectively, these three myths matter because
they exaggerate the degree to which ethical
behavior is driven by beliefs and can therefore
be improved by instilling the right values and
intentions in people. Each of the myths contains
some element of truth—unethical values and
intentions can at times guide unethical behav-
iors, and reinforcing ethical principles has some
value. But these myths also oversimplify reality
in a way that can lead policymakers to overlook
other forces in a person’s immediate context
that shape ethical behavior. Policymakers who
realize that encouraging ethics is not just a belief
problem but also a design problem can increase
ethical behavior by changing the contexts in
which people live and work. Here’s how.
Ethical Design for a
Human Mind
For systems to be effective, they must be tailored
to fit the properties of their users. Policies that
encourage ethical behavior should therefore
be designed around three basic psychological
processes that guide human behavior: attention,
construal, and motivation (see Table 2). That
is, policies should be designed to help people
keep ethical principles top of mind (attention),
encourage people to interpret and under-
stand the ethical ramifications of their behavior
(construal), and provide opportunities and
incentives to pursue ethical goals (motivation).
Attention: Make Ethics Top of Mind
Attention operates like a spotlight rather than
a floodlight, focusing on a small slice of all
possible relevant information. Because atten-
tion is limited, decisions are guided by whatever
information is most accessible at the time the
decision is made. An otherwise ethical person
might behave unethically simply by failing to
consider the ethical implications of his or her behavior.
The limited nature of attention implies that
designing environments to keep ethics top of
mind should increase the likelihood of ethical
behavior. In one field experiment with a U.S.
automobile insurance company, customers
signed an honor code either before or after
completing a policy-review form that asked
them to report their current odometer mileage.27 Drivers reported their odometer
reading more honestly when they signed the
honor code before reporting their mileage. This
kind of simple design change keeps honesty top
of mind and can have a meaningful impact on a
person’s actions.28
An effective ethical system triggers people to
think about ethics routinely. Such systems can
include ethical checklists that are consulted
before making a decision,29 messaging that
makes ethical principles salient in the environ-
ment,30 or heuristics within an organization
that can become repeated mantras for ethical
action.31 Warren Buffett, for instance, asks
his employees to take the “front page test”
before making any important decision: “I want
employees to ask themselves whether they are
willing to have any contemplated act appear
the next day on the front page of their local
paper—to be read by their spouses, children
and friends—with the reporting done by an
Table 2. Ethical design principles
Ask the following questions when devising systems intended to foster ethical behavior.

Attention: Are ethics top of mind?
People have limited attention and are guided by information that is accessible, or top of mind, at the time a decision is made. People sometimes act unethically simply because they fail to consider the ethical implications of their behavior.
Policy implication: Effective systems induce people to think about ethics routinely. Examples of triggers include ethics checklists filled out before making a decision, messages that make ethical principles salient in the environment, or heuristics that can become repeated mantras for ethical action.

Construal: Are people asking, “Is it right?”
How people behave is influenced by how they interpret—or construe—their environment. Altering the construal of an event can dramatically affect behavior by redefining what constitutes appropriate conduct.
Policy implication: Ethical systems encourage ethical construals. Inducing employees to ask themselves “Is it right?” rather than “Is it legal?” should lead to an increase in prosocial behavior.

Motivation: Are you using prosocial goals?
Social incentives, such as a desire to help or connect with others, can be used to motivate behaviors that naturally align with ethical practices.
Policy implication: Systems that foster ethical behavior create opportunities for people to do good for others and highlight the good that others are doing to establish more ethical norms. Instead of focusing on ethical failures, organizations should call out ethical beacons—exemplary ethical behaviors—for others to emulate.
informed and critical reporter.”32 The key is to
make sure that ethics are brought to mind by
either well-learned heuristics or environmental
triggers at the very time that people are likely to
be contemplating an ethical decision.
Effective ethical systems can be contrasted with environments that obscure ethical considerations or chronically highlight goals that push ethics out of mind. Enron, for instance, famously had its stock price prominently displayed throughout the company, including in its elevators, whereas its mission statement, which highlighted ethical principles, was unmemorable, boilerplate, and prominently displayed nowhere in the company.33
Construal: Encourage People
to Ask, “Is It Right?”
If you have ever watched a sporting event with
a fan of the opposing team, you know that two
people can witness the same event yet see very
different things. How people behave is a function of how they interpret—or construe—their environment.
To understand the power of construal, consider
a simple experiment in which two partici-
pants play a simple economic game.34 In this
game, both players simultaneously choose
to cooperate or defect. Participants can earn
a moderate amount of money if both opt to
cooperate, but each player has the opportunity
to earn more by defecting; however, joint defec-
tion leaves both players worse off than if both had cooperated. This task models a common tension in real-world exchanges between cooperation and exploitation. Yet simply changing the name of the game while keeping all other aspects identical (including monetary payoffs)
had a dramatic impact on cooperation rates.
Roughly 30% of participants cooperated when
it was called the Wall Street Game, whereas 70%
cooperated when it was called the Commu-
nity Game. Although a name may seem like a
trivial detail, altering the construal of an event
can dramatically affect behavior by redefining
appropriate or expected conduct for oneself
and others.
At times, organizations seem to exploit the
power of construal to deter ethical behavior.
For instance, in the midst of serious vehicle
safety concerns at General Motors, company
representatives actively encouraged employees
to avoid ethical interpretations of the safety
issues when communicating with customers.
In one illustrative case, materials from a 2008
training seminar instructed employees on
euphemisms to replace ethically relevant terms
when conversing with customers.35 Instead of
using the word safety, employees were to say,
“has potential safety implications.” Instead of
terms with clear moral implications, employees
were to use technical terminology, saying that
a product was “above specifications” or “below
specifications” rather than “safe” or “unsafe.”
Such instructions make it easier for employees
to construe their behavior in ways that permit
unethical behavior.
Failing to emphasize ethical construals is also
where well-intentioned programs meant to
ensure compliance with laws and regulations
can go wrong in organizations. These programs
usually focus on whether an action is legal or
illegal, not whether it is ethically right. Encour-
aging employees to ask themselves “Is it legal?” rather than “Is it right?” could inadvertently
promote unethical b ehavior. Andy Fastow,
former chief financial officer of Enron, high-
lighted this disconnect when he looked back
on his own acts of accounting fraud: “I knew it
was wrong. . . . But I didn’t think it was illegal.
I thought: That’s how the game is played. You
have a complex set of rules, and the objec-
tive is to use the rules to your advantage.”36
As he remarked in a presentation, “The ques-
tion I should have asked is not what is the rule,
but what is the principle.”37 To foster ethical
behavior, systems need to encourage ethical construals.
Motivation: Use Prosocial Goals
A truism of human behavior is that people do
what they are incentivized to do. The challenge
is to understand the specific goals that people
hold at any given time and use the right kinds of
incentives to shape behavior.
The most common approach to motivating
behavior, including ethical behav ior, is to
provide material incentives. Although financial
rewards and punishments can be productive
under the right circumstances, an approach
based on extrinsic incentives alone presumes
that people lack meaningful prosocial motiva-
tion to begin with: to be encouraged to behave
ethically, they must be compensated in some
way beyond having the satisfaction of doing the
right thing.
This presumption is often unwarranted. Proso-
cial motives, such as a desire to help or connect
with others, can be used to encourage behaviors
that naturally align with ethical practices. In one
experiment, fundraisers at a university alumni
call center worked significantly harder and raised
significantly more money after having a short
question-and-answer session with a benefi-
ciary.38 In another experiment, sales employees
performed better after receiving a bonus to be
spent on another member of their team than
they did after receiving a bonus meant to be
spent on themselves.39 Finally, a field experiment asking one group of managers to perform
random acts of kindness for employees over a
1-month period found significant reductions
in depression rates among these managers 4
months after the intervention ended.40
The importance of social motivation can also
be seen in the surprising power of social norms
to shape behavior. Behavioral science repeat-
edly demonstrates that people mostly conform
to what others around them are doing.41 This
insight can be used to motivate people for
good, to the extent that ethical norms are high-
lighted.42 For example, in an effort to increase
tax compliance, the UK Behavioral Insights Team
(at the time, a division of the British government
devoted to applying behavioral science to social
services) sent delinquent taxpayers letters with
different messages encouraging them to pay their taxes. The most effective letter was the
one informing individuals that “Nine out of ten
people in the UK pay their tax on time. You are
currently in the very small minority of people
who have not paid us yet.”43
The power of social norms in shaping ethical
behavior has an important implication. Discus-
sions about ethics often focus on unethical
behavior—on crimes and other unethical
things people are doing. Such discussions are
like black holes, attracting people to them and
potentially encouraging similar behavior. What
is more constructive is to focus on ethical
beacons—examples of admirable behavior among individuals, groups, or companies.
Public service announcements, company
newsletters, and other sources of information
intended to encourage ethical behavior should
call out exemplary ethical behavior that others
can strive to emulate. To foster ethical behavior,
then, policymakers should create opportunities
for people to do good for others and should
establish ethical norms by highlighting the good
that others are already doing.
An Ethical Organization,
by Design
An ethical system is an environment designed
to keep ethics top of mind, make ethics central
to the framing of policies and initiatives, and
increase prosocial motivation. Design details
must be guided by an organization’s mission
and by a well-crafted mission statement that
features a small number of key principles. Prac-
tices, in turn, should be aligned with the stated
principles as part of an organization’s strategy
for success. These principles must go beyond
maximizing short-term shareholder value to
focus, instead, on enabling long-term sustain-
ability of the entity and its ethical actions.
Of course, policy changes inspired by an organi-
zation’s core values will not produce a perfectly
ethical organization, just as a well-designed
bridge based on fundamental engineering prin-
ciples cannot eliminate all safety risks. Ethical
systems are intended to create the kind of envi-
ronment that makes ethical behavior easier and
therefore more frequent. At a practical level,
policymakers can incorporate ethical design
principles into the major drivers of behavior
within their organizations: procedures for hiring
and compensating employees, maintaining the
entity’s reputation, and carrying out day-to-day operations.
Interviews are typically meant to identify the
best person for a job, although their ability to
do so is notoriously limited.44,45 Interviews and
onboarding procedures can, however, also serve
as an acculturation tool that communicates an
organization’s ethical values to prospective
employees and highlights the importance of
those values to current employees.
Interviews can be designed around ethics by
asking questions that make an organization’s
commitment to ethics clear to prospective
employees. Johnson & Johnson, for instance,
has a number of questions relating to its
well-known credo (which pledges to priori-
tize the needs of the people it serves) that are
put to potential employees during the inter-
view process. For example, when discussing
the company’s commitment to customers,
interviewers may ask potential employees to
describe a time they identified and addressed
an unmet customer need. Interviews designed
around an organization’s principles, including its
ethical principles, can bring ethics to everyone’s
attention, encourage construal of behavior in
terms of ethical principles, and signal that the
organization considers ethical behavior to be
an important source of motivation for both
current and new employees. Even though job
interviews may be poor tools for identifying and
selecting the right employees, they can be used
to communicate a company’s values at a critical
point in an employee’s acculturation process.
An organization that has its representatives
ask about ethics during an interview signals its
concern for ethics on the job.
Organizations can design financial reward systems to encourage ethical behavior in two different ways. First, organizations can reward ethical behavior directly, such as through
scorecards that translate ethical values into
measurable actions. Southwest Airlines, for
instance, designs its executive compensation
scorecard around the company’s four primary
values. To reward executives for upholding
the value “Every Employee Matters,” the airline
compensates them for low voluntary turnover.
By linking compensation to keeping employees
at the company, Southwest tries to create an
incentive for bosses to contribute to a valuable
prosocial outcome.
Second, organizations can provide opportunities
for employees to satisfy preexisting prosocial
motivations. People tend to feel good when
they are also doing good for others,46,47 and
they also do good to maintain a positive repu-
tation in the eyes of others.48 Organizations can
provide opportunities to satisfy both motives
by allowing employees to reward one another,
by facilitating random acts of kindness, or by
offering employees time to engage in proso-
cially rewarding work that is aligned with the
organization’s values. In one field experiment,
Virgin Atlantic rewarded its pilots for achieving
a fuel-efficiency goal by giving a relatively small amount of money to the pilot’s chosen charity.49 This prosocial incentive increased pilots’
reported job satisfaction by 6.5% compared with
the pilots in the control condition, an increase
equivalent to the observed difference in job
satisfaction between those who are in poor
health and those who are in good health. The
good news for organizations and policymakers
is that these prosocial incentives usually cost
little or nothing and yet can have meaningful
effects on well-being and behavior.
Reputation Management
People, including those who run organizations,
care about their reputation in the eyes of others,
because that reputation affects how they are
treated. In one economic analysis, compa-
nies fined by the U.S. Securities and Exchange
Commission for unethical behavior lost $3.08
in market value for every $1 they were fined,
with these larger losses coming from the repu-
tational consequences of being identified as a
lawbreaker.50 Policymakers who are designing
ethical systems can capitalize on the reputa-
tional concerns of companies and employees
to foster ethical behavior. For instance, they
can ensure that an organization’s reputation is
measured and that the results are public and accessible.
At the individual level, many organizations already conduct annual climate or culture
surveys that can be used to measure percep-
tions of ethical behavior within the organization.
Behavioral science suggests that reporting
these ethical evaluations within the organiza-
tion or using them as part of the performance
review process is likely to increase ethical
behavior among employees, so long as making
unfounded accusations can be minimized (such as when an independent agency monitors the evaluations).
The public sector can also implement policies
that enhance corporate ethics. Policies that
mandate public disclosure of companies' practices often directly improve ethical behavior across an entire industry. For example, the
Ministry of Environment, Lands and Parks of
British Columbia, Canada, publishes a list of
firms that have failed to comply with existing
regulations. An empirical analysis found that
publishing this list of polluters had a larger
impact on subsequent emissions levels and
compliance status than did fines and penalties
associated with noncompliance.51,52
Similarly, publishing workplace safety records, thus making them more noticeable, can produce significant decreases in workplace injuries. One analysis found that a new requirement to report mine-safety records in financial statements produced an 11% drop in mine-related citations and a 13% drop in injuries.53 Reputation
systems have also been effective at increasing hygienic standards at restaurants54 and adherence to clean drinking-water standards by utility companies:55 In Los Angeles, hygiene grading
cards have caused restaurants to make hygiene
improvements, and, in Massachusetts, requiring
community water suppliers to inform consumers
of violations of drinking-water regulations led to
a reduction in violations. Policymakers typically
focus on financial or legal incentives to shape
behavior, but clearly reputational concerns can
serve as a third powerful class of incentives.
Designed properly, daily operations can also offer opportunities to reinforce ethical values by keeping ethical considerations top of mind and making it easier to behave ethically. These goals can be facilitated by using organizational practices that compensate for cognitive limitations (that is, cognitive repairs), such as reminders, checklists, and visible statements relating to personal responsibility.56–59
These cognitive repairs must be timely to be
effective, bringing ethical considerations to
mind at the time a person is making a decision
with ethical implications. One field experiment
highlights the importance of timeliness. In this
study, hotel valets either reminded drivers to
wear their seat belt when the valet ticket was
turned in (about a 6-minute delay), reminded
drivers to wear their seat belt as they entered the
car, or provided no reminder at all.60 Only the
immediate reminders had a noticeable impact
on behavior. Drivers who received the reminder
6 minutes before starting their car were no
more likely to fasten their seat belts than were
drivers who received no reminder at all.
Cognitive repairs must also make the ethical
consequences of one’s actions obvious. In one
series of experiments, researchers found that
physicians were more likely to follow a standard handwashing protocol when signs at the handwashing stations reminded them about the consequences for patient safety (“Hand hygiene prevents patients from catching diseases”),
compared with signs that provided instructions
for handwashing or emphasized personal safety
(“Hand hygiene prevents you from catching
diseases”).61 The goal of these design solutions is
to create an environment where ethical consid-
erations are such a routine part of day-to-day
interactions that they become automatic habits
ingrained in the organization’s cultural practices.
In writing about the 2007–2008 financial crisis,
New Yorker reporter John Cassidy noted that he
angered some people by suggesting that
... [the] Wall Street C.E.O.s involved in the
run-up to the financial crisis were “neither
sociopaths nor idiots nor felons. For the
most part, they are bright, industrious, not
particularly imaginative Americans who
worked their way up, cultivated the right
people, performed a bit better than their
colleagues, and found themselves occupying a corner office during one of the
great credit booms of all time.”62
That this statement angered so many people
illustrates how conventional wisdom often
treats ethics as a belief problem: that unethical
behavior is caused by individuals with unethical
values or intentions.
However, the empirical evidence paints a more
complicated picture: Unethical behavior is also
caused by momentary thoughts, interpretations,
and social context. As a result, a more accurate
and constructive approach for policymakers is
to treat ethical behavior as a design problem.
Designing environments that keep ethics top
of mind, encourage ethical construals, and
strengthen prosocial motivations is essential for
helping to keep otherwise good people from
doing bad things.
author affiliation
Epley: University of Chicago. Tannenbaum: University of Utah. Corresponding author's
1. Government Accountability Office.
(2013). Aviation security: TSA should
limit future funding for behavior
detection activities (GAO Publication
No. 14-158T). Washington, DC: U.S.
Government Printing Office.
2. Epley, N., & Gneezy, A. (2007). The
framing of financial windfalls and
implications for public policy. Journal of
Socio-Economics, 36, 36–47.
3. Shapiro, M. D., & Slemrod, J. (2003).
Consumer response to tax rebates.
American Economic Review, 93,
4. Epley, N., Mak, D., & Idson, L. C. (2006).
Bonus or rebate? The impact of income
framing on spending and saving.
Journal of Behavioral Decision Making,
19, 213–227.
5. Gilbert, D. T., & Malone, P. S. (1995). The
correspondence bias. Psychological
Bulletin, 117, 21–38.
6. Milgram, S. (1965). Some conditions
of obedience and disobedience to
authority. Human Relations, 18(1),
7. Vosgerau, J. (2016). Accuracy of morality
judgements. Working paper, Bocconi
University, Milan, Italy.
8. Maruna, S., & King, A. (2009).
Once a criminal, always a criminal?
“Redeemability” and the psychology
of punitive public attitudes. European
Journal on Criminal Policy and
Research, 15, 7–24.
9. Bernerth, J. B., Taylor, S. G., Walker,
H. J., & Whitman, D. S. (2012). An
empirical investigation of dispositional
antecedents and performance-related
outcomes of credit scores. Journal of
Applied Psychology, 97, 469–478.
10. Bryan, L. K., & Palmer, J. K. (2012).
Do job applicant credit histories
predict performance appraisal
ratings or termination decisions? The
Psychologist-Manager Journal, 15,
11. Hartshorne, H., & May, M. A. (1928).
Studies in the nature of character:
I. Studies in deceit. New York, NY:
12. Baron, J., & Hershey, J. C. (1988).
Outcome bias in decision evaluation.
Journal of Personality and Social
Psychology, 54, 569–579.
13. Epley, N., & Dunning, D. (2000). Feeling
“holier than thou”: Are self-serving
assessments produced by errors in
self or social prediction? Journal of
Personality and Social Psychology, 79,
14. Epley, N., & Dunning, D. (2006). The
mixed blessings of self-knowledge
in behavioral prediction: Enhanced
discrimination but exacerbated bias.
Personality and Social Psychology
Bulletin, 32, 641–655.
15. Bocchiaro, P., Zimbardo, P. G., & Van
Lange, P. A. M. (2012). To defy or not
to defy: An experimental study of the
dynamics of disobedience and whistle-
blowing. Social Influence, 7, 35–50.
16. Kawakami, K., Dunn, E., Karmali, F.,
& Dovidio, J. F. (2009, Januar y 9).
Mispredicting aective and behavioral
responses to racism. Science, 323,
17. Woodzicka, J. A., & LaFrance, M.
(2001). Real versus imagined gender
harassment. Journal of Social Issues, 57,
18. Chugh, D., Banaji, M. R., & Bazerman,
M. H. (2005). Bounded ethicality as a
psychological barrier to recognizing
conflicts of interest. In D. A. Moore,
D. M. Cain, G. Loewenstein, & M. H.
Bazerman (Eds.), Conflicts of interest:
Problems and solutions from law,
medicine and organizational settings
(pp. 74–95). London, United Kingdom:
Cambridge University Press.
19. Bazerman, M. H., & Tenbrunsel, A . E.
(2012). Blind spots: Why we fail to do
what’s right and what to do about it.
Princeton, NJ: Princeton University
20. Sharek, Z., Schoen, R. E., &
Loewenstein, G. (2012). Bias in the
evaluation of conflict of interest policies.
The Journal of Law, Medicine & Ethics,
40, 368–382.
21. Haidt, J. (2001). The emotional dog
and its rational tail: A social intuitionist
approach to moral judgment.
Psychological Review, 108, 814–834.
22. Ditto, P. H., Pizarro, D. A., &
Tannenbaum, D. (2009). Motivated
moral reasoning. In D. M. Bartels, C. W.
Bauman, L. J. Skitka, & D. L. Medin (Eds.),
Psychology of learning and motivation:
Vol. 50. Moral judgment and decision
making (pp. 307–338). San Diego, CA:
Academic Press.
23. Johansson, P., Hall, L., Sikström, S., &
Olsson, A. (2005, October 7). Failure to
detect mismatches between intention
and outcome in a simple decision task.
Science, 310, 116–119.
24. Haidt, J., Bjorklund, F., & Murphy, S.
(2000). Moral dumbfounding: When
intuition finds no reason. Unpublished
manuscript, University of Virginia,
25. Dawson, E., Gilovich, T., & Regan, D.
T. (2002). Motivated reasoning and
performance on the Wason selection
task. Personality and Social Psychology
Bulletin, 28, 1379–1387.
26. Treviño, L. K., Weaver, G. R., Gibson,
D. G., & Toffler, B. L. (1999). Managing
ethics and legal compliance: What
works and what hurts. California
Management Review, 41(2), 131–151.
27. Shu, L. L., Mazar, N., Gino, F., Ariely,
D., & Bazerman, M. H. (2012). Signing
at the beginning makes ethics salient
and decreases dishonest self-reports
in comparison to signing at the end.
Proceedings of the National Academy of
Sciences, USA, 109, 15197–15200.
28. Congdon, W. J., & Shankar, M. (2015).
The White House Social & Behavioral
Sciences Team: Lessons learned from
year one. Behavioral Science & Policy,
1(2), 77–86.
29. Gawande, A., & Lloyd, J. B. (2010). The
checklist manifesto: How to get things
right. New York, NY: Metropolitan
30. Meeker, D., Knight, T. K., Friedberg, M.
W., Linder, J. A., Goldstein, N. J., Fox,
C. R., . . . Doctor, J. N. (2014). Nudging
guideline-concordant antibiotic
prescribing: A randomized clinical trial.
JAMA Internal Medicine, 174, 425–431.
31. Heath, C., Larrick, R. P., & Klayman,
J. (1998). Cognitive repairs:
How organizational practices
can compensate for individual
shortcomings. Research in
Organizational Behavior, 20, 1–37.
32. Berkshire Hathaway. (n.d.). Berkshire
Hathaway Inc. code of business conduct
and ethics. Retrieved May 25, 2017, from
33. McLean, B., & Elkind, P. (2003). The
smartest guys in the room: The amazing
rise and scandalous fall of Enron. New
York, NY: Portfolio.
34. Liberman, V., Samuels, S. M., & Ross,
L. (2004). The name of the game:
Predictive power of reputations versus
situational labels in determining
prisoner’s dilemma game moves.
Personality and Social Psychology
Bulletin, 30, 1175–1185.
35. United States Department of
Transportation, National Highway Traffic
Safety Administration. (2014, May 16).
Consent Order TQ14-001: In re: NHTSA
Recall No. 14V-047. Retrieved from
36. Elkind, P. (2013, July 1). The confessions
of Andy Fastow. Fortune. Retrieved
37. Jaffe, M. (2012, March 19). Andrew
Fastow draws on Enron failure in
speech on ethics at CU. The Denver
Post. Retrieved from http://www.
38. Grant, A. M., Campbell, E. M., Chen,
G., Cottone, K., Lapedis, D., & Lee, K.
(2007). Impact and the art of motivation
maintenance: The effects of contact
with beneficiaries on persistence
behavior. Organizational Behavior and
Human Decision Processes, 103, 53–67.
39. Anik, L., Aknin, L. B., Norton, M. I., Dunn,
E. W., & Quoidbach, J. (2013). Prosocial
bonuses increase employee satisfaction
and team performance. PloS One, 8(9),
Article e75509. Retrieved from https://
40. Chancellor, J., Margolis, S., &
Lyubomirsky, S. (2017). The propagation
of everyday prosociality in the
workplace. The Journal of Positive
Psychology. Advance online publication.
41. Cialdini, R . B., & Goldstein, N. J.
(2004). Social influence: Compliance
and conformity. Annual Review of
Psychology, 55, 591–621.
42. Nolan, J. M ., Schultz, P. W., Cialdini, R.
B., Goldstein, N. J., & Griskevicius, V.
(2008). Normative social influence is
underdetected. Personality and Social
Psychology Bulletin, 34, 913–923.
43. Hallsworth, M., List, J., Metcalfe, R.,
& Vlaev, I. (2014). The behavioralist
as tax collector: Using natural field
experiments to enhance tax compliance
(NBER Working Paper No. 20007).
Cambridge, MA: National Bureau of
Economic Research.
44. Wright, P. M., Lichtenfels, P. A., &
Pursell, E. D. (1989). The structured
interview: Additional studies and a
meta-analysis. Journal of O ccupational
Psychology, 62, 191–199.
45. McDaniel, M. A., Whetzel, D. L.,
Schmidt, F. L., & Maurer, S. D. (1994).
The validity of employment interviews:
A comprehensive review and meta-
analysis. Journal of Applied Psychology,
79, 599–616.
46. Andreoni, J. (1990). Impure altruism
and donations to public goods: A theory
of warm-glow giving. The Economic
Journal, 100, 464–477.
47. Dunn, E. W., Aknin, L. B., & Norton, M. I.
(2008, March 21). Spending money on
others promotes happiness. Science,
319, 1687–1688.
48. Cain, D. N., Dana, J., & Newman, G. E.
(2014). Giving versus giving in. Academy
of Management Annals, 8, 505–533.
49. Gosnell, G. K., List, J. A., & Metcalfe, R. D.
(2017). A new approach to an age-old
problem: Solving externalities by
incenting workers directly (E2e Working
Paper 027). Retrieved from E2e website:
50. Karpoff, J. M., Lee, D. S., & Martin, G.
S. (2008). The cost to firms of cooking
the books. Journal of Financial and
Quantitative Analysis, 43, 581–612.
51. Foulon, J., Lanoie, P., & Laplante, B.
(2002). Incentives for pollution control:
Regulation or information? Journal
of Environmental Economics and
Management, 44, 169–187.
52. Konar, S., & Cohen, M. A. (1997).
Information as regulation: The effect of
community right to know laws on toxic
emissions. Journal of Environmental
Economics and Management, 32,
53. Christensen, H. B., Floyd, E., Liu, L.
Y., & Maffett, M. G. (2017). The real
effects of mandated information on
social responsibility in financial reports:
Evidence from mine-safety records.
Retrieved from SSRN website: https://
54. Jin, G. Z., & Leslie, P. (2003). The effect
of information on product quality:
Evidence from restaurant hygiene
grade cards. The Quarterly Journal of
Economics, 118, 409–451.
55. Bennear, L. S., & Olmstead, S. M. (2008).
The impacts of the “right to know”:
Information disclosure and the violation
of drinking water standards. Journal
of Environmental Economics and
Management, 56, 117–130.
56. Heath, C., Larrick, R. P., & Klayman,
J. (1998). Cognitive repairs:
How organizational practices
can compensate for individual
shortcomings. Research in
Organizational Behavior, 20, 1–37.
57. Haynes, A. B., Weiser, T. G., Berry, W. R.,
Lipsitz, S. R., Breizat, A. H. S., Dellinger,
E. P., . . . Gawande, A. A . (2009). A
surgical safety checklist to reduce
morbidity and mortality in a global
population. New England Journal of
Medicine, 360, 491–499.
58. Rogers, T., & Milkman, K. L. (2016).
Reminders through association.
Psychological Science, 27, 973–986.
59. Zhang, T., Fletcher, P. O., Gino, F., &
Bazerman, M. H. (2015). Reducing
bounded ethicality: How to help
individuals notice and avoid unethical
behavior. Organizational Dynamics, 44,
60. Austin, J., Sigurdsson, S. O., & Rubin, Y.
S. (2006). An examination of the effects
of delayed versus immediate prompts
on safety belt use. Environment and
Behavior, 38, 140–149.
61. Grant, A. M., & Hofmann, D. A. (2011).
It’s not all about me: Motivating
hand hygiene among health care
professionals by focusing on patients.
Psychological Science, 22, 1494–1499.
62. Cassidy, J. (2013, August 5). Wall
Street after Fabulous Fab: Business as
usual. The New Yorker. Retrieved from