Transcript Slide 1
Used by permission
Ethical Decision Making:
Heuristics and Biases
William J. Wilhelm
College of Business
Indiana State University
The Four Components of Moral Behavior
(Rest et al., 1999)
1. Moral sensitivity
2. Moral judgment
3. Moral motivation
4. Moral character
Steps in making a judgment
1. Problem recognition
2. Identification of alternative courses of action
3. Evaluation of alternative courses of action
4. Estimation of outcome probabilities
5. Calculation of expected values (see the sketch below)
6. Justification of course of action chosen
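To make steps 4 and 5 concrete, here is a minimal sketch of how estimated outcome probabilities combine into an expected value for each alternative. The alternatives, probabilities, and dollar figures are hypothetical, invented only for illustration; they do not come from the slides.

```python
# Hypothetical illustration of steps 4-5: estimate outcome probabilities,
# then calculate an expected value for each alternative course of action.
# All names and numbers below are invented for illustration only.

alternatives = {
    "disclose the product defect now": [
        # (probability of outcome, monetary value of outcome)
        (0.7, -1_000_000),    # likely: recall costs
        (0.3, -3_000_000),    # less likely: recall plus reputational damage
    ],
    "stay silent": [
        (0.8, 0),             # defect never surfaces
        (0.2, -20_000_000),   # lawsuits and fines if it does
    ],
}

def expected_value(outcomes):
    """Expected value = sum of probability * value over all possible outcomes."""
    return sum(p * v for p, v in outcomes)

for name, outcomes in alternatives.items():
    print(f"{name}: expected value = {expected_value(outcomes):,.0f}")
```

On these invented numbers, disclosure looks better even in purely financial terms; the rest of the deck argues that heuristics and biases keep real decision makers from working through these steps this cleanly.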
BUSINESS Evaluation Tools
For example, in management
decisions we use tools such as:
cost-benefit analysis
feasibility analysis
time-to-market analysis
net present value (sketched below)
strategic prioritization
etc.
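As a minimal sketch of one of these tools, net present value, here is what the calculation looks like; the cash flows and the 10% discount rate are hypothetical, chosen only to illustrate the formula.

```python
# Net present value sketch: discount each future cash flow back to today
# and sum. The project cash flows and the 10% rate below are hypothetical.

def npv(rate, cash_flows):
    """NPV = sum of cash_flow_t / (1 + rate)**t, with t = 0 for the upfront flow."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Invest 100 today, then receive 40 per year for four years.
project = [-100, 40, 40, 40, 40]
print(round(npv(0.10, project), 2))  # ~26.79: positive, so the project clears a 10% hurdle rate
```

Ethical evaluation tools, covered next, play the analogous role for the moral dimension of a decision.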
ETHICAL Evaluation Tools
Conventional moral rules and codes
The Golden Rule, laws, corporate codes of
ethics, etc.
Universal duty towards others
Kant’s categorical imperative
Greatest good for the greatest number
Bentham & Mill’s utilitarianism
Characteristics of a good person
Aristotle’s virtue theory: bravery, honesty,
temperance, generosity, justice, pride.
Steps in making a judgment
(ethical evaluation tools applied: conventional rules and laws, the categorical imperative, utilitarianism, virtue theory)
1. Problem recognition
2. Identification of alternative courses of action
3. Evaluation of alternative courses of action
4. Estimation of outcome probabilities
5. Calculation of expected values
6. Justification of course of action chosen
Steps in making a judgment
1. Problem recognition
2. Identification of alternative courses of action
3. Evaluation of alternative courses of action
4. Estimation of outcome probabilities
5. Calculation of expected values
6. Justification of course of action chosen
Rational Actors?
Optimal Decision-Making Model?
People are plagued more by bad decision making than by ethical breaches in reasoning.
Cognitive and behavioral susceptibilities might
lead (often unwittingly) to unethical decision
making.
Overwhelming evidence that people do not
always make decisions in a rationally optimal
manner (Kahneman & Tversky, 2000).
Various heuristics and biases lead most people
to systematically diverge from optimal
decision-making.
Conflicting values
Individual
Social
Religious
Organizational
Cultural
Other
Biases and heuristics that can
cloud ethical decision making
Obedience to authority
Process
Social proof
Cognitive dissonance
False consensus effect
Sunk costs
Over-optimism
Overconfidence
Self-serving bias
Time-delay traps
Framing
Loss aversion
The tangible and the
abstract
From: Prentice, R. (2004). Teaching ethics, heuristics, and biases. Journal of Business Ethics Education, 1(1), 57-74.
Obedience to Authority
"Just following orders" ("Good Nazi"
defense)
Stanley Milgram (1963) experiments.
Students need to be aware of this
potentially corrosive influence from both
formal lines of authority and non-formal
authority.
Social Proof
"Everyone else is doing it”
Pressure to conform with others in the
group of co-employees and/or friends.
Many behaviors are caused by external influences rather than by people's own dispositions.
Obscenely high executive salaries?
Options backdating
Insider trading
False Consensus Effect
Thinking that other people think the same
way that we do.
Reinforces inclinations to follow authority
and submit to peer pressure.
Honest people will tend to believe that
those they interact with are honest as well.
Employees may get involved in some
wrongdoing themselves but may not fully
recognize the ethical implications of their
acts.
Over-optimism
Humans are often overly optimistic about
OUTCOMES.
Often leads to irrational beliefs.
The divorce rate is around 50%, yet newlyweds tend to rate their own chances of divorce at 0%.
Basis for unethical decisions: corporate
disclosure fraud cases could be the result
of irrationally optimistic views of a firm’s
conditions and prospects.
Overconfidence
People are often irrationally overconfident
Deals with perceptions about INDIVIDUAL
CAPACITIES.
People tend to rate themselves as well
above average in most traits, including
honesty.
Business people tend to believe that they
are more ethical than their competitors.
Overconfidence in one's own ethical
compass can lead people to accept their
own decisions without serious reflection.
Self-Serving Bias
The belief in deserved rewards for oneself.
Unconsciously affects the information people seek out: they look for evidence that confirms rather than disconfirms their views.
Affects how people remember information.
Affects judgments of fairness.
Self-Serving Bias – cont'd.
Confirmation bias – searching for
information that supports a conclusion and
ignoring information that disconfirms it.
Belief persistence – people tend to persist
in beliefs they hold long after the basis for
those beliefs is substantially discredited.
Causal attribution theory – people tend to give themselves more than average credit for their company's successes (and less blame for its failures).
Framing
People's risk preferences change with
context - depending on whether an option
is framed in terms of potential loss or
potential gain.
The self-serving bias may lead an actor to
frame decisions in such a way as to lead to
ethically questionable conclusions.
Example: Maximizing (shareholder) value
versus stakeholder interests
Process
People sometimes make much different
decisions depending upon whether they are
presented with a particular big decision, or
a series of incremental decisions leading to
the same point.
Slide down a slippery slope incrementally
Example: Looking the other way during
another’s errant behavior, then covering up
for another, then participating, then
conspiring.
Cognitive Dissonance
Uncomfortable psychological inconsistency
caused by incompatibility between two
conflicting beliefs or attitudes
Once people have made decisions or taken
positions, they will cognitively screen out or
reject information which undermines their
decisions or contradicts their positions.
Sunk Costs
People tend to stick by decisions into
which they have sunk significant costs.
Sunk costs can lead to an escalating
commitment.
New product development examples
Individual job investment – job, salary,
perquisites are not easily parted with.
The Tangible and
the Abstract
Decision-making is affected more by vivid, tangible, contemporaneous factors and less by factors that are removed in time and space.
Designers and marketers of new products
with safety problems
Time-Delay Traps
When an action has both short-term and
long-term consequences, the former
(short-term) are much easier for people
to consider.
People subject to this time-delay trap in
decision-making often prefer immediate
to delayed gratification.
Loss Aversion
People detest losses roughly twice as much as they enjoy equivalent gains (see the sketch below).
Endowment effect - the notion that we
easily attach ourselves to things and then
value them much more than we valued
them before we identified with them.
People will make decisions in order to
protect their endowment that they would
never have made in the first place to
accumulate that endowment.
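A small sketch of what "roughly twice as much" can mean in practice, using a simplified loss-aversion weighting in the spirit of Kahneman and Tversky's work; the 2.0 multiplier and the 50/50 bet are illustrative assumptions, not figures from the slides.

```python
# Simplified loss-aversion sketch: losses are weighted about twice as
# heavily as equal-sized gains. The 2.0 multiplier and the 50/50 bet
# are illustrative assumptions, not figures from the slides.

LOSS_AVERSION = 2.0  # losses hurt roughly twice as much as gains please

def felt_value(outcome):
    """Psychological value of a gain or loss relative to the status quo."""
    return outcome if outcome >= 0 else LOSS_AVERSION * outcome

# A coin flip to win 100 or lose 100 is fair in monetary terms...
monetary_ev = 0.5 * 100 + 0.5 * (-100)
# ...but feels like a losing proposition once losses are overweighted.
felt_ev = 0.5 * felt_value(100) + 0.5 * felt_value(-100)

print(monetary_ev)  # 0.0
print(felt_ev)      # -50.0, which is why most people decline the bet
```

The same asymmetry helps explain the endowment effect described above: giving something up registers as a loss, so it is valued more once it is owned.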
Limitations:
Evidence shows that some of these
tendencies are very difficult to debias,
even with experience and training.
Nonetheless, not all attempts to debias
have been failures.
Common sense dictates educating
students and employees about these
biases and heuristics.
Why teach about heuristics and biases?
Sensitize employees to various forms of
ethical dilemmas.
Educate employees regarding their own
cognitive and behavioral susceptibilities
Educate employees about potential non-formal organizational influences and pressures.
Inoculate employees against weaknesses in
their own decision-making processes.
Largely ignored in business school and law school courses on professional ethics.