Transcript EECS 690

March 24
Operational Morality
• This is the kind of morality reserved for
pieces of technology that have either low
autonomy or low ethical sensitivity (or
both).
• Operational morality builds whatever value
sensitivity seems wise directly into the tool
(e.g. ergonomics, safety guards/locks,
durability, etc.).
Functional Morality
• This is reserved for pieces of technology which
have:
– Low autonomy and high ethical sensitivity (e.g. ethical decision
support systems)
– High autonomy and low ethical sensitivity (e.g. autopilots, power
grid programs, automated factory equipment)
• These pieces of technology often have elements
of operational morality in their designs, and
they typically serve the function that an
ethically sensitive, highly autonomous moral
agent (i.e. a person) would serve, without
fully possessing one of those two elements.
Engineering Imperatives
• Wallach and Allen have long advocated the
position that exploring ways of engineering
ethical sensitivity into increasingly
autonomous systems is one of the most
important issues in modern ethics.
• Seeing this as an engineering imperative means
incorporating values that go beyond mere safety
into engineering projects.
Value Pluralism
• One of the most immediate issues is building systems that can
incorporate multiple value concerns into their functioning.
• For example, a computer program that is designed to maximize
insurance profit would consider profit as its only implicit value. The
result would be a set of rates and benefits designed to maximize the
pay-in and minimize the pay-out without any other consideration.
Such a lack of value pluralism would not be tolerated in a person.
• What Wallach and Allen seem to be suggesting is that if such a
program were likely to be used in the absence of significant human
moral oversight (i.e. if the program is intended to be given high
autonomy), then the designer has an imperative to include
aspects of ethical sensitivity. (HOW this is to be done is a matter for
later examination.)
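The contrast in the insurance example can be sketched in code. Everything here (the plan fields, the weights, the scoring functions) is a stipulated illustration of single-value versus pluralistic evaluation, not a real system or an implementation Wallach and Allen propose:

```python
# Hypothetical illustration: a profit-only objective vs. one that weighs
# multiple value concerns. All field names and weights are stipulated.

def profit_only_score(plan):
    """Single implicit value: maximize pay-in minus pay-out."""
    return plan["premium_income"] - plan["claims_paid"]

def pluralistic_score(plan, weights=None):
    """Weigh profit against other values a person would also consider,
    e.g. affordability and fairness of coverage (weights are assumed)."""
    weights = weights or {"profit": 1.0, "affordability": 0.5, "fairness": 0.8}
    profit = plan["premium_income"] - plan["claims_paid"]
    return (weights["profit"] * profit
            + weights["affordability"] * plan["affordability"]
            + weights["fairness"] * plan["coverage_fairness"])

plans = [
    {"name": "A", "premium_income": 120, "claims_paid": 40,
     "affordability": 10, "coverage_fairness": 5},
    {"name": "B", "premium_income": 100, "claims_paid": 45,
     "affordability": 60, "coverage_fairness": 50},
]

best_by_profit = max(plans, key=profit_only_score)       # picks plan "A"
best_pluralistic = max(plans, key=pluralistic_score)     # picks plan "B"
```

The two objectives select different plans: the profit-only program prefers the high-premium, low-payout plan, while the pluralistic scorer trades some profit for affordability and fairness. This only illustrates the shape of the problem; which values to include, and how to weight them, is exactly the hard question deferred above.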
Morality as an important design factor
• Police and military machines are only one
area of life that might be improved by
advanced technology that can operate
autonomously, but ethical sensitivity is a
major design factor in the practicality and
usability of such systems.
• Robocop is a fictional, but illustrative
example.
Moor’s categories:
• Ethical Impact Agents: These are machines that
can be evaluated on the basis of their ethical
impacts (this seems like it is broad enough to
encompass all machines).
• Implicit Ethical Agents: These are systems
whose designers have made obvious efforts to
ensure that some ethically negative effects are
avoided (this seems to be a constraint that
should be applied to all technology).
• Explicit Ethical Agents: These are systems that
in some way reason about ethics as part of their
normal design and function.
A note about Moor’s categories:
• In what ways do Moor’s categories overlap
with, and differ from, Wallach and Allen’s?
• We tend to hold designers responsible for
a lack of operational morality or a lack of
implicit moral agency. Are designers also
responsible for a lack of functional morality
or explicit moral agency?