Utilitarianism
"Nature has placed mankind under the governance of two sovereign masters, pain and pleasure. It is for them alone to point out what we ought to do … the standards of right and wrong" (Bentham)
The morally relevant aspect: how will an intervention affect the suffering and the utility of the persons concerned?
A possible line of reasoning:
If one can estimate that the genocide and oppression will, in the long run, cause more suffering than an intervention will cause in the short run, and
if there is no other alternative that would cause less suffering,
then an intervention is justified.
Case: Intervention?
What is right?
Should I lie to save a person from a
difficult situation?
Should I kill a person to relieve her of severe suffering?
Should I break a promise if this can help
someone in real trouble?
Def.
The goal/the consequences determine the rightness of an action
Consequences for whom? (myself? "My country, right or wrong"? …)
What consequences? (fame, knowledge, leisure, pleasure…)
Consequentialism – Teleological ethics (telos = goal)
Jeremy Bentham, 1748-1832
Definition
The moral end to be sought is the
greatest possible balance of good over
evil
The greatest pleasure for the greatest
number of persons
Utilitarianism
"Actions are right in proportion as they tend to promote happiness, wrong as they tend to produce the reverse of happiness." John Stuart Mill (1806-1873)
"Everyone counts for one, nobody for more than one" (Bentham)
"The question is not, Can they reason? nor, Can they talk? but, Can they suffer?" (Bentham)
For whom?
Good = Pleasure = Happiness
Hedonism: pleasure is the only intrinsic value (a value sought for itself)
What consequences?
Quantitative hedonism: Bentham’s felicific
calculus:
It is possible to quantify the amount of
pleasure and pain (intensity, duration…)
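As a purely illustrative sketch (Bentham lists dimensions such as intensity, duration, certainty, propinquity, fecundity, purity and extent, but gives no fixed formula), a felicific tally for an action affecting n persons could be written as:

\[
U(\text{action}) \;\approx\; \sum_{i=1}^{n} \text{intensity}_i \times \text{duration}_i \times \text{certainty}_i
\]

with pleasures entering as positive terms and pains as negative terms; the action with the highest total U would then be preferred. The multiplicative weighting here is an assumption for illustration, not Bentham's own.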
J S Mill (qualitative hedonism): "It is better to be a human being dissatisfied than a pig satisfied; better to be Socrates dissatisfied than a fool satisfied. And if the fool, or the pig, are of a different opinion, it is because they only know their own side of the question. The other party to the comparison knows both sides."
"Suppose there were an experience machine that would give you any experience you desired. Superduper neuroscientists could stimulate your brain so that you would think and feel you were writing a great novel, or making a friend, or reading an interesting book. All the time you would be floating in a tank, with electrodes attached to your brain. Should you plug into this machine for life, pre-programming your life's experiences?" (Nozick, 1974)
C1: Is pleasure all that counts?
The pleasure machine
The criterion of a right action is the extent to which preferences are satisfied
What preferences?
Interests
Needs – what is good for a person
Capabilities (Sen and Nussbaum) – what enables a person to flourish
Preference utilitarianism
C2: Can we foresee the consequences?
C1C2 Utilitarian answer: this problem is common to all morality
C2C2 The problem with Act-utilitarianism:
Rule-utilitarianism should be preferred to Act-utilitarianism
Def. Rule-utilitarianism:
Act according to the rule that has the greatest utility
C3C2 Two levels of moral thinking (R M Hare):
"The prole and the archangel"
The intuitive level – follow the rules and intuitions
The critical level (with all information etc.) – determine the right action
Objections
C3: Should we always treat persons equally? ("One's own children and others' brats")
C1C3 According to utilitarianism/universalism: the best consequences follow from a rule saying that everyone has special obligations
C2C3 Morality is demanding!
www.thelifeyoucansave.com/
C4: Can it be morally right to sacrifice one or a few persons in the interest of the many?
Dostoevsky's question