
Science, Values and Risk
RD300
15 October 2001
“It is by no means uncommon to
find decision makers interpreting
the same scientific information in
different ways in different
countries.”
(Jasanoff, 1991, p.29)
Cultural Variation
• U.S. environmental regulators place a
higher value on formal analytical methods
(testable validity) than do their
European counterparts. US regulators
tend to address scientific uncertainty
through quantitative analysis.
• Result: Evidence sufficient to trigger
action in one country may not do so in
another.
The Problem with Policy-Relevant Science
• When knowledge is uncertain or
ambiguous, facts alone are
inadequate to compel a choice.
• Policymakers inevitably look beyond
just the science and blend scientific
and policy considerations together in
their preferred reading of the
evidence.
Risk Assessment
• Different risk assessment methodologies
can produce widely varying risk estimates.
• Can animal data be extrapolated to
humans?
• Do policy makers hide behind the
numbers?
• Most lay persons don’t understand
quantitative risk assessments.
• Value judgments and uncertainties in
risk assessments may not be stated
by the experts.
• Risks of less than one in a million are
often considered negligible from a
regulatory standpoint.
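The one-in-a-million threshold above is straightforward to apply once a quantitative estimate exists. A minimal sketch, assuming a de minimis cutoff of 1e-6 (the threshold value and example figures are illustrative assumptions, not from the lecture):

```python
# De minimis risk check: lifetime risks below roughly one in a million
# are often treated as negligible from a regulatory standpoint.
# Threshold and example figures are illustrative assumptions.

DE_MINIMIS = 1e-6  # one-in-a-million lifetime risk

def is_negligible(lifetime_risk: float, threshold: float = DE_MINIMIS) -> bool:
    """Return True if the estimated lifetime risk falls below the threshold."""
    return lifetime_risk < threshold

print(is_negligible(3e-7))  # True: below one in a million
print(is_negligible(5e-5))  # False: one in twenty thousand
```

Note that the hard cutoff hides exactly the value judgments the slide mentions: choosing 1e-6 rather than 1e-5 is a policy decision, not a scientific one.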
Judgmental Probability Encoding
• A technique used in the field of US
health risk assessment.
• Attempts to ascertain the range of
scientific expert opinion on a particular
risk as well as the levels of confidence
attached to each of those judgments. (e.g.
ambient air quality standards)
• Has proven to be problematic (e.g. biased
selection of experts).
British Approach
• Multi-stakeholder commissions with noted
academics and major interest groups.
Collective credibility.
• Unlike the US approach, risk assessment
and risk management are examined together.
(science and policy)
• With respect to lead and the risk to
children’s health, the commission was
equivocal in its findings, reporting no
persuasive evidence of a risk.
• Described the risk in qualitative
(“small”) rather than numerical terms.
• Yet they recommended that lead
additives be phased out of gasoline.
• Interpreted the Precautionary Principle
as “dangerous until proven safe”: a way
of dealing with uncertainty.
USA vs Britain:
Administrative and Political
Cultures
• Regulatory processes:
– Britain – consensual, non-litigious, relatively
closed.
– USA – adversarial, litigious, open.
• USA – regulatory process more open to
political pressures. Quantitative analysis
becomes a “lifeline to legitimacy”.
Slovic Article
• “the goal of informing the public about
risk issues – which in principle seems easy
to attain – is surprisingly difficult to
accomplish.”
• Why?
Three Categories of Reasons
• Limitations of risk assessment.
• Limitations of public understanding.
• The problems of communicating complex
technical information.
Limitations of Public Understanding
• The public’s perceptions of risk are
sometimes inaccurate.
– Memorable past events
– Imaginability of future events
– Media coverage can influence
– Overestimate dramatic causes of death.
How good are the public at estimating
risks?
• Rare causes of death tend to be
overestimated while common causes are
underestimated.
• Example: Most people think their chances
of dying of a heart attack are about 1 in 20.
The truth is closer to 1 in 4.
• Judgmental bias - people’s predilection for
exaggerating their personal immunity
from many hazards. “Optimistic bias”.
• Risk information may frighten and
frustrate the public.
– Simply mentioning a risk may enhance
perceptions of danger.
– Even neutral information may elevate fears
(e.g. transmission lines)
– People may try to reduce their anxiety about
a hazard and its uncertainty by denying its
existence or in their minds making the risk
smaller than it is.
• Strong beliefs are hard to modify.
“strong beliefs about risks, once formed,
change very slowly and are extraordinarily
persistent in the face of contrary
evidence”. Vincent Covello
People gravitate toward, and tend to accept,
evidence that supports their pre-existing
beliefs on the subject.
• When people lack strong opinions they
can be easily manipulated by presentation
format.
– “framing effects”
– Ethical issues
Expert versus Lay Conceptions of Risk
• Risk experts employ a technical evaluation
of risk:
Risk = Probability x Consequences
• The public applies a broader conception
of risk that also incorporates:
accountability, economics, values, and
trust.
• As our technical control has increased in
the technological age, our social control
has decreased.
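The expert–lay contrast above can be made concrete. A minimal sketch of the slide's technical formula, Risk = Probability x Consequences (all figures below are hypothetical, chosen only for illustration):

```python
# Technical conception of risk from the slide:
#   Risk = Probability x Consequences
# All figures below are hypothetical, chosen only for illustration.

def technical_risk(probability: float, consequences: float) -> float:
    """Expected loss: event probability times the magnitude of its consequences."""
    return probability * consequences

# Two hazards with identical expected loss...
frequent_minor = technical_risk(0.5, 2)    # common event, small consequence
rare_severe = technical_risk(0.0625, 16)   # rare event, large consequence
print(frequent_minor == rare_severe)  # True

# ...yet the public typically judges them very differently, weighing dread,
# voluntariness, catastrophic potential, and trust alongside the numbers.
```

The point of the sketch is the formula's blind spot: it assigns these two hazards the same number, while the broader lay conception of risk does not.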
• “Most citizens’ calls for ‘scientific’
decisions, in reality, are a request for
something a bit broader: in most
cases, a call for ways of assuring that
‘the human element’ of societal decision
making will be not just technically
competent, but equitable, fair, and
responsive to deeply felt concerns”
Freudenburg
Should zero risk be the goal?
As Harvard professor John Graham has said,
“We all want zero risk. The problem is if
every citizen in this country demands zero
risk, we’re going to bankrupt the country”.
• Perceptual cues (e.g. odor) may signal
more ominous events.
• Risk as a ‘collective construct’ - cultural
theory of risk.
• Studies have found cross-national
differences in risk judgments.
• Value orientation influences risk
perceptions as do worldviews.
The Mad Cow Crisis
• In March 1996, the British government
announced that scientists had linked
Creutzfeldt-Jakob disease with the human
consumption of cattle with bovine
spongiform encephalopathy (BSE) or “mad
cow disease”.
• For almost a decade British authorities
had insisted there was no risk of BSE
being transferred to humans.
• With the March 1996 announcement,
the British beef market collapsed
virtually overnight.
• The EU banned the export of British
beef.
• Consumption of all beef in countries
such as France, Germany and Japan
dropped significantly.
The scientific question at the heart of
the BSE crisis:
Can humans develop CJD after eating
beef from cattle infected with BSE?
In other words, can the infectious
agent jump the species barrier?
Public Perception
• That the British government was more
interested in propping up the beef
industry than in admitting that there
might be a risk, however small that risk
might be.
• People stopped buying beef because they
no longer trusted the government.
Risk Characteristics of the Mad Cow
Disease Crisis
• High level of dread of the disease.
• Scientific uncertainty.
• Possible involvement of children.
• Catastrophic potential.
• Non-voluntary exposure.
• Lack of trust in decision-makers.
• A history of food safety controversies.
What mistakes did the British
Government make in handling
the issue of mad cow disease ?
What lessons can be learned
from the mad cow crisis?