Transcript Slide 1
Risk Communication Challenges
for Nanomaterials:
A Taxonomy (Typology)
within a Risk Analysis Framework
Prof. Jennifer Kuzma,
co-PI
Associate Professor, U of MN
NIRT ITox Meeting
August 28, 2008
Outline
• Challenges in risk communication from a risk analysis
standpoint
• Context and framing possibilities
• Discussion of integrating concepts across disciplines: risk analysis, risk communication, science, public policy, and public engagement
• Ties to NSF-funded research
Risk Communication goals
• Risk communication is an exchange of information about risk
among decision makers, stakeholders, and the public which is
intended to supply people with the information they need to
make informed and independent judgments about risk
– Morgan, G. et al. 2002. Risk Communication: A Mental Models
Approach. Cambridge, MA: Cambridge University Press. (p. 4)
• Not a “deficit model”, but an enabling model
• Advice and answers
• Numbers
• Context and Framing
A Risk Analysis Framework
[Framework diagram: Risk Assessment (hazard identification, exposure assessment, dose-response assessment, risk characterization); Risk Management (including mitigation); Risk Policy and decision making; Risk Perception; Risk Communication; Public Engagement]
Risk communication should be
the hub of policy
Powell and Leiss 1997, Mad Cows and Mother’s Milk: The Perils of Poor Risk Communication
A Risk Policy Problem
Powell and Leiss 1997
Risk Communication Challenges
Powell and Leiss 1997
An (outdated) model of risk communication
Message modulators:
• Credibility of messenger
• Cultural, social, and operational factors
• Channels of communication
These filters can cause distortion and unintended messages.
Knuth, B.A. (1990). North American Journal of Fisheries Management 10(4): 374-381.
IRGC report
Where do nanomaterials fit?
• Depends on type of nanomaterial—product dependency?
• High ambiguity—more need for deliberation (context and framing approach to risk communication)
• Advice and answers, numbers, context/framing?
IRGC 2006
Environmental Risk Assessment for Nanomaterials
Monitoring, adaptive feedback, and a guiding force for risk research
[Framework diagram: Hazard Identification and Exposure Assessment draw on sources of nanomaterials; material-related characteristics; transport and fate and geochemical processes; biological uptake mechanisms; toxic effects on individual species; and interactions between species and geochemical processes. Characterization of human or ecological effects, including dose- or stressor-response assessment, feeds Risk Characterization: human or ecological risks with different population or species endpoints. The whole process rests on problem formulation and a deliberative analytical process, with input from experts, stakeholders, and the public (interested and affected parties), and is ongoing and iterative.]
Steps to risk communication or deliberation (Morgan et al. 2002)
• Create expert model (influence diagram)
  – Diagram allows representation of knowledge of experts from diverse disciplines
  – “The influence diagram allows a quick, visual check of the factors that must be covered when evaluating audience information needs”
• Conduct open-ended (mental-model) interviews
  – People’s beliefs about hazard and risk in their own terms
  – How well do mental models correspond to expert model in influence diagram? (see the sketch after this list)
  – Identify issues
• Conduct structured initial interviews
  – Explore issues
  – Larger groups
• Draft risk communication or deliberation method
  – Casman and Morgan (2008)
  – Neutral voice
  – Which knowledge gaps need filling
• Evaluate communication or deliberation method with individuals selected from target population
  – “Lay evaluation”
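The correspondence check between lay mental models and the expert influence diagram can be made concrete with a small sketch. The following is illustrative only, under assumed node names and interview codes; it is not the coding scheme of Morgan et al. (2002).

```python
# Hypothetical sketch: compare concepts coded from open-ended (mental-model)
# interviews against the nodes of an expert influence diagram to flag
# knowledge gaps. All node names and respondent codes are invented.

expert_nodes = {
    "source of nanomaterial",
    "exposure route",
    "dose",
    "toxic effect",
    "susceptible population",
}

# Concepts each (hypothetical) respondent mentioned during the interview.
lay_mentions = {
    "respondent_1": {"toxic effect", "exposure route"},
    "respondent_2": {"toxic effect"},
}

def coverage_and_gaps(expert, mentioned):
    """Return the fraction of expert-model nodes mentioned and the missing ones."""
    covered = expert & mentioned
    return len(covered) / len(expert), expert - mentioned

for rid, mentioned in lay_mentions.items():
    frac, gaps = coverage_and_gaps(expert_nodes, mentioned)
    print(f"{rid}: covered {frac:.0%} of expert model; gaps: {sorted(gaps)}")
```

Gaps flagged this way would feed the “which knowledge gaps need filling” step when drafting the communication or deliberation method.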
Expert model of risk: a puzzle
Morgan 2005. Risk Analysis 25(4): 1621-1635
[Figure: expert influence diagram, with the toxic-effects portion magnified]
Equity in risk discussions
• Expert vs. non-expert
• Barriers, filters, gaps, and different mental models
• Can we level the playing field so that we get to true differences in attitudes rather than differences in reception and understanding of information?
Gaps in Information
• Even Experts do not have the information
• Any individual study about risk does not put
the puzzle together
• How to communicate with stakeholders and
laypersons about risk based on one or just a
few pieces of the puzzle?
Possible approaches to test
• Map expert model onto layperson model
– Follow standard Morgan et al. approach
• Map expert studies into expert models:
– Forgo “advice, answers,” and “numbers” for
“context and framing”
– Use expert framing of risk for nanomaterials to
type individual pieces of information
– Visual and contextual translation for non-expert audience
Possibilities to Explore for Better
Risk Communication
• Objective risk
– Database of studies mapped into risk assessment (and risk
analysis) framework
• Levels to database based on user
• (part of U of MN NSF CEIN proposal)
– For starters, use what, when, who, why, where questions to enhance communications about risk (and toxicology or dose-response)
• Subjective risk
– Listen and learn
– Incorporate concerns and values into risk analysis framing
of problems
– Deliberative democracy and public engagement approaches
Puzzling together risk
U of MN CEIN grant proposal 2007
[Figure: individual studies (Study a, Study b, Study c, Study d) as puzzle pieces within the risk analysis framework]
Context and Framing: information comes in bits and pieces
How can we enhance risk communication for individual studies?
Clearinghouse of EHS and Risk Studies
• Taxonomy of EHS information in Risk Analysis framework
  – Level 1: Public, Educators, Stakeholders
  – Level 2: EHS and other interested experts
  – Level 3: Nanomaterial manufacturers
  – Level 4: Nanoinformatics
• A possible communication tool?
  – Web-based information and framing tool for other printed or verbal materials (a minimal sketch follows below)
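One way to picture the leveled clearinghouse is as a single study record whose fields are exposed selectively by audience level. The sketch below is a minimal illustration under assumed field names and level assignments, not the actual design proposed to CEIN.

```python
# Illustrative sketch of level-based views onto one EHS study record.
# Field names, level assignments, and the example study are hypothetical.

STUDY = {
    "citation": "Example et al. (year), journal",
    "plain_language_summary": "What was studied and why it matters",          # Level 1
    "hazard_identification": "in vitro toxicity of an example nanoparticle",  # Level 2
    "exposure_context": "occupational inhalation",                            # Level 2
    "manufacturing_notes": "relevance to an example production process",      # Level 3
    "machine_readable_descriptors": {"material": "example", "size_nm": 20},   # Level 4
}

# Which fields each audience level sees: Level 1 public/educators/stakeholders,
# Level 2 EHS experts, Level 3 manufacturers, Level 4 nanoinformatics.
LEVEL_FIELDS = {
    1: ["citation", "plain_language_summary"],
    2: ["citation", "plain_language_summary", "hazard_identification", "exposure_context"],
    3: ["citation", "hazard_identification", "exposure_context", "manufacturing_notes"],
    4: list(STUDY),  # everything, including machine-readable descriptors
}

def view(study, level):
    """Return only the fields appropriate for the requested audience level."""
    return {field: study[field] for field in LEVEL_FIELDS[level]}

print(view(STUDY, 1))
print(view(STUDY, 4))
```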
[Diagram: Context and Framing, Risk Analysis, Risk Assessment, Research]
Typology of questions about risk
• Why do we care about the risk?
  – Risk analysis: subjective risk dominates, although objective risk plays a role.
  – Examples: social concern about hazard; previously demonstrated harm; societal value placed on endpoints of harm; hazard with parallels to those with known significant risk; political interest.
• What is causing the harm?
  – Risk analysis: hazard identification step primarily, although overlap with exposure assessment. Subjective and objective risk.
  – Examples: biologically based nanomaterial (e.g., nanoparticle made of protein or DNA, viral bio-nanotechnology); chemically manufactured or engineered nanomaterial; modified, natural nanomaterial; free-floating nanoparticle in air, soil, water, etc.; bound nanomaterials; modified state of nanomaterial from environmental travel; mixtures of nanomaterials and other chemicals.
• Who (or what) is potentially harmed?
  – Risk analysis: endpoints of concern for exposure assessment and hazard identification; subjects of dose-response studies and ultimately risk characterization. Subjective and objective risk.
  – Examples: human populations generally; higher organisms; ecosystems generally; microbes in water or soil; susceptible populations of humans (e.g., children, elderly); particular populations of humans (e.g., immigrants, disadvantaged groups, indigenous peoples).
• How might the harm occur?
  – Risk analysis: hazard origins and exposure routes. Objective risk dominates, but subjective risk plays a role.
  – Examples: intentional release (e.g., pesticides, environmental remediation); non-compliance with material disposal; unintentional leakage or leaching; accidental release; through air, water, soil, products, or food; by inhalation, dermal, or ingestion routes.
• Where does the harm take place?
  – Risk analysis: endpoint and exposure environments. Subjective and objective risk.
  – Examples: workplace; human natural environment; human built environment; ecosystems (wetlands, agroecosystems, etc.); homes (e.g., from consumer products).
• When does the risk occur?
  – Risk analysis: exposure timing and hazard identification. Objective risk dominates, although subjective risk plays a role.
  – Examples: seasonal; acute; chronic; dependent on temporal compliance with oversight; dependent on environmental conditions.
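The typology can also serve as a tagging scheme for individual studies, which is what the clearinghouse and “puzzle” framing rely on. The sketch below is purely illustrative; the example study and its tags are invented.

```python
# Hedged sketch: the why/what/who/how/where/when typology expressed as a small
# data structure and used to tag a single (hypothetical) study, so the
# questions it leaves open can be flagged in communication about it.

TYPOLOGY = {
    "why":   "Reason for concern (subjective risk dominates)",
    "what":  "Hazard identification, with overlap into exposure assessment",
    "who":   "Endpoints of concern; subjects of dose-response studies",
    "how":   "Hazard origins and exposure routes",
    "where": "Endpoint and exposure environments",
    "when":  "Exposure timing and hazard identification",
}

# A single study rarely answers every question; tag only what it addresses.
example_study_tags = {
    "what":  "chemically manufactured nanomaterial",
    "who":   "microbes in water or soil",
    "where": "ecosystems (wetlands)",
}

def open_questions(tags):
    """Typology questions the study does not address: candidate gaps to note."""
    return [question for question in TYPOLOGY if question not in tags]

print("Study addresses:", sorted(example_study_tags))
print("Open questions to flag:", open_questions(example_study_tags))
```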
Trust-credibility
• Address subjective risk component (deliberation, engagement)
• But recognize that subjective (social?) and objective (epistemic?) dimensions of risk are not that distinct
– Fischhoff, B., S.R. Watson, and C. Hope. 1984. Defining risk.
Policy Sciences 17: 123-139.
– Kuzma and Besley 2008. Ethics of Risk Analysis and Regulatory
Review: From Bio to Nano. Nanoethics in press, online.
• Prevent biased (and exaggerated) communication of individual
study results
– E.g. GE Corn and Monarch butterfly story, Losey article and first
round of field trials (see Pew Initiative on Food and Biotechnology
report, 2002)
• “Neutral” communication bodies for objective information and
subjective risk discussions
Ethics of Risk Analysis and Regulatory
Review: From Bio to Nano
Kuzma and Besley 2008, Nanoethics
Factors in Risk Comparisons & Perception
• “Risk” not necessarily equal to the # of fatalities
• Experts perceive differently
• Laypersons
  – Benefits, trust, affect important
  – Product dependent for nanofood
  – Siegrist 2007, 2008
• “Thus, disagreements about risk should not be expected to evaporate in the presence of evidence” (Slovic et al. 1990)
• “Risk Perception Factors”
  – Natural/Man-Made
  – Ordinary/Catastrophic
  – Voluntary/Involuntary
  – Delayed/Immediate
  – Controlled/Uncontrolled
  – Old/New
  – Necessary/Luxury
  – Regular/Occasional
Rasmussen, Slovic, Fischhoff, et al. 1990, in Readings in Risk
Stages of Risk Communication
Morgan et al. 2002
• Given “fuzziness” of risk and risk analysis itself, trust, and perception factors (S. Priest)
• Need to move beyond stages, separation of objective/subjective
• To integration, enabling, contextual, analytical-deliberative process (NRC 1996)
NSF-NIRT Grant
Research questions:
1. What factors are most significant in affecting public perception of the risks of applied nanosciences?
2. What, if any, relationship exists between the modes of public deliberation, sources of information (e.g., use of new media), and the effects of new information on perceptions of the risks of applied nanosciences?
Approach:
• Refine and develop key variables and instruments (stage 1)
• Determine the contribution of variables to perceived risk—Delphi rounds (stage 2; see the sketch after this list)
  – Will framing help level the playing field between experts and non-experts?
• Elucidate the effect of civic engagement and new media on risk perception (stage 3)
• Verify key variables related to risk perception—focus groups on agrifood nano (stage 4)
• Outreach to the public (stage 5)
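As a rough illustration of the Delphi rounds in stage 2: each round collects panelists’ ratings of candidate risk-perception variables and feeds a group summary back before the next round. The variables, ratings, and summary statistics below are invented; this is not the project’s actual instrument.

```python
# Minimal Delphi-style aggregation sketch (hypothetical variables and ratings).
from statistics import median, quantiles  # quantiles: Python 3.8+

round_1_ratings = {
    "trust in regulators": [70, 85, 60, 90, 75],
    "perceived benefit":   [80, 65, 70, 85, 60],
    "media framing":       [50, 55, 40, 65, 45],
}

def summarize(ratings):
    """Group feedback per variable: median and interquartile range."""
    summary = {}
    for variable, scores in ratings.items():
        q1, _, q3 = quantiles(scores, n=4)
        summary[variable] = {"median": median(scores), "iqr": (q1, q3)}
    return summary

# Feedback that would be shown to panelists before the next round.
for variable, stats in summarize(round_1_ratings).items():
    print(variable, stats)
```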
Other NIRT—Oversight Lessons for Nanotechnology
Evaluation of six historical models
National Science Foundation NIRT Grant SES-0608791
(Wolf, PI; Kokkoli, Kuzma, Paradise, Ramachandran, Co-PIs).
Step 1: Develop exhaustive list of criteria
Qualitative Literature Analysis, informed by Expert and Stakeholder Consensus
Step 2: Criteria Ranking
Of Importance for Oversight Model Evaluation
Expert and Stakeholder Elicitation, also Informed by Literature Analysis
Criteria fall into four categories
for oversight:
Development, Attributes,
Outcomes, and Evolution
Step 3: Application of Criteria to Each Historical Model:
Expert and Stakeholder Scoring of How Well Each Model Performs on Each Criterion
Also informed by semi-structured expert and stakeholder interviews
Step 4: Evaluation of Relationships Among Criteria
Expert and stakeholder semi-structured interviews
Analysis of Criteria Scores: relationships of criteria within historical model
Development of Influence Diagrams
Step 5: Comparison across historical models
Qualitative and Quantitative analysis
Comparing criteria scores across models
Comparing influence diagrams across models
Conclusions for Oversight: What makes an effective oversight system?
Do certain ways of developing oversight frameworks lead to certain outcomes?
Do specific attributes of oversight frameworks lead to certain outcomes?
How does development affect system attributes?
What are appropriate oversight approaches for nanobiotechnology?
Other NIRT—Lessons for Nanotechnology
Evaluating Oversight Models for Nanobiotechnology
National Science Foundation NIRT Grant SES-0608791
(Wolf, PI; Kokkoli, Kuzma, Paradise, Ramachandran, Co-PIs).
[Diagram: quantitative, qualitative, and normative (ethical, policy, legal, and risk) analysis, drawing on historical literature analysis, expert and stakeholder interviews, and expert elicitation, addresses four questions: How was the oversight model developed? What are its attributes? What are its outcomes? How do the attributes evolve over time?]
Expert elicitation: experts asked to rank how six case studies of oversight have performed, on a scale of 1-100, on 28 criteria.
Relationships among criteria (p < 0.0016)
Relationships: attributes of GEOs oversight to public confidence as an outcome (p < 0.05; R approx. 0.5 for all)
[Influence diagram: Public Input (D4), Data requirements (A9), Incentives (A14), and Public Input (A19) linked to Public Confidence (O24)]
Hypothesis: public input is important for the outcome of public confidence in oversight systems.
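The reported p-values and R values come from the project’s own elicitation data. The sketch below only illustrates the kind of calculation involved: correlating expert scores (1-100) on an oversight attribute with scores on the public-confidence outcome. All numbers are invented.

```python
# Illustrative correlation of (made-up) criterion scores with an outcome score.
from statistics import correlation  # Pearson's r; Python 3.10+

scores = {
    "public_input_D4":       [40, 55, 60, 70, 80, 65],
    "data_requirements_A9":  [35, 50, 45, 75, 70, 60],
    "public_confidence_O24": [30, 50, 55, 65, 75, 60],
}

outcome = scores["public_confidence_O24"]
for criterion in ("public_input_D4", "data_requirements_A9"):
    r = correlation(scores[criterion], outcome)
    print(f"{criterion} vs public_confidence_O24: r = {r:.2f}")
```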
Messages and Synergies among
NSF-NIRTs and fields of
investigation
• Consider attributes of oversight as
possible factors in perception of new
technologies (risk perception too)
• Find relationships among attributes of oversight and outcomes such as public confidence, health and safety, and environmental impacts
Thank you
• NSF NIRT grant Intuitive Toxicology
and Public Engagement (#0709056)
• (PI: David M. Berube; co-PIs: Dietram A. Scheufele, Jennifer Kuzma, Kevin Elliott, Pat J. Gehrke, V. Colvin)
• Contact info, J. Kuzma
– 612-625-6337, [email protected]