Research Ethics 2, The Common Rule


Foundations of Research

Research Ethics:
• The Common Rule
• The Belmont Report
This is a PowerPoint Show.
• Click "slide show" to start it.
• Click through it by pressing any key.
• Focus & think about each point; do not just passively click.
• To print: click "File," then "Print…"; under "print what," click "handouts (6 slides per page)."

© Dr. David J. McKirnan, 2014
The University of Illinois Chicago
[email protected]
Do not use or reproduce without permission
Recruitment flyer for Stanley Milgram's study of obedience. From Boingboing.net.
The "Common Rule" criteria for Human Subjects Protection

Research institutions are mandated to have boards – called Institutional Review Boards – that provide ethical review of all federally funded research.

That review follows seven elements, called the Common Rule.

Most universities apply the same standard to all research, whether federally funded or not.
The Common Rule

• Minimize risks
• Risks must be reasonable
• Recruit participants equitably
• Informed consent
• Document consent
• Monitor for safety
• Protect vulnerable participants & maintain confidentiality
Minimization of risks

Core issue:
• Risks should not exceed those of everyday behavior.

Potential harms from research:
• Simple inconvenience, loss of time, money…
• Withholding care or treatment
• Use of deception in experimental manipulation
• Physical harm
• Psychological harm
• Social or political harm
Potential harms from research

• Simple inconvenience:
  o Any research study, even an innocuous survey, has some "costs" to the participant: time, distraction, possible $ for transportation…
  o All research must be justified by its potential contribution to science or society.

• Withholding care or treatment:
  o Direct: Tuskegee-like withholding of known treatment.
  o Indirect: clinical trials, wait-list designs, the placebo arm in a treatment trial.

• Use of deception in experimental manipulation:
  o Can people provide informed consent if they are deceived?
  o Possible embarrassment or negative shift in self-perception.
  o Deception erodes trust & confidence in social science.
Potential harms from research: Withholding care

• Direct:
  o The Tuskegee trial was a glaring example of the intentional or direct harmful withholding of care.
  o When a known effective treatment was discovered it was withheld specifically to examine disease progression among untreated men.

• Indirect:
  o In clinical trials (of a new drug, behavioral treatment…) the control group does not receive the intervention.
    - Placebo control: a spurious or empty treatment.
    - Wait-list control: the intervention is phased in only after the experimental group has finished the treatment process.
  o These approaches are considered ethical when:
    - The informed consent makes the trial structure clear;
    - It is unknown whether the experimental treatment "works."
Potential harms from research: Physical harm

The history of behavioral and biomedical research contains many instances of intentional or unintentional physical harm. (Wikipedia has a good review here.)

• Direct or intentional harm:
  o Participants are intentionally exposed to (potentially) toxic or harmful drugs or physical circumstances.
  o The goal of the research is to examine the human effects of noxious conditions.

• Indirect or unintentional harm:
  o Participants are exposed to conditions that may be harmful.
  o The harmful exposure is not a goal of the research, but nonetheless has a high probability of occurring.
Potential harms from research: Physical harm – direct / intentional harms

• Cold War radiation experiments:
  o During the 1950s – 60s, in preparation for a potential nuclear war, the U.S. military exposed hundreds of service men and women to varying doses of radiation from nuclear tests.
  o Soldiers (and civilians) were tested for radiation exposure effects, both to assess harms and to convince the public that nuclear war could be "safe" for civilian populations.

Click for archival footage on YouTube.
Soldiers being exposed to a nuclear explosion at the Nevada Test Site in 1951.
http://www.teoti.com/photography/123738-historicalpictures.html
Image: imgur.com, public domain
Potential harms from research: Physical harm – direct harms

• Cold War radiation experiments
• Army psychoactive drug research:
  o The U.S. Army performed a variety of harmful drug studies before ethical standards were firmed up during the 1990s.
  o In the 1960s through the 70s, service members and, in some cases, prisoners were given varying doses of LSD and other psychotropic drugs to test their possible use as psychoactive chemical weapons.
  o Most of these exposures were with participants' foreknowledge and consent. At other times drugs were slipped into food to test their effects when unanticipated.

Click below for an excellent piece in The New Yorker.
Potential harms from research: Physical harm – direct harms

• Cold War radiation experiments
• Army psychoactive drug research
• Drug trials in prison populations:
  o The great majority of psychoactive drugs used today – for sleep disorders, anxiety or depression, psychotic symptoms… – were initially tested on prisoners.
  o Other testing administered known or potential toxins to prisoners to gauge their effects.
  o Pharmaceutical companies would work with state officials to gain access to prisoners. Prisoners "volunteered" for the studies, although they were rarely fully informed about the agents they were taking.
  o The shameful history of this testing led to very strict regulations in the 1990s. Today there is renewed debate about the prospect of prisoner testing. Click the image for a NYTimes.com article.
Potential harms from research: Physical harm – direct harms

• Cold War radiation experiments
• Army psychoactive drug research
• Drug trials in prison populations
• Some forms of animal research:
  o Animal research has long been contentious for both research and commercial safety. As with prisoner research, animal testing has an ugly history, but has made invaluable contributions to human health.
  o PETA, of course, opposes any animal testing (click image).
  o NIH and organizations such as the Foundation for Biomedical Research argue in defense of responsible animal testing.
  o Scientific American calls for a ban on animal testing for cosmetics. Many companies advertise their products as animal-testing free, although many use ingredients shown to be safe through prior animal work.
  o The Scientist has an excellent series on animal testing, including the increasing number of alternatives to animals.

Although the U.S. Food and Drug Administration does not require animal safety testing for cosmetics – a category that includes skin cream, perfume, makeup and shampoo – animal tests are still used. Credit: Thinkstock
Potential harms from research

For more perspectives on animal research, click the images.

"Second Thoughts of an Animal Researcher," John P. Gluck, New York Times, Sept. 2, 2016.
"Is Animal Research Ever Ethical? Rebuttals and comments on Gluck," New York Times, Sept. 12, 2016.

University of California, Berkeley, 1967. Credit: Henri Cartier-Bresson/Magnum Photos
A University of Wisconsin-Madison researcher with a pregnant monkey infected with the Zika virus. Credit: Scott Olson/Getty Images
Potential harms from research: Physical harm – indirect harms

• Behavior induced by the experiment:
  o Experimental conditions that encourage risk, alcohol & drug use, smoking…

Many studies on health behavior assess the effects of different conditions – say, modeling by attractive peers – on behaviors such as alcohol or tobacco use.

These studies can be ethical if participants are carefully screened (e.g., for no evidence of alcohol abuse…) and monitored during the experiment.

Ethical constraints must be very strict in this area, particularly if deception is involved. (For example, in the "Balanced Placebo" design for examining alcohol-related behavior, participants in one cell of the design expect to not receive an alcoholic beverage, but actually do.)
Potential harms from research: Psychological harm

• Discomfort or pain:
  o Experiments that induce anxiety or negative moods.

Many studies on issues such as emotions and alcohol use, coping with stress or change, or adjusting to novel environments can temporarily induce negative reactions or moods.

These studies remain ethical given that they have careful:
• Screening, to eliminate highly vulnerable participants;
• Monitoring, to assess participants' state during the study;
• Debriefing, to provide information and a chance to return to a normal state.
(The same is true when administering alcohol…)
Potential harms from research: Psychological harm

• Discomfort or pain: experiments that induce anxiety or negative moods.
• Changed self-perception: e.g., leading participants to believe they were capable of harming others.

One of the most famous research programs crossing this ethical line was the obedience studies conducted by Stanley Milgram in the 1960s.

Images: Simplypsychology.org. Click image for an excellent overview from SimplyPsychology.
Potential harms from research: changes in self-perception

• In 1962, in the wake of the Nuremberg trials of Nazi war crimes, Stanley Milgram wanted to discover how common citizens could be led to commit clearly evil acts.
• He recruited people for a "memory study," where they thought they were serving as instructors or research assistants.
• Milgram hired actors to portray the ostensible "learners."
• Milgram thus used deception in his recruitment materials and study instructions.
  o Any deception study is ethically problematic because participants are not providing accurate informed consent.
  o Studies that use deception to hide hypotheses from participants typically have no other potential study harms.
  o Milgram's use of deception was unethical; participants consented not knowing they would be subjected to substantial stress.
Potential harms from research: changes in self-perception

• The "learner" was attached to (phony) electrodes, in a separate room from the "instructor," who sat with the experimenter.
• Each time the "learner" (an actor…) missed an item on a memory task, the "instructor" (the actual participant) was told to administer increasingly intense shocks.
• The study assessed how high a shock participants would administer as the "learner" consistently gave wrong answers.
• Many participants were obedient enough to take the shock to levels they thought were lethal, while the "student" showed increasingly extreme discomfort.
• Bottom line: many ordinary people will harm others when commanded to by an authority.
• Of course, participants then had to live with the knowledge that they were willing to follow instructions to that extreme.
  o Or did they? A recent re-analysis suggests that participants were not that disturbed by the study.

Click for Dar Williams' song about the Milgram experiment and fascism in Indy Magazine.
Potential harms from research: changes in self-perception

• The Stanford Prison Study (click for the study web site):
  o In 1971 the U.S. had been subject to several tragic prison riots. Political debate focused on whether criminals – the great majority Black and poor – were cognitively or psychologically "impaired" to the extent that they could not function in society.
  o In contrast to the "individual pathology" perspective, sociological research at the time described the effects of "total institutions," such as prisons or psychiatric hospitals, that themselves created deviant behavior.
  o Philip Zimbardo set out to test this experimentally, by constructing a realistic prison in the Stanford Psychology Department basement and randomly assigning male undergraduates to the roles of prisoners or guards.
  o Consistent with the hypothesis, the students – all affluent, white, well educated and screened for psychological health – acted little different than those at Attica State Prison, where riots had taken place.
  o As with the Milgram experiment, participants were left with the knowledge that, whether prisoner or guard, they were capable of arbitrary brutality.
Potential harms from research: Psychological harm

• Discomfort or pain: experiments that induce anxiety or negative moods.
• Changed self-perception: e.g., leading participants to believe they were capable of harming others.
• Embarrassment: loss of confidentiality or privacy.
  o Many studies address coping or risk behavior in populations that may be stigmatized.
  o For example, the author conducted AIDS research among gay and bisexual men from the 1980s forward. During much of that time being "outed" as gay/bisexual could lead to harassment, job loss, or actual violence. As more men (and women) became infected with HIV, that status became a second important source of stigma.
  o Loss of confidentiality through sloppy record keeping, or even recognizing a research participant on the street, could have real adverse consequences.
  o Research with potentially stigmatized people – whether due to poverty, criminal background, abuse history, or otherwise – requires very careful protections to ensure that vulnerable participants are not socially harmed just by being identified as research participants.
Potential harms from research

• Simple inconvenience
• Withholding care
• Physical harm
• Psychological harm
• Social or political harm:
  o Portraying social groups in a negative light, e.g.:
    - Interpretations of research results that suggest lower intelligence among lower socio-economic status or minority participants;
    - "Pathologizing" minority groups by problem-focused research and reporting: research funding addressing poor, GLBTQ, minority and other populations typically focuses on problems (drug abuse, domestic problems, criminality…); although these are important topics, the consistent portrayal of a community in "problem" terms can enhance negative stereotypes.
  o Ignoring, or relying too strongly on, some groups: e.g., women or minorities in clinical research.
Potential harms from research

What makes research vulnerable to causing harm?

• Strong financial pressure for study results:
  o The ability to market clinical interventions – new drugs, medical devices, behavioral programs – depends upon positive results of efficacy trials.
  o This creates substantial monetary pressure to produce positive results, potentially making ethical considerations secondary.

• Publication & grant pressure:
  o Advancement in the research industry – promotions, recognition, getting grants or contracts ($) – requires that investigators consistently publish positive results.

• Simple bias or prejudice:
  o Most researchers are Caucasian, with middle or upper-middle class backgrounds;
  o The "Ivory Tower" syndrome: it is common for researchers to have little direct experience with the populations they are studying.

• Lack of institutional controls:
  o Up-front ethical reviews by an active Institutional Review Board, study monitors, and ongoing monitoring for ethical compliance can prevent potential harms.
Potential harms from research

Prevention of research harms:

• Independent & rigorous Institutional Review Board [IRB]:
  o An IRB is a set of active researchers who review and monitor proposals for their adherence to the Common Rule.
• Diversity among investigators and research centers.
• Study monitoring:
  o We will see examples of good and poor monitoring below.
• Careful pilot testing & monitoring of study manipulations.
• Informed consent and debriefing to:
  a) fully inform participants about the study;
  b) eliminate any imposed state, e.g., temporary stress, negative affect…

NIH.gov, public domain
Reasonableness of risks

Core issue:
• Whether a risk to participants is reasonable rests on a cost–benefit analysis:
  o Does the study present risks greater than everyday life? How much greater, for how long?
  o Are the potential harms – "costs" – justified by the likely knowledge to be gained, i.e., potential benefits?

• Potentially harmful research may be justified if it provides invaluable data, e.g.:
  o Milgram obedience studies;
  o Zimbardo prison experiment;
  o Intrusive animal studies;
  o Studies that pay people to take medical or behavioral risks.
Reasonableness of risks

• Research with little harm may still be unjustified if it will not provide useful data:
  o Research always requires time, effort, potential embarrassment…;
  o Research that is trivial or incompetent may be inherently unethical for those reasons.

• Weighing risks against potential benefits is difficult and complex…
  o We lack a scientific metric for evaluating degrees of potential harm.
  o The benefits of research are rarely guaranteed.
  o Even research with scientific or applied benefit often has no direct benefit for the study participants themselves.

The Zimbardo Prison Study subjected college students to real stress… but provided valuable data about how social roles and the physical setting affect behavior. Click the image for an overview of the study.
© 1999-2015, Philip G. Zimbardo, http://www.prisonexp.org/the-story/
Reasonableness of risk

Causes of unreasonable research risk:
• The "costs" of participation may not be fully understood by the investigator;
• Benefits may be overstated, or not framed in terms of the target population;
• Publication & grant pressure to recruit participants at any cost.

Prevention:
• Independent & rigorous Institutional Review Board [IRB];
• Community Advisory Board [CAB]: a board of people from the target population;
• Pilot testing to assess the actual risks of the research.
Participant Recruitment

Equitable recruitment:
• People potentially affected by research must:
  o Have the opportunity to join the study;
  o Have the opportunity to withdraw once enrolled;
  o Not be coerced or deceived into enrollment.

• How do we ensure that all social groups are represented?
  o Some groups are less likely to enlist than are others;
  o How much should investigators try to overcome people's reluctance to join a study?
  o At what point does that become coercive?

• Some groups – drug abusers, those in poverty – will respond to even a small monetary incentive to join even high-risk studies.
  o Is offering a monetary incentive to those individuals coercive?
  o Are they treated inequitably by not being offered money?
Participant Recruitment

Potential problems in recruitment:

• Arbitrary bias in sampling:
  o Excluding – or only including – groups for reasons unrelated to the research protocol:
    - Convenience or ease of recruitment, such as students or prisoners;
    - Simple bias, such as toward men in medical research.
  o Using highly unrepresentative samples:
    - E.g., over-reliance on students or "for hire" internet lists in social / behavioral research;
    - Over-reliance on poor people (who strongly respond to financial incentives) in medical or behavioral research.
Participant Recruitment

Potential problems in recruitment:

• Arbitrary bias in sampling: excluding – or only including – groups for reasons unrelated to the research protocol; using highly unrepresentative samples.

• Coercive payments or incentives:
  o Culturally or socio-economically coercive:
    - Financial or similar incentives (e.g., burial insurance during Tuskegee).
    - Provision of otherwise unavailable human or health services.
  o Potential loss of benefits:
    - Recruitment by one's own physician, teacher, supervisor…
    - These create a dual-role situation where the recruiter is also in a position of power or influence.
    - Although there may not be direct coercion, the individual may fear a loss of good favor.
Participant Recruitment

Potential problems in recruitment:

• Arbitrary bias in sampling.
• Coercive payments or incentives.

• Deceptive descriptions of the study:
  o Simple misinformation about potential harms or potential benefits.
  o The therapeutic misconception:
    - Potential participants may not fully understand that they are enrolling in a research trial, and that the treatment is still unproven.
    - Many participants (particularly those with lower education) assume they are being enrolled to receive treatment, not to test a treatment.
    - The informed consent must make this clear, and investigators have an ethical responsibility to ensure participants understand what a trial is.
Participant Recruitment

Potential problems in recruitment:

• Arbitrary bias in sampling.
• Coercive payments or incentives.
• Deceptive descriptions of the study.

• Ability to comprehend the research & provide informed consent:
  o Children, the elderly, the developmentally delayed, the mentally ill… require extra precautions to ensure they are capable of consenting to participate.
    - Often this will consist of a person (e.g., a relative, or someone with medical power of attorney) who can consent on behalf of the participant.
    - This is particularly important for complex or long-term research protocols.
Informed consent

• Key elements of the informed consent document:
  o Purpose & procedures of the study.
  o Why the participant was recruited.
  o Study requirements and duration.
  o Possible risks or harms.
  o The study is voluntary & the participant can withdraw at any time.
  o Any potential benefits or costs of participation.
  o Who to contact for information / concerns, including the IRB.
  o Written signature.

• How do we know the participant understood?
  o Administer a consent quiz, or a personal interview.

• How to document consent?
  o Written signature (with the name kept confidential).
  o For studies where participants are anonymous, the IRB can waive written consent.
Informed consent

Deception in experiments:
• How do we provide informed consent if the participant cannot know the hypothesis?
  o The study must present no risks of harm;
  o The participant must be thoroughly debriefed after the study.
• Deception can erode trust & confidence in social science.

The consent document is one of the most closely examined issues in IRB ethical reviews.
Data and clinical trial monitoring

Two key elements of research monitoring:

• Monitoring of clinical trials:
  o Studies of a new drug or treatment.
  o Behavioral intervention studies.
  o Participants are followed over time with multiple study visits.

• Data Safety Monitoring Boards:
  o Independent bodies that oversee data collection and analyses.
Data and clinical trial monitoring

Monitoring for safety: trial monitoring and the Data Safety Monitoring Board.

Investigators in clinical trials are required to monitor and report any health or safety "adverse events."

• Trial related – due to a feature of the trial protocol:
  o E.g., heart complications during trials of weight loss drugs;
  o Can be deemphasized or ignored in trials testing products;
  o Often ignored in behavioral intervention studies.

• Non-trial related:
  o Deaths are common during longitudinal studies of serious health problems: cancer, drug or alcohol abuse, studies of injection drug users.
  o Other events may simply stem from everyday life, e.g., auto accidents.

Trial-related Serious Adverse Events may require a protocol change or may stop the study.
Fen-Phen: a case study of failed trial monitoring

• 1992: Physicians begin prescribing a combination of fenfluramine and phentermine ("Fen-Phen") for obesity, with no FDA approval.
• 1996: After clinical trials by the manufacturer, the FDA approves Redux, a Fen-Phen drug.
  o The manufacturer reported 4 cases of severe cardiac effects during trial monitoring, despite 41 having actually occurred.
  o The Food and Drug Administration bypassed staff who had concerns, and approved the drug without a "black box" warning.
    - Pharmaceutical companies can have substantial influence in FDA decisions.
  o Wyeth spends $52 million promoting the drug, garnering $300 million in annual sales.
  o Time Magazine notes Fen-Phen as the hot new diet drug, but raises safety questions.
• 1997: A 30-year-old woman dies of a cardiac event after taking Fen-Phen for a month.
• 1998: The Mayo Clinic finds multiple cases of cardiac problems in women taking Fen-Phen.
  o The FDA receives 144 Adverse Event reports; 30% of patients taking Fen-Phen show cardiac abnormalities. Fen-Phen is pulled from the market.
• 2003: Forbes Magazine reports 153,000 lawsuits against Wyeth, who pays out $13 billion in settlements.

Click the image for the PBS Frontline documentary Fen Phen Nation.
Fen-Phen: a case study of failed trial monitoring (continued)

• Here a large company had a strong motive for inadequate monitoring; diet pills are a multi-billion dollar a year industry.
• Even though the Food and Drug Administration [FDA] was monitoring the trial, the company intentionally withheld data that would have stopped approval.
• This is also an example of industry research having a corrupting effect on FDA decisions.
Data and clinical trial monitoring

Monitoring for safety: trial monitoring and the Data Safety Monitoring Board.

The DSMB monitors:

• Trial integrity: is the research protocol being followed correctly?
• "Stopping rules" for research risks or positive findings.
• Data integrity:
  o Ensures that data are collected in a valid fashion;
  o Guards the data against "unblinding" of participants or investigators:
    - A trial is "double blind" when neither the participant nor the researcher knows what group (experimental or control) the participant is in.
    - This eliminates possible confounds stemming from expectations participants / researchers may have.
  o The DSMB is entrusted with all the codes for experimental groups and "unblinds" participants and investigators only when the trial is over.
The Women's Health Initiative: a case study of "stopping rules" for a clinical trial

• 1980s-90s: Millions of women use Hormone Replacement Therapy of estrogen plus progestin (E+P) to relieve menopausal symptoms.
• 1991: NIH begins a study after observational data suggest that women using hormones have lower rates of heart disease.
• 2002: The E+P part of the Initiative is stopped early after women show higher rates of heart attack, stroke and breast cancer.
  o Millions of women abandon hormones overnight.
  o An ongoing trial examines women taking estrogen only (without progestin).
• 2004: The estrogen-only study is stopped one year early: participants show fewer breast cancers and only a small increased risk of stroke.
Thought questions: Monitoring for safety

How do we separate self-interest & political pressure from science?

• How much does military or corporate (pharmaceuticals, tobacco…) funding for research distort scientific findings?

• How sensitive should scientists be to political or social pressures around their research, e.g.:
  o sexual behavior
  o climate change
  o stem cells
  o evolution
  o gun risks
  o economics & social policy

• How responsible are scientists for the social impact of their findings?
  o E.g., negative portrayal of social "out-groups";
  o The use of empirically validated techniques for unethical practices.
Vulnerability to coercion in research

What is coercion in research?

• Enrollment:
  o Joining a study that a reasonable person would see as harmful or exploitive.
• Continued participation:
  o Not recognizing harm or exploitation that emerges once the research begins;
  o Recognizing harms but not having the psychological or physical capacity to withdraw.

What makes someone vulnerable to coercion in research?

• Cognitive: low capacity to think about and provide informed consent for participation:
  o Children, older adults;
  o Dementia or cognitive limitations, the mentally ill, drug users.
Vulnerability to coercion in research

What makes someone vulnerable to coercion in research?

• Cognitive
• Authority: subject to authorities who have a vested interest in your participation.
  o Prisoners:
    - Even a small financial reward or the possibility of earlier parole can be extremely coercive.
  o Medical patients, students and others in a dual-role situation:
    - Patients / students may feel obligated to participate in a study conducted by their physician / professor.
Vulnerability to coercion in research

What makes someone vulnerable to coercion in research?

• Cognitive
• Authority
• Deferential: participation due to deferential attitudes or cultural pressure rather than actual willingness.
  o In the 1980s – 90s the international gay community was deeply threatened by HIV/AIDS.
  o Men were very willing to join clinical (e.g., HIV vaccine) trials:
    - From social responsibility, to help stop AIDS;
    - From the therapeutic misconception that they would directly benefit.
  o This deferential attitude could create vulnerability to participating in high-risk research (although there is no evidence that happened).
• Medical: selected due to a serious health-related condition for which there are no satisfactory remedies.
• Poor / disadvantaged: lacking important social goods – money or health care – provided via research participation. (See discussion above.)
The Common Rule

• Minimize risks
• Risks must be reasonable
• Recruit participants equitably
• Informed consent
• Document consent
• Monitor for safety
• Protect vulnerable participants & maintain confidentiality
The Common Rule

• The Common Rule protects participants during the process of research.
• Researchers also have responsibility for the use of their results.
  o We discussed the problems with a negative portrayal of research populations when results are published.
  o The next case study shows the extreme of irresponsible use of research results.
A case study of the unethical use of scientifically validated techniques

The American Psychological Association [APA] actively contributed to the CIA program of "enhanced interrogation" (torture) until 2008.

• In 2014 the Senate released a report describing CIA "enhanced interrogations" during the Iraq war.
• They concluded that the CIA engaged in torture, and that the program was unsuccessful:
  o It did not contribute useful intelligence;
  o Therefore it had no justification.
• It had been widely known that APA-sanctioned psychologists had helped design and administer the interrogation program.
• In 2015 an ethics panel of psychologists and human rights activists issued a report on the deep involvement of the APA in the Bush-era torture program.

Click for NYTimes.com overview.
An American soldier patrolling outside Abu Ghraib prison in 2005. The public disclosure of images of prisoners being abused there prompted debate about the way the United States was treating detainees. Credit: John Moore/Getty Images, New York Times.
Case study of unethical use of scientifically validated techniques

• Beginning in 2003, the APA worked closely with CIA psychologists James Mitchell and Bruce Jessen to develop the "enhanced interrogation" (i.e., physical and psychological torture) used at Guantanamo Bay and other "black sites" around the world.
• The APA worked with the Bush White House, State Department and CIA to tailor its ethical standards to allow Mitchell and other psychologists to develop and conduct the CIA interrogation program.

From the New York Times article: "More than a decade after George J. Tenet, then the C.I.A. director, signed a secret order suspending the agency's use of enhanced interrogation techniques, the American Psychological Association's actions are coming under scrutiny." Mark Wilson/Getty Images.

• The White House actually helped write a 2005 APA policy statement allowing psychologists' assistance in "enhanced interrogation." The secretive committee that prepared the document was composed primarily of military psychologists.
• Dr. Martin Seligman, a past president of the APA, contributed to the CIA effort by presenting his theory of Learned Helplessness as a method of extracting information.
• From a 2004 CIA memo: "The goal of [harsh] interrogation is to create a state of learned helplessness and dependence conducive to the collection of intelligence."
Case study of unethical use of scientifically validated techniques

• The American Psychiatric Association & American Medical Association refused to endorse their members' involvement in interrogations:
  o The CIA interrogations were linked to torture & unlawful detention;
  o As helping professions, the psychiatric and medical communities refused to sanction or participate in torture.

The misuse of scientific theory and data can have very serious consequences.

• In its 2005 report, the APA ethics committee declared that, as a research organization, it was exempt from the ethical standards of "helpers."
  o This resembles the Bush Administration's declaration that those held at Guantanamo, Abu Ghraib and elsewhere were "detainees" in a battle zone, rather than "prisoners of war," so the Geneva Convention guidelines for the ethical treatment of prisoners did not apply.
  o After years of boycotts and protests, in 2008 the APA finally altered its stance and accepted the "do no harm" ethical stance of the helping professions.

See a chilling documentary on the American Psychological Association's adoption of torture policy here.
Case study 2: Researchers' larger ethical responsibility

• What responsibility does a researcher have for the eventual use of a discovery?
• What if a finding that may do great good may also do great harm?
• How does the scientific community – and should the scientific community – control that?

Matt Edge for The New York Times. Dr. Jennifer A. Doudna. Three years ago, she helped make one of the most monumental discoveries in biology.
Case study 2: Researchers' larger ethical responsibility

Click the image for an NYT piece about Dr. Jennifer Doudna, who discovered an easy method to alter DNA.

• She discovered an easy way to alter existing – or insert new – DNA sequences in the genome.
• These alterations would then be passed down to successive generations.
• This gene manipulation may someday be able to cure genetic diseases.
• It may also lead to "custom" genetic tampering of embryos, to create physical – and psychological – characteristics parents want in their children.

• The Chinese have already begun research in primate embryos.
  o This is particularly troubling, given the early stage of the research and the potential consequences of a commercial use of gene splicing to modify embryos.
  o Click here for a discussion of attempts to ban or limit this research.
Case study 2: Researchers' larger ethical responsibility (continued)

• Dissemination of a technique to change something as fundamental as the human genome is potentially dangerous.
  o At this stage the research is preliminary, so its application would be particularly irresponsible.
  o Even as the research develops, errors may lead to dire results.
  o However, any new technology – physical, electronic, or biological – seems to always find a market.

• So, should researchers "go there" and work on topics such as gene tampering at all?
Misleading, biased or fraudulent results

• Beyond potential harm to participants, social groups or society at large, a key form of research ethics is honesty in gathering and reporting results.

• Behavioral research has been plagued by…
  o Questionable or tenuous results that cannot be repeated (replicated) by others,
  o Intentional (if thoughtless) biases,
  o More rarely, outright fraud.
False positives: questionable, tenuous, non-reproducible results

Quick note:

• Any research study begins assuming the null hypothesis. (Negative finding: any "results" are by chance alone; there is nothing going on…)

• If the results are strong enough, the investigator can reject the null hypothesis. (Positive finding: the results are not just due to chance or a confound; the hypothesis is confirmed, the theory is supported.)

• For obvious reasons positive results are the bread and butter of science; negative results are considered to have little value.

• A false positive is a positive result due to error, biases, or subtle (…or not so subtle…) fraud.

• It appears that science generally, and the behavioral sciences in particular, are rife with false positives. (A simple simulation of how false positives arise by chance alone appears below.)
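To make the idea concrete, here is a minimal sketch – not part of the original slides – that simulates many studies in which the null hypothesis is true by construction and counts how often a standard t-test nonetheless crosses the p < .05 threshold. It assumes a Python environment with NumPy and SciPy; the sample sizes and number of studies are hypothetical.

```python
# Minimal sketch: false positives arise by chance alone.
# Every simulated study compares two groups drawn from the SAME population,
# so the null hypothesis is true by construction; any "significant" result is a false positive.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)

n_studies = 10_000    # hypothetical number of independent studies
n_per_group = 30      # hypothetical participants per group
alpha = 0.05          # conventional significance threshold

false_positives = 0
for _ in range(n_studies):
    control = rng.normal(loc=0.0, scale=1.0, size=n_per_group)
    treatment = rng.normal(loc=0.0, scale=1.0, size=n_per_group)
    _, p_value = stats.ttest_ind(control, treatment)
    if p_value < alpha:
        false_positives += 1   # a "positive" result with no real effect behind it

print(f"'Significant' results when nothing is going on: "
      f"{false_positives / n_studies:.1%} (expected ~{alpha:.0%})")
```

Roughly 5% of these truly null studies come out "significant"; the cherry-picking and p-hacking practices described below are ways of selectively surfacing exactly those chance results.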
False Positives

• Inaccurate or non-reproducible (false positive) results often stem from confirmatory bias among investigators, sloppy work, or inappropriate statistics.

• Replication is a process where a study is repeated exactly, to ensure the results were not due to a fluke or the specific conditions in one lab (i.e., an artifact).

• Converging studies address a topic using different methods or measures, to ensure the results are not simply due to the general method used.

• Research journals (and grant awards) strongly favor new, exciting or innovative results. So, few replicating or converging studies are performed; it can be difficult to get them published.

• A lot of work then goes "unchecked" – and replicating studies that are conducted often fail to reproduce the original result.
  o The Reproducibility Project attempted to rigorously replicate 100 Ψ studies from 2008. Over 60% showed far weaker (or no) results when reexamined.
False Positives: Positive Results Bias

• A 2010 study in PLoS ONE showed over 70% of published articles in all science disciplines to have positive results.
• Psychiatry and Psychology articles show > 90% positive results!
• It is not remotely plausible that > 90% of our hypotheses are "true" and actually supported by objective research.

Data from: Fanelli D (2010) "Positive" Results Increase Down the Hierarchy of the Sciences. PLoS ONE 5(4): e10068. doi:10.1371/journal.pone.0010068

• Positive results bias can lead tenuous or simply false "findings" to be published, making our empirical base generally suspect.
The varieties of research cheating

1. Tweaking
2. Stage managing
3. Cherry picking
4. P-hacking
5. Simple faking
Misleading, biased or fraudulent results

1. "Tweaking" study designs to ensure positive results:

• Ideally, a study is designed to test the hypothesis:
  o The null hypothesis and the alternate hypothesis (negative & positive results) should each have a fair chance of emerging.
• Unfortunately, many studies are conceived less to test a hypothesis than to prove or confirm one.
• Confirmatory bias can induce both intentional and unintentional biases in favor of positive results.
• This can be a consequence of a strong theory:
  o The researcher is convinced (based on previous studies, with or without clear evidence) that the theory is correct.
  o The point of the study is then less to test whether the theory is correct… than to find procedures that will empirically confirm it.
Misleading, biased or fraudulent results

1. "Tweaking" study designs to ensure positive results (continued):

• Researchers may repeat experiments, tweaking one element or another – the setting, the measurements, statistical analyses, participants – until they get the expected result.
• This powerful confirmatory bias is often not recognized as a problem.
Misleading, biased or fraudulent results

1. "Tweaking" study designs to ensure positive results (continued):

Confirmatory bias in corporate drug trials:
• In analyses done in 2009 & 2010, 97.4% of drug trials supported by the pharmaceutical industry reported positive findings.
• Only 68.7% of those that weren't supported by drug manufacturers were as encouraging.
Misleading, biased or fraudulent results

2. Overly "stage managing" the experimental setting:

• Many Psychology experiments are "stage managed":
  o Specific (manipulative?) instructions or information;
  o Stimuli in the environment…
    - Other people may model a certain behavior;
    - Your Week 3 reading discussed alcohol experiments that stage a "bump" to test hypotheses about how participants will react;
    - The Milgram experiments "staged" a learning task.
  o Narrowing participants' choices in how they can react to a stimulus;
    - Binary ("true"/"false") measures that disallow nuanced responses.
• Stage managing can be crucial to testing behavioral hypotheses, by presenting more realistic stimuli.
• Experiments that overly stage manage can exert a powerful confirmatory bias by driving only certain responses.
Misleading, biased or fraudulent results

3. "Cherry picking" results: publishing only those that "work".

• Ideally, a study sets out a clear, a priori hypothesis.
  o The hypothesis (or methods section) clearly specifies the variable(s) to be examined and the statistical approach.
  o The analysis and discussion address only the target variables or measures specified in the hypothesis.
• This ideal process is often not followed, particularly in the social sciences.
  o Researchers may examine a number of variables and write up only those that showed an "effect";
  o Researchers may conduct multiple studies and publish only those providing confirmation of the hypothesis…
Misleading, biased or fraudulent results

"Cherry picking" results: selective choice of variables.

• Data sets often contain multiple variables.
• Investigators can cheat by testing a range of variables, identifying the specific measures that support the hypothesis, publishing only the results for those measures, and even re-writing the hypothesis post-hoc to make it seem as though the specific measure that "worked" was the one they had in mind all along. (A simulation of how easily this produces chance "findings" appears after this list.)
  o For example, Bohannon's fake chocolate study assessed multiple possible outcomes of a diet program; see the discussion below.
• This form of fraud is not always intentional or malicious:
  o With a complex study we want to explore all aspects of the data.
  o However, our bias toward positive results will draw attention to only some outcomes.
• As we saw in the Fen-Phen scandal, company-sponsored researchers occasionally simply ignore negative outcome variables.
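As a rough illustration of why selective choice of variables so often "works," here is a minimal sketch – again not from the original slides, with Python, NumPy and SciPy assumed and the 20 outcome measures purely hypothetical. When a study records many outcomes and none is truly affected, the chance that at least one crosses p < .05 is about 1 − 0.95^20 ≈ 64%, so reporting only the "best" outcome turns chance into an apparent discovery.

```python
# Sketch: a hypothetical study measures many outcome variables, none of which
# is really affected. Reporting only the single "best" p-value makes a chance
# result look like a finding.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=2)

n_simulated_studies = 5_000
n_outcomes = 20        # e.g., weight, mood, sleep quality, blood pressure, ...
n_per_group = 30
alpha = 0.05

studies_with_publishable_result = 0
for _ in range(n_simulated_studies):
    p_values = []
    for _ in range(n_outcomes):
        # Every outcome is pure noise: both groups come from the same population.
        control = rng.normal(size=n_per_group)
        treatment = rng.normal(size=n_per_group)
        p_values.append(stats.ttest_ind(control, treatment).pvalue)
    if min(p_values) < alpha:          # the cherry-picked, "best" outcome
        studies_with_publishable_result += 1

print(f"Studies where at least one null outcome looks 'significant': "
      f"{studies_with_publishable_result / n_simulated_studies:.0%} "
      f"(theory: {1 - (1 - alpha) ** n_outcomes:.0%})")
```

Pre-registering a single primary outcome, or correcting the significance threshold for the number of tests, removes most of this inflation.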
Foundations of
Research
“Cherry Picking” results; Multiple studies
 Pharmaceutical companies have been particularly implicated in this:
o Companies would run multiple trials of a new drug and submit only the successful outcomes to the FDA for approval.
o The Food & Drug Administration (FDA) now requires submission of, and access to, all drug trials.
 This has been called “the file drawer problem”: journals have little interest in negative results, so only positive outcomes get submitted for publication.
 Researchers rarely attempt to publish negative results.
 This has recently begun to change:
o Researchers are increasingly aware of positive results bias, and journals are becoming more accepting of negative results.
o Sophisticated statistical techniques allow writers or reviewers to estimate the extent of the file drawer problem in a research area (one classic technique is sketched after this slide).
64
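One classic technique for estimating the extent of the file drawer problem is Rosenthal’s “fail-safe N”: how many unpublished null-result studies would have to be sitting in file drawers to pull a set of published, significant findings back below the significance threshold. The short Python sketch below is my own illustration of that formula; the z-scores in it are invented for the example, not taken from any real literature.

def fail_safe_n(z_scores, z_crit=1.645):
    """Rosenthal's fail-safe N for one-tailed alpha = .05 (z_crit = 1.645):
    N = (sum of z)^2 / z_crit^2 - k, where k is the number of published studies."""
    k = len(z_scores)
    return (sum(z_scores) ** 2) / (z_crit ** 2) - k

published_z = [2.1, 1.8, 2.5, 1.7, 2.9]   # hypothetical z-scores from published studies
print(f"Fail-safe N: {fail_safe_n(published_z):.0f} unpublished null studies")

A fail-safe N that is small relative to the number of published studies suggests the apparent effect could easily be an artifact of selective publication.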
Foundations of
Research
“Cherry Picking” results; Multiple studies
 To help resolve a number of these issues…
 Positive results bias & cherry picking,
 The file drawer problem,
 Difficulty replicating studies,
 Potential confirmatory biases…
 …researchers are increasingly being more transparent, and making all the data underlying their research publicly available.
 See a New York Times report here.
65
Foundations of
Research
66
Head ML, Holman L, Lanfear R, Kahn AT, Jennions MD (2015) The Extent
and Consequences of P-Hacking in Science. PLoS Biol 13(3): e1002106.
doi:10.1371/journal.pbio.1002106 Click for original article
4. P-hacking and False Positives
Dr. Megan Head and colleagues examined 100,000 research papers spanning
scientific disciplines including medicine, biology and psychology.
 Click here for a summary.
 Head’s study found a high number of p-values that were only just over the traditional threshold for statistical significance – that is, results that barely reached p < .05.
 Quotes from the article (above):
 “This suggests that some scientists adjust their experimental design, datasets or statistical methods until they get a result that crosses the significance threshold”.
 “They might look at their results before an experiment is finished, or explore their data with lots of different statistical methods, without realizing that this can lead to bias.” (A small simulation of this tactic follows this slide.)
 “Journals, especially the top journals, are more likely to publish experiments with new, interesting results, creating incentive to produce results on demand.”
 The pressure for new, exciting results works against the longer-term studies that help us really understand, e.g., the effects of a treatment, and it almost rules out studies that attempt to replicate previous research.
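One of the tactics quoted above – looking at the results before an experiment is finished – can be simulated directly. The Python sketch below is my own illustration, not part of Head et al.’s analysis: it repeatedly “peeks” at accumulating data in which there is no true group difference and stops the moment p < .05; the batch size and number of peeks are arbitrary assumptions.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def peeking_study(max_n=100, start_n=10, step=10):
    """Collect data in batches with NO true group difference, test after every batch,
    and stop as soon as p < .05. Returns True if a 'significant' result was claimed."""
    a, b = [], []
    while len(a) < max_n:
        a.extend(rng.normal(size=step))
        b.extend(rng.normal(size=step))
        if len(a) >= start_n:
            _, p = stats.ttest_ind(a, b)
            if p < 0.05:
                return True        # stop early and report the "effect"
    return False

n_sims = 2000
rate = sum(peeking_study() for _ in range(n_sims)) / n_sims
print(f"False-positive rate with repeated peeking: {rate:.2f} (nominal rate is 0.05)")

Even with no real effect, this stopping rule produces a “significant” result in well over 5% of simulated studies – exactly the kind of inflation that shows up as a pile-up of p-values just under .05.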
Foundations of
Research
Misleading, biased or fraudulent results.
67
 Simple faking;
 Unfortunately, it is not that difficult to fake a study and garner substantial media attention for a phony (even silly) “finding”.
 How to create a fake study and get worldwide attention: John Bohannon’s fake chocolate weight loss study.
Foundations of
Research
68
Misleading, biased or fraudulent results.
 There is substantial pressure – and temptation – for actual fraud.
 Career advancement – and funding – requires finding positive results;
 Experiments must “work” to be published, to get drugs licensed, to apply for further funding…
 As a consequence, U.S. behavioral research shows a strong bias toward positive results.
 A case study of research fraud
 Simple faking; Diederik Stapel successfully faked 20 years of data (click image).
 Stapel perpetrated the largest fraud in modern behavioral science.
 He made up results that were plausible and interesting (“sexy”, in his word), but close enough to real results not to be questioned.
 He was aided by trust and lack of strong oversight in the research industry.
 Over a 20-year fake research program he published 55 research papers and 11+ doctoral dissertations, and secured substantial research funding.
Image: Diederik Stapel, a Dutch social psychologist, perpetrated an audacious academic fraud by making up studies that told the world what it wanted to hear about human nature. Click for New York Times overview. Photo: Koos Breukel for The New York Times
Foundations of
Research
69
Misleading, biased or fraudulent results.
 Corporate research: a major funder of junk science.
 Companies spend significant amounts to produce junk science favorable to their products.
 One study shows that research funded by the soft drink industry is 5 times (!) more likely to find no association between diet and obesity than is publicly funded research.
Click: Sugar industry pays scientists to produce “findings” that sweet foods are not harmful. Image: Zoonar/P. Malyshev via Getty Images
 Coca-Cola has been particularly aggressive in this.
• Concern over obesity has led people – particularly parents – to lessen their consumption of soft drinks, meaning that Coke is losing revenue.
• Despite clear evidence that diet (particularly sugary drinks) matters far more to obesity than exercise does, Coke has mounted a campaign to convince the public of the opposite.
• They have spent millions funding legitimate researchers to distort the literature, promote exercise as central to avoiding obesity, and promote Coke as a healthy snack.
An image from a video by the Coca-Cola Foundation. In November 2012, the foundation announced a $3 million grant to Chicago's Garfield Park Conservatory Alliance. The grant was intended to establish a wellness program.
Shortly after, a bill to tax sugary soft drinks was dropped by the Chicago City Council.
Click the image: NY Times overview of Coke’s funding of highly biased obesity research. Read an editorial here.
Image: NYTimes, August 9, 2015
Foundations of
Research
The consequences of biased research studies
What are the consequences of subtle (or not so subtle) cheating in research?
 Compromised or poorly conducted research…
 “Tweaking” study designs to ensure positive results,
 Cherry picking only positive results,
 Statistical “errors” favoring the hypothesis,
 Simple faking,
 Lax standards for evaluating results,
 … can lead to the appearance (or gross overstatement) of positive results in data that actually do not support the hypothesis.
 Evidence from journal publications shows Psychiatry / Psychology to have a very strong “positive results bias”:
70
Foundations of
Research
SUMMARY
 Overview
 The “Common Rule”: core guidelines
 Minimize risks
 Risks must be reasonable
 Recruit participants equitably
 Informed / Document consent
 Monitor for safety
 Protect vulnerable participants & maintain confidentiality
 Larger institutions have key roles in maintaining ethical behavior
 Institutional Review Boards (IRBs)
 Organizations such as the APA can behave in clearly unethical fashion
 The “research industry” directly presses for questionable, mediocre or fraudulent research.
71
72
Foundations of
Research
 The Common Rule: Core criteria for ethical research
 The Belmont Report and the Informed Consent document
Respect for persons
Beneficence
Justice
Foundations of
Research
Belmont Report
73
(CITI training)
1. Respect For Persons
 Right to exercise autonomy & make informed choices.
2. Beneficence
 Minimization of risk & maximization of social / individual benefit
How much information should participants get from a blinded, randomized trial?
See ethics of clinical trials
3. Justice
 Research should not unduly involve groups unlikely to benefit
from subsequent applications.
 Include participants of all races & both genders
 Members of the target population serve on the design & research team
 Research & researchers contribute to the population being studied
 Communicate research results & develop programs / interventions
Foundations of
Research
Optional
Optional section:
Click to go to a detailed description of the sections used in a
complete informed consent document.
I will not ask you to list these in the exam, but you will use this
for your paper.
Several key items you should know:
• The research is completely voluntary.
• The participant must understand each element of the study.
• The participant must willingly sign the informed consent document.
74
Foundations of
Research
The informed consent document
Overview Box: brief descriptions of procedures, purposes, potential risks & benefits, and potential outcomes.
 Why am I being asked to participate in this research? ...who is being recruited or selected?
 Why is this research being done?
 What is the purpose of this research?
 What procedures are involved?
 What are the potential risks and discomforts?
 Are there benefits to taking part in this research?
 What other options are there? ...What happens if I decline participation?
75
Foundations of
Research
Informed consent elements, 2
 Will I be told about new information that may affect my decision to participate?
 What about privacy and confidentiality?
 What if I am injured as a result of my participation?
 What are the costs for participating in this research? ...e.g., for services, etc.
 Will I be reimbursed for any of my expenses for participation in this research?
 Can I withdraw or be removed from the study?
 Who should I contact if I have questions?
 Signature of participant or legally authorized representative
76