Transcript: KU Grand Rounds - University of Kansas Medical Center

Juggling Porcupines:
Negotiating Ethical Challenges in Medical Settings
Gerald P. Koocher, Ph.D., ABPP
Simmons College
www.ethicsresearch.com
Special challenges for ethical
practice in health care settings:
• How do health care delivery and
research trigger particularly complex
ethical challenges?
• Consider some illustrative examples.
• Promote a culture that enhances
ethical practice and reduces risk.
Ethical Fundamentals
Beauchamp & Childress (2001) Principles of Biomedical Ethics 5th Edition
• Autonomy (give people choices)
• Beneficence (do good)
• Nonmaleficence (don’t do bad)
• Justice (behave fairly)
• Fidelity and Responsibility (demonstrate trustworthiness)
• Integrity (show honesty and truthfulness)
• Respect the Rights and Dignity of Others
More Ethical Fundamentals
Koocher & Keith-Spiegel (2008)
Ethics in Psychology and the Mental Health Professions 3rd Ed.
• Fidelity – loyalty, dependability
• Pursue excellence
• Accept accountability
Key ethical challenges in medical
settings
• Special duties owed to vulnerable others
▫ Sicker patients and more complex treatments
• Opportunities for boundary and role confusion
▫ Conflicts of interest and temptations
• Institutional culture and peer response
▫ Hierarchy and tone setting
 Someone has to coordinate care for optimal results
▫ Tolerance for error
 Learning from the “near-miss” and disclosures with
apology
▫ Attitudes toward scientific dishonesty
 Colleagues as the best defense
Creating a culture that encourages
reporting of human error
• Near-miss recognition and reporting systems (a near miss carries zero cost)
• Creating a safe climate for sharing
concerns in a professional manner and
context (engaged colleagues)
The Near-Miss
• An unplanned event that did not result in injury,
illness, or damage - but had the potential to.
• Only a fortunate break in the chain of events
prevented an injury, fatality or damage.
• Human error commonly serves as an initiating
event, but a faulty process or system invariably
permits or compounds the harm.
• Focus on improvements that reduce the chance of error or system failure.
Anorexia vs Confirmatory Bias
• Teri Slim had a petite and slender build, but seemed unusually thin to her father when she donned a bathing suit just prior to her 14th birthday. Her mother, a psychiatric nurse, agreed. They took Teri for evaluation at a large medical center near her home, and the staff there, unprepared to treat anorectic adolescents, referred her on to a specialized pediatric hospital 150 miles away. The psychiatric admission evaluation at the second hospital confirmed the diagnosis of anorexia, and the staff admitted Teri to their inpatient child psychiatry unit for treatment.
• The hospital staff easily identified family stressors that might account for Teri's emotional problems. Her parents had recently divorced, her father had lost his job as a business executive, and her mother (the nurse), who lived in another state, allegedly had a serious addiction problem. At the end of two months of treatment, Teri remained malnourished and had made “no progress” in treatment despite the administration of supplements through a nasogastric tube. She occasionally vomited up the Ensure, resulting in behavioral restrictions for “acting out.”
• The staff contemplated initiating intravenous feeding in the face of her progressive weight loss and prepared to transfer Teri to a medical unit for placement of a venous feeding line. Only then did a senior pediatrician, sent to the psychiatry unit to screen her for transfer, ask, “Has anyone evaluated her for Crohn's disease?” Several weeks later, Teri went home from the hospital minus a segment of inflamed intestine and taking anti-inflammatory medication. She continued to do well in response to the treatment for Crohn's disease.
You will give me a number!
• Bertram Botch, M.D., served as the chief of
neurology at a pediatric rehab hospital and often
chaired interdisciplinary case conferences.
Reporting on her assessment of a low-functioning
mentally retarded child, Melissa Meek, Ph.D.,
presented her detailed findings in descriptive
terms. Dr. Botch listened to her presentation and
asked for the child's IQ. When Dr. Meek replied that the instruments used were developmental indices that gave functional ranges but did not yield IQ scores, Dr. Botch demanded that she compute a specific IQ score to use in his preferred report format.
Anything but Child’s Play…
Miller, G. (2010) Science, 327, 192-193
• In the mid-1990s Joseph Biederman and Janet Wozniak advanced the notion that many children with conduct disorder or ADHD diagnoses might have “juvenile bipolar disorder,” and proposed treating them with medications developed for adults with significant mood disorders.
• Diagnoses jumped by a factor of 40 between 1993 and 2004.
• Is this a valid diagnosis, or one driven by “Big Pharma”?
Joseph Biederman, M.D.
http://www.cchrint.org/cchr-issues/the-corrupt-alliance-of-the-psychiatric-pharmaceutical-industry/
• While Chief of Pediatric Psychopharmacology at
Massachusetts General Hospital, he received research
funds from 15 pharmaceutical companies. The New York
Times reported that he earned $1.6 million in consulting
fees from drug makers between 2000 and 2007, but did
not report all of this income to Harvard University
officials.
• His marketing of the theory that children have “bipolar disorder” has contributed to the increase in antipsychotic drug sales for pediatric use in the United States, today estimated at 2.5 million children.
• In March 2009, in newly released court
documents, he reportedly promised drug maker
Johnson & Johnson in advance that his
studies on the antipsychotic drug Risperidone
would prove the drug effective when used on preschool-age children.
• In an e-mailed statement, Dr. Biederman told
the Times, “My interests are solely in the
advancement of medical treatment through
rigorous and objective study,” and he said he
took conflict-of-interest policies “very seriously.”
http://www.nytimes.com/2008/06/08/us/08conflict.html
Charles Nemeroff, M.D.
http://www.cchrint.org/cchr-issues/the-corrupt-alliance-of-the-psychiatric-pharmaceutical-industry/
• While Chairman of Psychiatry and Behavioral Sciences at Emory, he received $960,000 from GlaxoSmithKline (GSK) between 2000 and 2006, but disclosed only $35,000 to Emory. Between 2000 and 2007, he earned more than $2.8 million from various drug makers but failed to report at least $1.2 million. He signed a letter in 2004 promising Emory that he would earn less than $10,000 a year from GSK, but on the same day lectured for GSK at a hotel, earning $3,000 of what would become $170,000 from the company.
• In 2006, he stepped down as editor of
Neuropsychopharmacology after publishing a
favorable review of the vagus nerve stimulation
device, manufactured by Cyberonics, for which he
was a paid consultant.
• In 2003, he coauthored a favorable review of three therapies in Nature Neuroscience, failing to mention his significant financial interests in these, including owning the patent for one of the treatments, a lithium patch.
• He resigned his position at Emory in 2008 and was
barred by NIH from receiving federal research funds
for two years.
Commercial Hazards
• Pharmaceutical Sponsorship: Controlled Seduction?
▫ Research
▫ Lectures
▫ Meetings
▫ Junkets
▫ Trinkets
Don’t doze off!
This applies to clinicians too.
Proximal Cause & Scientific Dishonesty
• Researchers are most likely to intentionally engage in dishonest acts if:
▫ their commitment to discovering the
truth (patient care, or other core values)
is not firm or becomes compromised
through rationalization,
▫ if the potential for reward exists, and
▫ if they regard the chances of detection
as low.
• For example, an investigator may feel convinced that falsifying data is acceptable because
▫ the actual results would have turned out as expected anyway,
▫ taking a shortcut seems necessary to meet an
important deadline, and
▫ the chance of uncovering forged data seems nil.
• Here some form of situational constraint stands as
the primary barrier to intentionally committing a
dishonest act. Colleagues in a position to observe or
learn about the misbehavior constitute the principal
source of such constraint. These same colleagues also
provide the most readily available resource for
preventing and correcting unintentional errors.
What is “Bad Science” Anyway?
• The big three are FF&P
▫ Fabrication, Falsification, and
Plagiarism
 Fabrication is usually in the form of
“dry lab” data that are simply invented.
 Falsification can take several forms.
 Actual data can be “smoothed,” or “cooked” to
approach more closely the desired or expected
outcome.
 Collected data points can be dropped (“trimmed”) to delete unwanted information (see the brief sketch below).
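To make “trimming” concrete, here is a minimal, hypothetical Python sketch (the measurement values are invented for illustration) showing how silently dropping inconvenient data points shifts a result toward the desired outcome:

```python
# Hypothetical illustration: how "trimming" distorts a result.
# The numbers are invented; only the direction of the bias matters.
measurements = [4.8, 5.1, 5.0, 4.9, 7.9, 5.2, 8.1]  # two inconvenient high values

honest_mean = sum(measurements) / len(measurements)

# "Trimming": silently dropping the points that contradict the hoped-for result.
trimmed = [x for x in measurements if x < 7.0]
trimmed_mean = sum(trimmed) / len(trimmed)

print(f"Mean of all data:    {honest_mean:.2f}")   # 5.86
print(f"Mean after trimming: {trimmed_mean:.2f}")  # 5.00
```

Legitimate outlier handling differs precisely in being specified in advance and disclosed; undisclosed trimming of this sort constitutes falsification.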
The Bozo Factor
• Sometimes ineptitude or incompetence
can result in inappropriate design, poor
or biased sampling procedures, misused
or wrongly applied statistical tests,
inadequate record-keeping, and just
plain carelessness. Even though there
may be no intent to deceive, inaccurate
information can also seriously damage
the research record.
• One might assume (or hope) that such inaccuracies, purposeful or not, will be discovered. But one cannot count on it.
Whereas errors in alleged scientific advances are
assumed to be eventually self-correcting through
replication, funding sources typically do not
support replication research, and most scholarly
journals do not normally publish replication
studies. Thus, there is little incentive for
researchers to repeat projects, especially
expensive and complex ones.
Difficulties in Detection
• Most highly publicized data scandals
have occurred in biomedical research
laboratories. No one knows for sure
whether the incidence is higher in
biomedical science than in the social and
behavioral sciences, or whether it is
simply easier to detect fraud in
biomedicine.
• Most social and behavioral research does
not involve chemical analyses, tissue
cultures, changes in physical symptoms,
invasive procedures, or similar “hard”
documentation. Social science data often
take the form of numerical scores from
questionnaires, psychological
assessments, performance measures or
qualitative data based on interviews or
behavioral observations.
▫ The actual research participants have long
since gone, taking their identities with them.
Such data are relatively easy to generate,
fudge, or trim.
Rogues’ Gallery
• South Korean scientist Woo Suk Hwang gained notoriety when he falsely claimed to have successfully cloned close to a dozen human embryos. All of Hwang’s previous accomplishments, including Snuppy (the allegedly cloned Afghan hound), are now viewed with skepticism.
Rogues’ Gallery
• Dr. Eric Poehlman (University of Vermont) became the first academic scientist in the United States to serve prison time for research misconduct (not involving fatalities) and to receive a lifetime ban on federal research funding. Poehlman published articles containing bogus data and submitted falsified grant applications that had brought in almost $3 million in federal grant money since the early 1990s.
Rogues’ Gallery
• Paul Kornak was found guilty of
criminally negligent homicide for
falsely representing results of blood
chemical analyses in a chemotherapy
study. One participant who should
have been excluded from the study
died as a result.
• Kornak received a 71-month federal prison sentence and had to pay over $600,000 in restitution to two drug companies and the VA. He is also barred for life from federal research funding.
NIH Grant No. R01 NS049573 [NINDS/ORI]
Gerald P. Koocher, Principal Investigator
Patricia Keith-Spiegel and Joan Sieber, Co-Investigators
NIH focuses on FF&P, but there’s
more…
• We surveyed more than 5,000 names in the NIH CRISP database.
• 2,599 respondents reported 3,393 accounts of
suspected wrongdoing and other errors related
to the conduct of research.
• Only 406 of those responding stated that they
had no incidents to share.
Type | Number of Incidents | Percentage
Fabrication/falsification | 608 | 17.3%
Questionable publication practices (e.g., disputed authorship credits) | 601 | 17.0%
Plagiarism | 462 | 13.1%
Difficult or stressful work environment (e.g., mistreatment of subordinates, sexual harassment or other forms of exploitation) | 432 | 12.3%
Incompetence (e.g., poor research design or inappropriate analysis, insufficient skills relative to the study technique) | 420 | 11.9%
Carelessness (e.g., cutting corners, sloppy record keeping) | 334 | 9.5%
Intentional bias (e.g., rigging a sample or method to favor a certain outcome) | 176 | 5.0%
Failure to follow the rules of science (e.g., violating human research participant requirements, sidestepping or ignoring IRB directives) | 169 | 4.8%
Inadequate supervision of research assistants | 136 | 3.9%
Total | 3,525 | 100%

*Respondents could report more than one type of incident.
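As a quick check on the table, the percentage column is simply each count divided by the reported total of 3,525. A minimal Python sketch, using only the counts shown above, reproduces it:

```python
# Recompute the percentage column from the counts reported in the table above.
counts = {
    "Fabrication/falsification": 608,
    "Questionable publication practices": 601,
    "Plagiarism": 462,
    "Difficult or stressful work environment": 432,
    "Incompetence": 420,
    "Carelessness": 334,
    "Intentional bias": 176,
    "Failure to follow the rules of science": 169,
    "Inadequate supervision of research assistants": 136,
}
total = 3525  # reported total; respondents could report more than one type

for category, n in counts.items():
    print(f"{category:48s} {n:5d} {100 * n / total:5.1f}%")
```

The recomputed figures match the slide (for example, 608 / 3,525 ≈ 17.3%); the listed counts sum to 3,338, so the reported total of 3,525 evidently includes categories beyond those shown.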
What Risks Materialized and Who Got Hurt?
• In 1,169 (42%) of the incidents, participants experienced
no negative consequences as a result of their
intervention.
• Another 296 participants reported an elevation in status.
• However, almost half of our interveners reported
suffering to some degree, although a large portion
recounted only emotional distress as opposed to any
damage to their careers or social standing.
• Some respondents reported serious consequences, such as feeling shunned, being forced to leave a job, or losing previously close friends or allies. A few even feared lawsuits, although none ever materialized.
• Despite personal risks, two out of three survey
participants claimed to have taken it upon themselves to
attempt to prevent or correct a wrong in progress, or to
minimize damage that had already occurred.
• Very few participants initially reported their concerns to
another entity, opting to attempt to informally correct
the problem or achieve damage control on their own or
in partnership with other colleagues.
• The most common reasons offered for acting included a
commitment to research integrity, to avoid damaging the
reputation of oneself or one’s lab or institution, or to
prevent an associate from making a mistake.
• Almost all respondents took direct action if the questionable act was perpetrated by their own postdocs or assistants.
Who Takes Action, and Does It Work?
• A binary logistic regression analysis profiled the characteristics of researchers who intervene (a minimal sketch of such a model follows this list).
Most likely to take action were those who
▫ held a higher professional or employment status than the suspected
wrongdoer
▫ had less regular interaction or involvement with the suspected
wrongdoer
▫ based their suspicions on strong evidence (i.e., direct observation or direct disclosure by the transgressor rather than second-hand accounts or hearsay)
▫ perceived the transgression as unintentional, and
▫ held a belief that individuals have a primary responsibility to become
actively involved in maintaining scientific integrity.
▫ The vast majority of those who felt victimized or who believed that
they might suffer blame also proved likely to intervene individually or
by reporting the matter, suggesting that acts involving direct threat to
oneself will likely lead to taking some type of action.
▫ The highest rates of intervention occurred for projects described as
taking place in the context of high stress that compromised research
quality.
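The slides report the regression only in words. The following minimal, hypothetical Python sketch shows what a binary logistic regression of this kind looks like, with invented variable names and synthetic data standing in for the survey measures:

```python
# Hypothetical sketch of a binary logistic regression like the one described:
# predicting whether a researcher intervened (1) or not (0) from the
# characteristics named above. Variable names and data are invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-ins for the survey predictors (all binary here)
higher_status = rng.integers(0, 2, n)       # outranked the suspected wrongdoer
close_contact = rng.integers(0, 2, n)       # worked closely with the wrongdoer
strong_evidence = rng.integers(0, 2, n)     # direct observation or disclosure
seen_unintentional = rng.integers(0, 2, n)  # viewed the act as unintentional
felt_responsible = rng.integers(0, 2, n)    # believed integrity is everyone's job

# Simulate intervention with effects in the directions the slide reports
log_odds = (-1.0 + 0.8 * higher_status - 0.6 * close_contact
            + 0.9 * strong_evidence + 0.5 * seen_unintentional
            + 1.0 * felt_responsible)
intervened = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

# Fit the model and read off the coefficient profile
X = sm.add_constant(np.column_stack(
    [higher_status, close_contact, strong_evidence,
     seen_unintentional, felt_responsible]))
print(sm.Logit(intervened, X).fit(disp=False).summary())
```

Read this way, a positive fitted coefficient (e.g., on strong_evidence) raises the odds of intervening and a negative one (e.g., on close_contact) lowers them, which is how a profile like the one above is extracted from the fitted model.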
Those Who Did Not Act
• About a third of participants did not take action
regarding any incident they shared with us.
▫ The largest group revealed that they felt too remotely
involved or knew that others were already taking
action.
▫ Another third claimed they simply did not know what
to do.
▫ Reluctance to deal with a suspected offender perceived as a difficult person, or one who was their superior, was another common reason for inaction, as was an unwillingness to act when evidence seemed insufficient.
• Social relationships, job security, and status become
more salient in close working conditions. So perhaps
understandable, but also disappointing, was the
finding that those who worked closely with
suspected wrongdoers were less likely to take any
action. Thus, the best opportunity to observe wrongs and to stop or correct them often goes unused.
• Finally, we asked if those who took no action on
their suspicions experienced lingering reservations.
Forty percent of those who did not get involved,
even though they had direct evidence of
wrongdoing, still felt misgivings, sometimes even
after many years had passed.
Culture shifting
• Actively engaging colleagues with gentle
alternatives to whistleblowing
▫ Offering help
▫ Expressing concern
▫ The “Bullwinkle” approach
• Encouraging reporting of near-miss situations
• Apologizing when appropriate
Understanding how patients view the
care they receive
• Attending to and communicating about errors
• The power of apology
Creating a culture that encourages
reporting of human error
• Near-miss recognition and reporting
systems
• Creating a safe climate for sharing
concerns in a professional manner and
context