Protecting Human Subjects at KUMC


KUMC Quality Assurance Program for Human Research
Karen Blackwell, MS, CIP
Director, Human Research Protection Program
Overview

- Rationale for proposing a QA program
- Activities of the QA Task Force
- Task Force recommendations
- Next steps
Quality Assurance…

- Support and education for investigators
- Routine on-site reviews of study records
- Preparation for external audits
- For-cause audits, when required
- Feedback to the overall HRPP
Rationale for a QA Program

- Reflect our commitment to excellence
- Coordinate efforts within KUMC
- Prevent compliance violations
- Meet contractual and fiduciary duties
- Address known challenges
EVC’s Charge to the Task Force

- Examine model programs
- Identify key individuals and groups
- Optimize existing resources
- Develop standard operating procedures
- Establish reporting paths
- Develop a communication plan
- Report back by September 1st
QA Task Force Members
April – August 2009

- Ed Phillips
- Jeff Reene
- Paul Terranova
- Marge Bott
- Gary Doolittle
- Patty Kluding
- Greg Kopf
- Karen Blackwell
- Jo Denton
- Diana Naser
- Becky Hubbell
- Monica Lubeck
Model Programs

- University of Pittsburgh
- Partners HealthCare System
- University of Michigan
- Emory University
- Indiana University
- University of California – San Francisco
- Baylor College of Medicine
- Children’s Hospital of Boston
Task Force Recommendations

- Overall philosophy for our program
- Key components of a QA program
- Resources, milestones, timelines
- Leadership and oversight
Key Components

- Support from institutional leadership
- Clear delineation of roles
- Transparent criteria for study selection
- Standard operating procedures
- Lines of authority to report audit findings
- Methods to translate findings into education and support for investigators
Philosophy of the QA Program

- Partnership
- Focus on education and assistance
- Collegial approach
- Soliciting investigator feedback

(See Appendix C and Appendix D)
Evaluating Resources

| Institution   | Target % of studies reviewed | Routine reviews per year                     | FTEs                     | Annual routine reviews per FTE | Team or single reviewer |
|---------------|------------------------------|----------------------------------------------|--------------------------|--------------------------------|-------------------------|
| Institution 1 | 5%                           | None; currently focusing on for-cause audits | 3.5                      | n/a                            | Teams when needed       |
| Institution 2 | 2%                           | 1st year: 30; 2nd year: 70; 3rd year: 85     | 2.3                      | 36                             | Single reviewer         |
| Institution 3 | 8%                           | 100 – 110                                    | 3.5, hiring 2 more FTEs  | 31                             | Teams of 2 or 3         |
| Institution 4 | 10%                          | 200+                                         | 5                        | 40                             | Single reviewer         |
Recommended Milestones

- Aim for 5 – 8% of our 1020 studies
- 5% = 51 reviews annually; 8% = 81 reviews
- 1st year: 2.5%; 2nd year: 5 – 8%
- Minimum staff of 2 FTEs
- As research volume grows, adjust staff to maintain the 5 – 8% target
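The staffing arithmetic behind these milestones can be sketched as a quick calculation. The 1020-study volume comes from the slides; the per-FTE throughput of 36 routine reviews per year is one of the benchmark figures from the "Evaluating Resources" table, and the function name is illustrative:

```python
def staffing_estimate(active_studies, review_rate, reviews_per_fte):
    """Annual routine reviews implied by a target review rate, and the FTEs
    needed to perform them at a given per-reviewer throughput."""
    reviews = int(active_studies * review_rate)  # slide figures truncate: 8% of 1020 -> 81
    ftes = reviews / reviews_per_fte
    return reviews, ftes

# 5% and 8% of 1020 active studies, at ~36 routine reviews per FTE
low = staffing_estimate(1020, 0.05, 36)   # 51 reviews, under 2 FTEs
high = staffing_estimate(1020, 0.08, 36)  # 81 reviews, 2.25 FTEs
```

At the benchmark throughputs of 31 – 40 reviews per FTE, the 8% target works out to roughly 2 – 2.6 FTEs, consistent with the recommended minimum staff of 2.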
Input and Oversight

- Model programs recommended ongoing faculty input
- Guidance from the Clinical Research Advisory Committee
- Oversight by executive leadership:
  - Vice Chancellor for Administration
  - Vice Chancellor for Research
  - RI Executive Director
Timelines

- Task Force Report to EVC: September 2009
- Presentations to leadership and investigators: October – December 2009
- Final versions of Standard Operating Procedures: November – December 2009
- Solicitation of investigators for voluntary reviews: November – December 2009
- Program launch: January 2010
- Six-month program evaluations: June 2010, January 2011, June 2011
- Annual reports to the EVC and CRAC: January 2011, January 2012
Implementation:

- Study selection
- On-site review
- Feedback
- Corrections as needed
- Trend analysis
Study Selection

Tier 1:
- Federally or internally funded
- Moderate to high risk
- IND/IDE holders
- KUMC role as coordinating center
- Vulnerable populations
- Conflicts of interest (COI)

Tier 2 (other studies)
Review Process

- PI is notified
- Review is scheduled with ~2 weeks' notice
- On-site review
  - Routine reviews: 20 – 30% of records
  - For-cause: up to 100%
Scope of the Review

- IRB-approved documents
- Signed consent forms
- Study data, e.g.,
  - Inclusion/exclusion decisions
  - Outcomes of assessments and procedures
  - Source documents
- Adverse events or problems
- Drug/device accountability
Common Findings at Other Sites

- Missing correspondence or approvals
- Informed consent issues
  - Expired or invalid consents
  - Not dated or signed correctly
  - Consent by unauthorized persons
- Incomplete study records
Serious Findings

- Protocol non-compliance
- Inadequate study records
- Unreported adverse events or deviations
- Lack of drug/device accountability
- Unapproved research
Observations and Corrections

- Exit interview
- Draft report to the PI within 7 days
- PI responds within 14 days with corrections of errors, clarifications, and a corrective action plan (if needed)
- Final report to the PI and to the HSC
Reporting Findings

- All final reports go to the HSC office
- Minor non-compliance is reviewed by the chair
- Potentially serious non-compliance goes to the convened HSC
  - Evaluate corrective action plans
  - Follow-up as appropriate
Getting Feedback

- Exit interviews
- Survey to investigators
- Input from the CRAC
- Cumulative results impact the overall program

Feedback? Questions?