CDRH Seminar - Perelman School of Medicine at the University of Pennsylvania
The Pragmatic Clinical Trial in a Learning Healthcare System
8th Annual Conference on Statistical Issues in Clinical Trials
University of Pennsylvania
Roger J. Lewis, MD, PhD
Department of Emergency Medicine
Harbor-UCLA Medical Center
David Geffen School of Medicine at UCLA
Los Angeles Biomedical Research Institute
Berry Consultants, LLC
1
Financial Disclosures
• Berry Consultants, LLC
– Multiple clients
• Support from
– National Institutes of Health/NINDS
– Venaxis, Inc.
• Prior support from
– Food and Drug Administration
– Octapharma AG
2
Intellectual Disclosure
• This presentation includes the intellectual work of
multiple colleagues
– Derek C. Angus, MD, MPH
– Don A. Berry, PhD
– Scott M. Berry, PhD
– Jean-Daniel Chiche, MD
– Jason Connor, PhD
– John Marshall, MD
– Will Meurer, MD, MS
– Alistair Nichol, MD
– Steve Webb, MD
– Leadership from the European PREPARE Consortium (Herman Goossens, Marc Bonten, and others)
– And others!
3
The Learning Healthcare System
“The nation needs a healthcare system
that learns.” (IOM 2007)
• Adaptation to the pace of change
• New clinical research paradigm
• Universal electronic health records and
clinical decision support systems
• Narrowing the research-practice divide
4
The 3048-Meter View of a Pragmatic Trial in a LHS
[Schematic with elements: heterogeneous patient population; best standard care; randomized, adaptive treatment allocation; adaptive algorithm with considerations of HTE; EHR data; selected outcome data; ethical integrity (consent, privacy).]
5
Key Components
• Healthcare systems with informatics systems,
leadership, and commitment to learn
– Veterans Health Administration
– European PREPARE Consortium
– PCORI National Patient-Centered Clinical Research
Network (PCORnet)
• Multiple treatment domains and factors to be
investigated
• Flexible, adaptive trial algorithm for assigning
treatments, evaluating effects, and drawing
conclusions (“platform trial”)
6
Motivation for Adaptive Trials
• When designing a trial there is substantial
uncertainty regarding how best to treat subjects
in the experimental arm (e.g., uncertainty in
optimal dose, best duration, target population)
• This creates uncertainty in the optimal trial
design
• Traditionally, all key trial parameters must be
defined and held constant during execution
• This leads to increased risk of negative or failed
trials, even if a treatment is inherently effective
7
Motivation for Adaptive Trials
• Once patients are enrolled and their outcomes
known, information accumulates that reduces
uncertainty regarding optimal treatment
approaches
• Adaptive clinical trials are designed to take
advantage of this accumulating information, by
allowing modification to key trial parameters in
response to accumulating information and
according to predefined rules
8
JAMA 2006;296:1955-1957.
9
Adaptation: Definition
• Making planned, well-defined changes in
key clinical trial design parameters, during
trial execution based on data from that
trial, to achieve goals of validity, scientific
efficiency, and safety
– Planned: Possible adaptations defined a priori
– Well-defined: Criteria for adapting defined
– Key parameters: Not minor inclusion or
exclusion criteria, routine amendments, etc.
– Validity: Reliable statistical inference
10
The Adaptive Process
[Flowchart: begin data collection with the initial allocation and sampling rules → analyze available data → if the stopping rule is not met, revise the allocation and sampling rules per the adaptive algorithm and continue data collection; if it is met, stop the trial or begin the next phase in a seamless design.]
11
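As a minimal, hypothetical sketch of the loop in the flowchart above (not the algorithm of any trial discussed here), the following Python simulates a two-arm trial with a binary outcome, Beta(1, 1) priors, interim analyses every 50 patients, a posterior-probability stopping rule, and a simple response-adaptive update of the allocation rule; all numbers are illustrative.

```python
# Hypothetical sketch of the adaptive loop: collect a batch, analyze, check the
# stopping rule, and revise the allocation rule; repeat until stopping or max N.
import numpy as np

rng = np.random.default_rng(1)
true_rates = [0.30, 0.40]           # assumed "reality" for this illustration
alloc = np.array([0.5, 0.5])        # initial allocation rule
successes = np.zeros(2)
failures = np.zeros(2)
batch, max_n = 50, 1000

for n in range(0, max_n, batch):
    # Continue data collection under the current allocation rule
    arms = rng.choice(2, size=batch, p=alloc)
    outcomes = rng.random(batch) < np.take(true_rates, arms)
    for a in range(2):
        successes[a] += np.sum(outcomes[arms == a])
        failures[a] += np.sum(~outcomes[arms == a])

    # Analyze the available data: posterior draws of each arm's response rate
    draws = rng.beta(1 + successes, 1 + failures, size=(10_000, 2))
    p_arm1_better = np.mean(draws[:, 1] > draws[:, 0])

    # Stopping rule met?
    if p_arm1_better > 0.99 or p_arm1_better < 0.01:
        print(f"Stop after {n + batch} patients; P(arm 1 better) = {p_arm1_better:.3f}")
        break

    # Revise the allocation rule per the adaptive algorithm (simple RAR here)
    alloc = np.clip([1 - p_arm1_better, p_arm1_better], 0.1, 0.9)
    alloc = alloc / alloc.sum()
else:
    print("Maximum sample size reached without meeting the stopping rule")
```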
Why Do Adaptive Clinical Trials?
• Usual Reasons
– To avoid getting the wrong answer!
– To avoid taking too long to draw the right
conclusion
• In the setting of a learning healthcare
system
– To learn about effectiveness, and apply what
we learn, simultaneously
– To continually improve patient outcomes
12
Selected Adaptive Strategies
• Frequent interim analyses
• Response-adaptive randomization to efficiently
address one or more trial goals
• Explicit decision rules based on Bayesian predictive probabilities at each interim analysis (see the sketch after this slide)
• Enrichment designs
• Extensive simulations of trial performance
13
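One strategy listed above is decision rules based on Bayesian predictive probabilities. As an illustration only (the thresholds, priors, and sample sizes are assumptions, not those of any trial described here), this sketch estimates the predictive probability that a two-arm, fixed-maximum-size trial will end with a posterior probability of benefit above 0.975, given the data available at an interim analysis.

```python
# Hypothetical sketch: predictive probability of final success, given interim
# data, for a two-arm binary-outcome trial with Beta(1, 1) priors.
import numpy as np

rng = np.random.default_rng(2)

# Interim data (illustrative numbers only)
n_interim, n_final = 100, 200          # patients per arm at interim / final analysis
x_ctrl, x_trt = 30, 42                  # responders observed so far

def prob_trt_better(xc, nc, xt, nt, n_draws=4_000):
    """Posterior probability that the treatment response rate exceeds control."""
    pc = rng.beta(1 + xc, 1 + nc - xc, n_draws)
    pt = rng.beta(1 + xt, 1 + nt - xt, n_draws)
    return np.mean(pt > pc)

def predictive_prob_success(n_sim=1_000, win_threshold=0.975):
    """Simulate the remaining patients from the posterior predictive distribution
    and count how often the completed trial would declare success."""
    n_remaining = n_final - n_interim
    wins = 0
    for _ in range(n_sim):
        # Draw plausible "true" rates from the current posteriors
        pc = rng.beta(1 + x_ctrl, 1 + n_interim - x_ctrl)
        pt = rng.beta(1 + x_trt, 1 + n_interim - x_trt)
        # Complete the trial under those rates
        xc_fin = x_ctrl + rng.binomial(n_remaining, pc)
        xt_fin = x_trt + rng.binomial(n_remaining, pt)
        if prob_trt_better(xc_fin, n_final, xt_fin, n_final) > win_threshold:
            wins += 1
    return wins / n_sim

print(f"Predictive probability of success: {predictive_prob_success():.2f}")
```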
14
Response-adaptive Randomization
• Response-adaptive randomization to
improve important trial characteristics
• May be used to address one or more of:
– To improve subject outcomes by preferentially
randomizing patients to the better performing arm
– To improve the efficiency of estimation by
preferentially assigning patients to doses in a
manner that increases statistical efficiency
– To improve the efficiency in addressing multiple
hypotheses by randomizing patients in a way that
emphasizes sequential goals
– Includes arm dropping
15
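As a hedged illustration of the first use above (improving subject outcomes), the sketch below sets allocation probabilities proportional to each arm's posterior probability of being best, with a floor so no arm is starved of information; the weighting actually used in a given trial is a design choice and may differ.

```python
# Hypothetical sketch: response-adaptive allocation probabilities for a
# multi-arm, binary-outcome trial, proportional to P(arm is best).
import numpy as np

rng = np.random.default_rng(3)

def rar_probabilities(successes, failures, floor=0.05, n_draws=50_000):
    """Allocation probabilities ~ posterior probability that each arm is best,
    with a floor to protect estimation in poorly performing arms."""
    successes = np.asarray(successes, dtype=float)
    failures = np.asarray(failures, dtype=float)
    draws = rng.beta(1 + successes, 1 + failures, size=(n_draws, len(successes)))
    p_best = np.bincount(np.argmax(draws, axis=1), minlength=len(successes)) / n_draws
    probs = np.maximum(p_best, floor)
    return probs / probs.sum()

# Example: standard of care plus three doses, with accumulating data
successes = [20, 22, 30, 28]
failures = [80, 78, 70, 72]
print(rar_probabilities(successes, failures).round(2))
```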
Example Learning Strategy
[Schematic: allocation among the Standard, Low Dose, Med Dose, and High Dose arms evolves over time through four phases: "burn in," RAR focused on dose, RAR focused on confirmation, and a final treatment recommendation. Horizontal axis: time (patient accrual from start).]
16
17
Platform Trial
• An experimental infrastructure to evaluate
multiple treatments, often for a group of
diseases, and intended to function continually
and be productive beyond the evaluation of any
individual treatment
– Designed around a group of related diseases rather
than a single treatment
– Dynamic list of available treatments, assigned with
response-adaptive randomization
– Preferred treatments may depend on health system,
patient, or disease-level characteristics
18
JAMA. Published online March 23, 2015. doi:10.1001/jama.2015.2316
19
From: The Platform Trial: An Efficient Strategy for Evaluating Multiple Treatments
JAMA. Published online March 23, 2015. doi:10.1001/jama.2015.2316
Table: General Characteristics of Traditional and Platform Trials
20
Platform Trial
[Timeline schematic: treatment arms enter and leave the platform over time: initial usual care, 1st-generation "A," 2nd-generation "A," Drug B, and "A + B."]
21
Platform Trial
[Timeline schematic stratified by patient subtype: Subtype A progresses through initial usual care, 1st-generation "A," 2nd-generation "A," Drug B, and "A + B"; Subtype B through initial usual care, 1st-generation "A," 2nd-generation "A," and Drug B.]
22
Platform Trial Terminology
• Domain
– An area of treatment or care being investigated
– Ex: ventilator management
• Factor
– One particular approach within a domain
– Ex: 6cc/kg tidal volume ventilation
• Regimen
– The assigned collection of factors from
multiple domains
23
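To make the vocabulary concrete, here is a minimal, hypothetical data-structure sketch (the names and factor lists are illustrative, not the actual trial software): a regimen is one factor drawn from each domain, so the list of regimens is the Cartesian product of the domains.

```python
# Hypothetical sketch of the platform-trial vocabulary: domains, factors, regimens.
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class Domain:
    name: str                  # e.g., "ventilator management"
    factors: tuple[str, ...]   # candidate approaches within the domain

domains = [
    Domain("anti-infective", ("quinolone", "combination")),
    Domain("immunomodulation", ("hydrocortisone", "none")),
    Domain("ventilation", ("6cc/kg", "4cc/kg")),
]

# A regimen is one factor chosen from each domain.
regimens = list(product(*(d.factors for d in domains)))
print(len(regimens), "regimens, e.g.:", regimens[0])
```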
Framing the Possible
Conclusions of a Platform Trial
• The number of regimens can be very large
– Very difficult to show an individual regimen
has a very high probability of being best within
a patient subgroup
• May draw a conclusion when
– There is a very high (or low) probability that a
given factor is included in the best regimen(s)
– This allows you to reduce the complexity of
the problem over time and improve the
treatment and outcomes of patients
24
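The second bullet can be computed directly from posterior samples. In this illustrative sketch the posterior draws of regimen-level effects are stand-in random numbers; with real draws, the probability that a factor belongs to the best regimen is simply the fraction of draws in which the best regimen contains that factor.

```python
# Hypothetical sketch: probability that each factor is part of the best regimen,
# computed from posterior draws of regimen-level effects.
import numpy as np
from itertools import product

rng = np.random.default_rng(4)

# Three two-level domains -> 8 regimens; each regimen is a tuple of factor indices.
regimens = list(product([0, 1], repeat=3))

# Stand-in for real posterior draws of each regimen's mean outcome
# (rows = posterior draws, columns = regimens); higher is better here.
posterior_draws = rng.normal(loc=0.0, scale=1.0, size=(10_000, len(regimens)))

best = np.argmax(posterior_draws, axis=1)      # index of the best regimen per draw
for domain in range(3):
    for factor in (0, 1):
        in_best = np.mean([regimens[b][domain] == factor for b in best])
        print(f"P(domain {domain}, factor {factor} in best regimen) = {in_best:.2f}")
```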
Randomized, Embedded, Multifactorial, Adaptive Platform (REMAP) Trial
• Randomized
• Embedded
• Multifactorial
• Adaptive
• Platform
Thanks to Derek C. Angus, MD, MPH, University of Pittsburgh Medical Center, for "REMAP"
25
Examples
27
28
Embedding the Clinical Trial
29
Research Processes
30
Response-adaptive Randomization
31
The PREPARE Consortium
• Platform foR European Preparedness
Against (Re)emerging Epidemics
– 25 million euro FP7 strategic award
• Work Package #5 – ‘PREPARE CAP’
– An adaptive trial platform to determine best
care for severe acute respiratory failure of
presumed infectious origin
• Endemic - severe CAP
• Epidemic – severe respiratory viral illness (e.g.,
H1N1)
32
The PREPARE Consortium
33
Scope of PREPARE CAP
• Simultaneously considers
– Anti-infective strategies (i.e., antibiotic choice)
– Host response modulation (i.e., steroids)
– Organ support strategies (i.e., mechanical ventilation)
• Design stratifies by different subgroups
– Shock or not
– Severe vs. moderate hypoxemia
• With five "yes/no" strata (treatments and subgroups) there are 2⁵ = 32 "cells" (enumerated in the sketch after this slide)
• Additional complexity
– Add or drop factors by region, country, patient subgroup, and season (e.g., influenza season)
34
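A trivial enumeration of the 2⁵ = 32 cells, with illustrative stratum labels (the labels are assumptions based on the domains and subgroups listed above):

```python
# The five yes/no strata (three treatments, two subgroups) define 2**5 = 32 cells.
from itertools import product

strata = ["combination antibiotics", "hydrocortisone", "low tidal volume",
          "shock", "severe hypoxemia"]
cells = list(product([False, True], repeat=len(strata)))
print(len(cells))   # 32
```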
Embedding the Trial into Routine Care
• For EVERY patient who presents with severe CAP, the clinical team calls the IVRS for the patient's unique order set
35
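A minimal sketch of what might sit behind such an IVRS call (the function, field names, and probabilities are hypothetical, not the PREPARE system): given the patient's subgroup, it samples a regimen from the current allocation probabilities and returns the corresponding order set.

```python
# Hypothetical sketch of the randomization step behind an IVRS call:
# sample a regimen from the current, subgroup-specific allocation probabilities.
import numpy as np

rng = np.random.default_rng(6)

# Current allocation probabilities by subgroup (illustrative values only);
# in a real system these would come from the latest statistical-model update.
allocation = {
    "shock": {("Quinolone", "Hydrocortisone", "6cc/kg"): 0.4,
              ("Combination Rx", "Hydrocortisone", "4cc/kg"): 0.6},
    "no_shock": {("Quinolone", "None", "6cc/kg"): 0.5,
                 ("Combination Rx", "None", "4cc/kg"): 0.5},
}

def get_order_set(subgroup: str) -> dict:
    """Return the assigned order set for a newly presenting patient."""
    regimens = list(allocation[subgroup])
    probs = np.array([allocation[subgroup][r] for r in regimens])
    anti_infective, immunomodulation, ventilation = regimens[rng.choice(len(regimens), p=probs)]
    return {"anti_infective": anti_infective,
            "immunomodulation": immunomodulation,
            "ventilation": ventilation}

print(get_order_set("shock"))
```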
Starting Conditions/Burn In
Regimen  Anti-infective   Immunomodulation  Ventilation strategy
#1       Quinolone        Hydrocortisone    6cc/kg
#2       Quinolone        Hydrocortisone    4cc/kg
#3       Quinolone        None              6cc/kg
#4       Quinolone        None              4cc/kg
#5       Combination Rx   Hydrocortisone    6cc/kg
#6       Combination Rx   Hydrocortisone    4cc/kg
#7       Combination Rx   None              6cc/kg
#8       Combination Rx   None              4cc/kg
• Equal randomization for the first 200 patients (burn-in)
• RAR driven by a full factorial statistical model
– A term for each single factor
– Two-factor interactions
– Three-way interactions
– Priors expecting low interactions, while allowing for learning
36
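A minimal sketch of the kind of design matrix such a full factorial model might use, with the three two-level factors above coded ±1 and columns for main effects, two-way, and three-way interactions; the actual PREPARE CAP model, outcome, and priors are more elaborate and are not reproduced here.

```python
# Hypothetical sketch: design matrix for a full factorial model over the eight
# regimens above, with main effects plus 2-way and 3-way interaction terms.
import numpy as np
from itertools import combinations, product

labels = ["anti_infective", "immunomodulation", "ventilation"]

# Code each two-level factor as -1/+1; the eight rows are the eight regimens.
regimens = np.array(list(product([-1, 1], repeat=3)))

columns, names = [np.ones(len(regimens))], ["intercept"]
for k in (1, 2, 3):                                 # main effects, 2-way, 3-way
    for idx in combinations(range(3), k):
        columns.append(np.prod(regimens[:, list(idx)], axis=1))
        names.append("*".join(labels[i] for i in idx))

X = np.column_stack(columns)                        # 8 regimens x 8 parameters
print(names)
print(X.astype(int))

# In a Bayesian fit, the interaction coefficients would be given tighter
# (shrinkage) priors than the main effects, encoding "expect low interactions"
# while still allowing the data to reveal them.
```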
Trial Schematic
[Series of schematics (slides 37-46). The diagram is built up step by step: three treatment domains (Domain 1, Domain 2, Domain 3), each containing candidate factors; patient data flowing into a database; a statistical model analyzing the accumulating data; and an IVRS delivering the model-driven randomization to the clinical team. Later slides show adding a factor within a domain, adding a new domain (Domain 4), and selecting a factor once the evidence supports it.]
46
Trial Simulation
[Schematic: virtual patients, an assumed "reality," and execution variables are used to "observe" single simulated trials, which are summarized as operating characteristics (error rates and power).]
48
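As a hedged sketch of this simulation workflow (the design, thresholds, and response rates are all assumptions), the code below generates many virtual two-arm trials under an assumed "reality" and summarizes the operating characteristics: the type I error rate under no benefit and the power under a 10-percentage-point benefit.

```python
# Hypothetical sketch of trial simulation: simulate many virtual trials under an
# assumed "reality" and summarize operating characteristics (type I error, power).
import numpy as np

rng = np.random.default_rng(5)

def one_trial(p_ctrl, p_trt, n_per_arm=200, win_threshold=0.975):
    """Simulate one two-arm trial; 'success' if the posterior probability of
    benefit exceeds the threshold at the final analysis (Beta(1, 1) priors)."""
    x_ctrl = rng.binomial(n_per_arm, p_ctrl)
    x_trt = rng.binomial(n_per_arm, p_trt)
    pc = rng.beta(1 + x_ctrl, 1 + n_per_arm - x_ctrl, 10_000)
    pt = rng.beta(1 + x_trt, 1 + n_per_arm - x_trt, 10_000)
    return np.mean(pt > pc) > win_threshold

def operating_characteristics(p_ctrl, p_trt, n_sims=2_000):
    return np.mean([one_trial(p_ctrl, p_trt) for _ in range(n_sims)])

print("Type I error (truth: no benefit):", operating_characteristics(0.30, 0.30))
print("Power (truth: 10-point benefit): ", operating_characteristics(0.30, 0.40))
```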
Simulated Trial Performance
[Plot: simulated trial performance as a function of the assumed truth.]
49
Simulated Trial Performance
[Plot: simulated trial performance versus the assumed truth, with a region labeled "Opportunity."]
51
Conclusions
• Adaptive trial designs can be used to create a
seamless process in which new evidence
about effectiveness is immediately used to
improve patient care
• A platform trial can extend this process beyond a single treatment or a few treatments
• Current work is focused on embedding this
approach into the health care infrastructure
• Patients will benefit if we merge clinical trials
and decision support into a single, continuous
process
52