Transcript Slide 1
Methodologic Issues and Next
Steps in the Implementation of
the Results Database
Deborah A. Zarin, M.D.
ClinicalTrials.gov
February 24, 2009
1
History of ClinicalTrials.gov
• FDAMA 113 (1997): Mandates Registry
– IND trials for serious and life-threatening diseases or
conditions
• ClinicalTrials.gov Launched in February 2000
• Calls for Increased Transparency of Clinical Trials
– Maine State Law; State Attorneys General
– Journal Editors (2004)
• ClinicalTrials.gov Accommodates Other Policies
• FDAAA 801 (2007): Expands Registry and Adds
Results Database
2
Public Law 110-85
Sec.801 Expanded Clinical Trial Registry
• Enacted on September 27, 2007
• Requires Trial Registration (Dec 2007)
– Phase II-IV drug and device trials for all diseases
– Data elements: ClinicalTrials.gov + ~ WHO/ICMJE
• Requires Results Reporting (Sept 2008)
– Trials of FDA-approved or cleared drugs and devices
– “Basic” Results: Baseline Characteristics, Primary &
Secondary Outcomes, Statistical Analyses
– Adverse Events (Sept 2009)
– “Expansion” of results by rulemaking (Sept 2010)
• Added enforcement provisions
3
Enforcement Provisions
• Notices of non-compliance
• Civil monetary penalties up to $10,000/day
• Withholding of NIH grant funds
Key Terms
• Applicable Clinical Trials
– Interventional trials
– Phase 2-4 drug, biologic, device
– >= one site in U.S.
– Ongoing as of 9/27/07, or later
• Responsible Party
– Sponsor, grantee
– PI if designated
• Completion Date
New Registrations
Continue to Increase
Number of New Records Since May 1, 2005
[Line chart: cumulative number of new records by date, 2005 through late 2008; y-axis scaled from 0 to 70,000 records]
ClinicalTrials.gov Statistics
(as of 02/03/2009)
Total: 67,064 (100%)
Type of Trial*
– Observational: 10,690 (16%)
– Interventional: 57,119 (84%)
  – Drug & Biologic: 42,684
  – Surgical Procedure: 8,585
  – Behavioral, Gene Transfer, Other: 7,997
  – Device**: 3,862
International Sites (161 countries)
– US only: 32,772 (49%)
– Non-US only: 23,109 (34%)
– US & Non-US mixed: 4,064 (6%)
– Missing: 7,119 (11%)
*171 records missing Study Type information
**173 device trials – “delayed posting”
7
ClinicalTrials.gov Statistics (cont.)
(as of 02/03/2009)
Trials by Data Provider
– US Federal (including NIH): 18,088 (27%)
– Industry: 21,072 (31%)
– University, Other: 28,820 (42%)
– Total: 67,980
User Statistics
– Page Views per month: 40 Million
– Unique visitors per month: 500,000
8
Basic Results Database
9
Basic Results Database:
General Characteristics
• Results of “applicable clinical trials” of
FDA-approved/cleared medical products
• Generally, submission within 12 months of
the earlier of estimated or actual trial
completion date (of primary outcome)
• Delayed Submission of Results
– Seeking approval of a new use
– Extensions for “good cause”
• E.g. data are still blinded
• NOT—awaiting publication
10
Wood AJJ. Progress and deficiencies in the registration of clinical trials. NEJM. 2009
Basic Results Modules
• Participant Flow
• Baseline and Demographic Characteristics
• Outcome Measures
• Adverse Events (summary data)
• Other Information
– “Certain Agreements” Restricting Results Disclosure
– Overall Limitations and Caveats
– Results Point of Contact
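To make the module structure concrete, here is a minimal sketch of how a basic results record could be organized around these five modules. The field names and layout are hypothetical, not the actual ClinicalTrials.gov PRS schema; the illustrative numbers echo examples shown later in this presentation.

# Hypothetical sketch of a basic results record built from the five modules above.
basic_results_record = {
    "participant_flow": {
        "periods": [{
            "title": "Initial Treatment",
            "milestones": {
                "STARTED": {"Placebo": 220, "Drug X": 211},
                "COMPLETED": {"Placebo": 218, "Drug X": 210},
            },
            "reasons_not_completed": {"Placebo": 2, "Drug X": 1},
        }]
    },
    "baseline_characteristics": {
        "measures": [{"title": "Age", "units": "years", "param": "Mean",
                      "dispersion": "Standard Deviation",
                      "values": {"Placebo": (52.1, 8.3), "Drug X": (51.7, 8.9)}}]
    },
    "outcome_measures": [{
        "type": "Primary",
        "title": "Change in Sitting DBP From Baseline to End of Study",
        "units": "mmHg",
        "time_frame": "Baseline to 12 weeks",
        "values": {"Placebo": -6.7, "Drug X": -8.4},
        "statistical_analyses": [],
    }],
    "adverse_events": {"serious": [], "other": []},  # summary data (required Sept 2009)
    "other_information": {
        "certain_agreements": {"pi_employee": False, "restriction_type": None},
        "limitations_and_caveats": "",
        "point_of_contact": {"name": "", "organization": ""},
    },
}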
Key Issues
1. The Results Database is Working
2. Is the Results Database a Good Thing?
3. Who is the Audience?
4. What have we learned so far?
5. To Be Considered…
1. The Results Database is
Working
Current Status – “Basic Results”
(as of 02/06/09)
• Functional Web-based Data Entry System
• Launched in September 2008
• Ongoing system of feedback and
improvements
• 410 Results Records have been submitted
• Industry: 293 records from 72 data providers
• Other: 117 records from 80 data providers
• Anticipate greatly increased rate of
submission
16
Results Data Entry Process
Technical Issues
17
Design Requirements
• Display consists of data tables with minimal
text—must be self-explanatory
• System must accommodate range of study
designs and facilitate comparison across studies
• NLM directed to:
– Consider different methods of display based on
principles of risk communication for different
audiences
– Ensure the data are searchable in many ways
• Structured data entry required to facilitate search
and display needs
18
Design Features
• Tables are “constructed” by the data
provider
– Columns are pre-set as study arms, but can
be changed by the data provider
– Rows are measures—some are pre-set,
others are customized for each study
– Type of measure determines specific design
of “cells”
• Attempt to balance fixed structure with
flexibility
19
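A rough sketch of how such a table might be assembled, assuming hypothetical helper names: the study arms become the pre-set columns, the measures become rows, and the parameter type of each measure determines how its cells are rendered.

# Illustrative sketch only: arms as columns, measures as rows,
# with the measure's parameter type deciding the cell format.
def format_cell(param_type, value, dispersion=None):
    """Render one cell according to the type of measure."""
    if param_type == "Number":
        return str(value)
    if param_type == "Mean" and dispersion is not None:
        return f"{value} \u00b1 {dispersion}"        # Mean ± Standard Deviation
    if param_type == "Median" and dispersion is not None:
        low, high = dispersion
        return f"{value} ({low} to {high})"          # Median (Inter-Quartile Range)
    return str(value)

def build_table(arms, measures):
    """Return rows of [label, one cell per arm], with arms as pre-set columns."""
    rows = [["Measure"] + arms]
    for m in measures:
        cells = [format_cell(m["param"], *m["values"][arm]) for arm in arms]
        rows.append([f'{m["title"]} [units: {m["units"]}]'] + cells)
    return rows

arms = ["Placebo", "Drug X"]                         # columns default to study arms
measures = [
    {"title": "Number of Participants Analyzed", "units": "Participants",
     "param": "Number", "values": {"Placebo": (148,), "Drug X": (153,)}},
    {"title": "Hours Per Day of Sleep", "units": "Hours",
     "param": "Mean", "values": {"Placebo": (7.1, 1.2), "Drug X": (7.9, 1.4)}},
]
for row in build_table(arms, measures):
    print("\t".join(row))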
Principles for Using the Basic
Results Database
• Submitted data are used to develop basic
tables for the public display
• Tables must be interpretable by people not
familiar with each particular study
• Labels for rows, columns, and units of
measure must be meaningful and precise
20
Resources to Help Data Providers
• “Helpful Hints”
– Illustrates process for entering different study
designs (parallel, crossover, diagnostic
accuracy, bioequivalence—in progress)
• Webinar
• “Common Errors”
• Individual discussions regarding particular
studies
• Presentations
Results Database:
Challenges
• Data tables will be a public representation
of the study—must be clear and
informative
• Many entries are not logical or informative
• Posting such entries will embarrass all
parties and will not contribute to the public
good
22
Sample Posted Results
[Slides 23–36: screenshots of sample posted results, with callouts for Arms, Milestone, Reasons Not Completed, a Crossover Design with multiple “Periods,” “Default” Required Measures, a User-Specified Measure, Categories, and Statistical Analysis]
Example: Study Record with Results including 29 Posted Outcome Measures
36
Results Data Submissions
(as of 2/23/09)
• 432 Studies Total
• By Sector
– 311 (72%) Private
– 121 (28%) Public
• Data Providers
– 76 Private (Mean: 4.1 studies/org)
– 104 Public (Mean: 0.9 studies/org)
Results Data Posted
(as of 2/20/09)
• 54 Total
• Study Type
– 3 (6%) Observational
– 51 (94%) Interventional
• Provider Category
– 3 (6%) NIH
– 35 (65%) Industry
– 16 (29%) Other
Results Data Posted (cont.)
(as of 2/20/09)
• Intervention Type
– 39 Drug & Biologic
– 6 Device
– 4 Behavioral
– 1 Medical Procedure
• Study Design
– 39 Randomized
– 23 Masked, double
– 15 Control, active
– 13 Control, placebo
• Adverse Events (optional)
– 7/54 (13%) Studies
• Phase
– 5 Phase 1
– 9 Phase 2
– 25 Phase 3
– 11 Phase 4
– 9 N/A
• Outcome Measures
[mean (range)]
– 6.6 Primary (1-49)
– 6.4 Secondary (0-49)
– <1 Other Prespecified (0-8)
• 33 of 87 include Adverse Event module
2. Is the Results Database a
Good Thing?
Powerful Search Features
• Synonymy
– UMLS, etc
• Hierarchies
• Spelling relaxation
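The slide does not show how these features are implemented; the sketch below only illustrates the idea of expanding a user's query with synonyms, broader/narrower terms, and spelling variants before matching records. The synonym and hierarchy tables are tiny, made-up stand-ins for UMLS-scale vocabularies.

import difflib

# Made-up, miniature stand-ins for the vocabulary data (UMLS, MeSH, etc.)
# that powers synonymy and hierarchy expansion on the real site.
SYNONYMS = {"heart attack": {"myocardial infarction", "mi"}}
NARROWER = {"cardiovascular disease": {"myocardial infarction", "stroke", "heart failure"}}
KNOWN_TERMS = ["myocardial infarction", "stroke", "heart failure", "diabetes mellitus"]

def expand_query(term):
    """Expand a query term via synonymy, hierarchy, and spelling relaxation."""
    term = term.lower().strip()
    expanded = {term}
    expanded |= SYNONYMS.get(term, set())                                  # synonymy
    expanded |= NARROWER.get(term, set())                                  # hierarchy
    expanded |= set(difflib.get_close_matches(term, KNOWN_TERMS, cutoff=0.8))  # spelling relaxation
    return expanded

print(expand_query("heart attack"))
print(expand_query("myocardal infarction"))   # misspelling still matches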
What about lack of peer review?
• Large social experiment—results unknown
• On the other hand, industry (e.g., GSK)
results databases have been around
– Nissen—rosiglitazone
• Peer review is not a panacea
Published Results
• Search of PubMed Secondary Source ID
[SI] field by NCT number from 248 records
with results
– Identified 10 citations (NCT # in Abstract)
• Manual search of 79 records with results
in PubMed (without NCT numbers)
– Identified 2 citations (no NCT # in Abstract)
48
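The PubMed search described above can be reproduced with NCBI's E-utilities. The sketch below is one way it might be done, using the [si] field tag for Secondary Source ID; the NCT number shown is simply the JUPITER registry number cited later in this talk.

import json
import urllib.parse
import urllib.request

# Sketch: search PubMed's Secondary Source ID ([si]) field for an NCT number
# via the NCBI E-utilities esearch service.
def pubmed_ids_for_nct(nct_id):
    params = urllib.parse.urlencode({
        "db": "pubmed",
        "term": f"{nct_id}[si]",   # Secondary Source ID field
        "retmode": "json",
    })
    url = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?" + params
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return data["esearchresult"]["idlist"]

if __name__ == "__main__":
    print(pubmed_ids_for_nct("NCT00239681"))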
Conclusions: Paroxetine is generally well tolerated and effective for major depression in adolescents.
Publication Bias
Source: Figure 1A. Turner et al. (NEJM, 2008)
Source: Silverstein FE et al. JAMA. 2000 Sep 13;284(10):1247-55.
Kaplan-Meier estimates for ulcer complications according to traditional
definition. Results are truncated after 12 months, no ulcer complications
occurred after this period. Adapted from Lu 2001.
Source: Jüni P, Rutjes AW, Dieppe PA. BMJ. 2002 Jun 1;324(7349):1287-8.
Lee K, Bacchetti P, Sim I. PLoS Med. 2008;5(9): e191
Zarin DA, Tse T. Medicine. Moving toward transparency of clinical trials. Science. 2008 Mar 7;319(5868):1340-2.
3. Who is the Audience?
Is it possible to create “standalone” tables that are informative?
Quality Assurance Challenges
• Data tables will be the public
representation of the study—must be clear
and informative;
• NLM QA Focuses on:
– Apparent Validity (when possible)
– Meaningful Entries
– Internal consistency/logic
– Format
Common Quality Concerns
• Reporting of percentage without reporting absolute numbers
• Improper use of terms
– Incidence
– Proportion and Ratio
– Frequency
• Reporting a change—lack of specificity
– Subtraction: minuend vs. subtrahend
– Ratio: numerator vs. denominator
• Complicated outcomes that cannot be understood
58
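As a rough illustration of the kind of automated screening that could flag some of these concerns (the checks and field names here are hypothetical, not NLM's actual QA process):

# Hypothetical screening checks, loosely modeled on the concerns above.
def screen_measure(row):
    """Return a list of warnings for one submitted data row."""
    warnings = []
    for arm, value in row["values"].items():
        if isinstance(value, str) and "%" in value:
            warnings.append(f"{arm}: value '{value}' contains '%'; a numeric entry is required, "
                            "and percentages should be accompanied by absolute numbers")
    if "change" in row["title"].lower() and "from" not in row["title"].lower():
        warnings.append("Reports a change without specifying which value was subtracted from which")
    return warnings

row = {"title": "Use of Community Health Resources",
       "values": {"Early Discharge": "4.4%", "Standard Discharge": "10.5%"}}
for w in screen_measure(row):
    print(w)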
Where is the Quality Line?
Domains of Quality:
• Quality of Entries: Not Meaningful ↔ Meaningful
• Scope of Entries: Minimal ↔ Comprehensive
• QA Staff Resources
59
Registration and Results Data
Must be Consistent
• Participant Flow Numbers and Enrollment
• Study Design and Results Tables
– Number of Arms
BEFORE Revision (Public View)
[Annotation: Actual enrollment (229) displayed in the protocol section does not match total number started in the basic results section (220 + 211 = 431)]
Summary Protocol Section:
– Actual Enrollment: 229
– Study Start Date: June 2006
– Study Completion Date: October 2007
– Primary Completion Date: October 2007 (Final data collection date for primary outcome measure)
Basic Results Section:
Participant Flow: Initial Treatment
                    Placebo   Drug X
STARTED               220       211
COMPLETED             218       210
NOT COMPLETED           2         1
61
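A consistency check of this kind is easy to express mechanically. A minimal sketch, with hypothetical field names and the numbers taken from the example above:

# Sketch: total STARTED across arms in the first participant-flow period
# should equal the Actual Enrollment reported in the protocol section.
def check_enrollment(actual_enrollment, started_by_arm):
    total_started = sum(started_by_arm.values())
    if total_started != actual_enrollment:
        return (f"Mismatch: Actual Enrollment is {actual_enrollment}, but participant "
                f"flow reports {total_started} started ({started_by_arm})")
    return "Consistent"

print(check_enrollment(229, {"Placebo": 220, "Drug X": 211}))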
Table Structure Must be
Logical
BEFORE Revision (Public View)
Measured Values                              Drug X, Week 10   Drug X, Change from Week 10 to 18
Number of Participants Analyzed                     88                  80
Treatment Satisfaction Questionnaire After
18 Weeks of Treatment [units: Score]
Mean ± Standard Deviation                       81 ± 17.46          7.9 ± 12.16
[Annotation: Inconsistency between columns and rows: Measure “at week 10” and Measure “after 18 weeks of treatment”]
63
Data Must Make Sense
• Outcome Measure Name, Description,
Units and Data are Compatible
BEFORE Revision (Public View)
Measured Values                              Intervention X   Control
Number of Participants Analyzed                    28            27
Hours Per Day of Sleep
[units: Average Hours per Day]
Mean ± Standard Deviation                       823 ± 92      864 ± 106
[Annotation: Inconsistency between Units of Measure, “average hours per day,” and Measure Data: value provided is greater than the total number of hours in a day]
65
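A simple range check against the stated unit would catch entries like this. The sketch below hard-codes one illustrative rule (hours per day cannot exceed 24); it is not any real NLM validation logic.

# Illustrative plausibility check: a value in "hours per day" cannot exceed 24.
UNIT_BOUNDS = {"average hours per day": (0, 24)}

def check_unit_plausibility(units, value):
    bounds = UNIT_BOUNDS.get(units.lower())
    if bounds and not (bounds[0] <= value <= bounds[1]):
        return f"{value} is outside the plausible range {bounds} for units '{units}'"
    return "OK"

print(check_unit_plausibility("Average Hours per Day", 823))   # value from the example above
print(check_unit_plausibility("Average Hours per Day", 7.4))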
BEFORE Revision (Public View)
Secondary Outcome Measure: Use of Community Health Resources
– Measure Type: Secondary
– Measure Name: Use of Community Health Resources
– Measure Description: Evaluation of visits to primary care pediatrician, hospital emergency and rehospitalization
– Time Frame: Up to 3 months after discharge
– Safety Issue: No
[Annotation: Implies number of health resources used – how was it measured?]
Measured Values                              Early Discharge   Standard Discharge
Number of Participants Analyzed                     90                86
Use of Community Health Resources
[units: Number]                                    4.4%              10.5%
[Annotations: Data are inconsistent: percentages of what? Invalid entry: needs to be numerical (cannot include “%”)]
66
BEFORE Revision (Public View)
Secondary Outcome Measure: Frequency and Magnitude of Antibody Response
– Measure Type: Primary
– Measure Name: Frequency and Magnitude of Antibody Response
– Measure Description: Nasal secretions to Virus A/12 and B/14. Antibody Response: Three-fold increase after immunization
– Time Frame: Visit 3 (Week 15)
– Safety Issue: Yes
[Annotations: Same unit cannot represent measures of “frequency” and “magnitude”; “three-fold increase” may mean “three-fold or greater increase”]
Measured Values                                    Vaccine, Low Dose   Vaccine, High Dose
Number of Participants Analyzed                           35                  34
Frequency and Magnitude of Antibody Response
[units: Participants]                                     17                  21
[Annotation: “Participants” is not a unit of measure for “frequency” or “magnitude.” Best to provide both categories for a dichotomous measure: < 3x increase; ≥ 3x increase]
67
Tables Must Be Informative
• Scales should include:
– Full name
– Construct or domain (e.g., pain)
– Direction of scores (Best/Worst Value)
– Other information as necessary
• Measures Have Useful Descriptions
• Avoid Abbreviations
BEFORE Revision (Public View)
[Annotations: Need information about this scale (Full Name; Construct/domain; Range and directionality). Need information about these values (e.g., is “0” better or worse than “2”?). Are these the only possible scores?]
Baseline Measures                        Investigational Drug X
GOG Performance Status [units: Score]
  0                                                48
  1                                                27
  2                                                 4
[Annotation: Need to change units to “participants” – data represent “number of participants” with a particular score]
69
BEFORE Revision (Public View)
Secondary Outcome Measure: Duration (Days)
– Measure Type: Secondary
– Measure Name: Duration (Days)
– Measure Description: Extent of Exposure for All Treated Subjects
– Time Frame: Duration of Study
– Safety Issue: No
[Annotations: Needs description: Duration of what? Needs Arm Label: What is the intervention?]
Measured Values                              Open Label
Number of Participants Analyzed                 403
Duration (Days) [units: Days]
Mean ± Standard Deviation                  195.5 ± 43.87
Measure Information Must be
Precise and Accurate
• Avoid misuse of terms, e.g.,
– proportion
– ratio
– incidence
• State what is being measured and how
– Do not provide results in measure description
field
BEFORE Revision (Public View)
Primary Outcome Measure: Proportion of Patients with Controlled SBP
– Measure Type: Primary
– Measure Name: Proportion of Patients with Controlled SBP
– Measure Description: Controlled SBP defined as SBP < 130 mmHg
– Time Frame: Baseline to 12 weeks
– Safety Issue: No
[Annotations: Spell out acronym; reported values are counts of participants, not a proportion]
Measured Values                                 Drug X   Drug Y   Drug X + Y
Number of Participants Analyzed                   351      361       384
Proportion of Patients with Controlled SBP
[units: Participants]                             186      135       287
BEFORE Revision (Public View)
Primary Outcome Measure: Change in Sitting DBP From Baseline to End of Study
– Measure Type: Primary
– Measure Name: Change in Sitting DBP From Baseline to End of Study
– Measure Description: Change in Sitting DBP
– Time Frame: Baseline to 12 weeks
– Safety Issue: No
[Annotations: Spell out acronym; specify calculation details: which value was subtracted from which?]
Measured Values                                       Drug X       Drug Y      Drug X + Y
Number of Participants Analyzed                         351          361          384
Change in Sitting DBP From Baseline to End of Study
[units: mmHg]
Least Squares Mean ± Standard Error                 -8.4 ± 0.2   -6.7 ± 0.2   -11.2 ± 0.3
BEFORE Revision (Public View)
Secondary Outcome Measure: To Compare Drug X and Drug Y for Efficacy
– Measure Type: Secondary
– Measure Name: To Compare Drug X and Drug Y for Efficacy
– Measure Description: (none provided)
– Time Frame: 4 months
– Safety Issue: No
[Annotations: State what is being measured, not the purpose; needs description: what is being measured and how?]
Data in All Tables Must be
Internally Consistent and Logical
• Participants must “flow”
• “Number analyzed” must be consistent
with participant flow data
• Avoid Illogical Entries
BEFORE Revision (Public View)
Participant Flow: First Period
                    Placebo   Drug X
STARTED               301       299
COMPLETED             291       285
NOT COMPLETED          10        14
Participant Flow: Second Period
                    Placebo   Drug X
STARTED               298       290
COMPLETED             288       278
NOT COMPLETED          10        12
[Annotation: Number of participants STARTED in second period of Participant Flow needs to be the same as numbers COMPLETED in the first period]
76
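The same rule can be checked mechanically. A minimal sketch, using the numbers from this example and a hypothetical data layout:

# Sketch: participants STARTED in each arm of period N+1 should equal the
# participants COMPLETED in that arm in period N.
def check_flow_continuity(periods):
    problems = []
    for i in range(len(periods) - 1):
        completed, started_next = periods[i]["COMPLETED"], periods[i + 1]["STARTED"]
        for arm in completed:
            if completed[arm] != started_next.get(arm):
                problems.append(f"Period {i + 1} -> {i + 2}, {arm}: {completed[arm]} completed "
                                f"but {started_next.get(arm)} started the next period")
    return problems or ["Consistent"]

periods = [
    {"STARTED": {"Placebo": 301, "Drug X": 299}, "COMPLETED": {"Placebo": 291, "Drug X": 285}},
    {"STARTED": {"Placebo": 298, "Drug X": 290}, "COMPLETED": {"Placebo": 288, "Drug X": 278}},
]
for msg in check_flow_continuity(periods):
    print(msg)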
BEFORE Revision (Public View)
Measured Values                              Intervention X   Control
Number of Participants                             28            27
Hours Per Day of Sleep
[units: Average Hours per Day]
Mean ± Standard Deviation                       823 ± 92      864 ± 106
[Annotation: Inconsistency between Units of Measure, “average hours per day,” and Measure Data: value provided is greater than the total number of hours in a day]
77
Statistical Analyses
• Must be Logical
• Compatible with Data
• Informative (report informative metrics)
BEFORE Revision (Public View)
Measured Values                              Early Discharge   Standard Discharge
Number of Participants                             100                100
Parental Stress
[units: Points on a Likert Scale]
Mean ± Standard Deviation                       9.3 ± 1.2          7.8 ± 2.1
Statistical Analysis 1 for Parental Stress
– Groups: Early Discharge vs. Standard Discharge
– Method: ANOVA
– P-Value: 0.05
– Mean Difference (Net): 9
[Annotation: Inconsistency between Measure Data and Method of Estimation – Reported Mean Difference: “9”; By Inspection: 9.3 – 7.8 = 1.5]
79
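A sanity check like the “by inspection” comparison above can also be automated. A rough sketch, with an arbitrarily chosen tolerance:

# Sketch: compare a reported between-group mean difference with the difference
# implied by the reported group means. Tolerance is arbitrary.
def check_mean_difference(group_means, reported_difference, tol=0.5):
    values = list(group_means.values())
    implied = round(values[0] - values[1], 2)
    if abs(implied - reported_difference) > tol:
        return (f"Reported difference {reported_difference} is inconsistent with "
                f"group means {group_means} (implied difference {implied})")
    return "Consistent"

print(check_mean_difference({"Early Discharge": 9.3, "Standard Discharge": 7.8},
                            reported_difference=9))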
BEFORE Revision (Public View)
Secondary Outcome Measure: Time to Relapse of a Mood Episode
– Measure Type: Secondary
– Measure Name: Time to Relapse of a Mood Episode
– Measure Description: (none provided)
– Time Frame: 24 months
– Safety Issue: No
[Annotation: Needs description]
Measured Values                              Placebo           Investigational Drug X
Number of Participants                         148                    153
Time to Relapse of a Mood Episode
[units: Days]
Median (Inter-Quartile Range)             219 (83 to NA)          NA (173 to NA)
[Annotation: Invalid entry (“NA”)]
80
Who is the Audience?
PI and Clinical Research Team (You!)
Other Medical Researchers in same field
[Study Sponsor]
Other Medical Researchers in other fields
Other Readers of the medical literature
Science Writers
Lay Public (readers of consumer health literature)
Lessons Learned from Early
Submissions of Basic Results
• Many iterations with the QA staff are
necessary to reach minimal quality
standards and to correct serious flaws
• Data Providers must be able to
understand the study design and data
analysis
– Typically, the investigator and a
statistician will need to be involved
82
4. What Have We Learned So
Far?
“Studies” vary in size!!
• 1 Primary Outcome Measure
– 1 Statistical Analysis
• 13 Secondary Outcome Measures
– 147 Statistical Analyses
• 31 Other Pre-specified Outcome
Measures
– 351 Statistical Analyses
84
Primary Outcome Measure
• Sum of Sore Throat Pain Intensity
Difference (SPID2) on Swallowing at 2
hours Post-First Dose
85
Sample Secondary
Outcome Measures
• Sore Throat Pain Intensity Difference (PID)
Within 6 hours Post-First Dose
• Sum of Sore Throat Pain Intensity Difference
(SPID) Within 6 hours Post-First Dose
• Sore Throat Relief Rating Scale (STRRS) Within
6 hours Post-First Dose
• Sore Throat Relief Rating Scale (STRRS) 'Moderate Relief' at 6 hours Post-First Dose
• Patient's Global Evaluation of Study Medication
at 6 hours Post-First Dose
86
Sample Other Pre-specified
Outcome Measures
• Throat Soreness Scale (TSS) Difference Within 6 hours Post-First Dose
• Sum of Throat Soreness Scale (TSS) Difference Within 6 hours Post-First Dose
• Difficulty Swallowing Scale (DSS) Difference Within 6 hours Post-First Dose
• Sum of Difficulty Swallowing Scale (DSS) Difference Within 6 hours Post-First Dose
• Difficulty Swallowing Scale (DSS) Difference at Least 50% Gone at 6 hours Post-First Dose
• Subjects with >= 50% Total Pain Relief (TOTPAR) at 6 hours Post-First Dose
• Number Needed to Treat (NNT) to Achieve at Least 50% of Maximum Total Pain Relief (TOTPAR) at 6 hours Post-First Dose
• Subjects with Sore Throat Pain at Least 35% Gone and at Least 50% Gone at 2 and 6 hours Post-First Dose
• Subjects who Achieved Their Own Level of 'Meaningful Relief' Within 6 hours who had Perceptible Relief Onset Time Within 1 hour
87
29 Posted Outcome Measures
88
Interesting Findings to Date
• Large numbers of submitted Outcome
Measures and Statistical Analyses
• Power of Defaults (e.g., “Baseline
Measures”)
– Age > 65
– Race and Ethnicity
– Region of Enrollment
• Problems with imprecise entries
89
Study-Specific Measures
Characteristics
(n = 87 submitted records with results)
• Total: 134 Measures
• Mean: 1.5 Measures per Record
• Range: 0 to 16
• Examples
– Baseline LDL-C strata
– Daytime asthma symptom score
– Glomerular filtration rate
– Maternal body mass index
– Type of allergic rhinitis
Distribution of Measure Types
for Study-Specific Measures
(n = 87)
• 42 Number
• 78 Mean
• 14 Median
• 0 Least Squares Mean
• 0 Geometric Mean
• 0 Log Mean
Distribution of Measure of Dispersion
for Study-Specific Measures
(n=87)
• 42 Not Applicable
• 73 Standard Deviation
• 2 Inter-Quartile Range
• 17 Full Range
Distribution of Measure of Dispersion
for All Outcome Measures
(n=87)
• 149 Not Applicable
• 169 Standard Deviation
• 5 Inter-Quartile Range
• 20 Full Range
• 37 Standard Error
• 59 95% Confidence Interval
Statistical Analysis
Data Structure
Method - Choices
• ANCOVA
• ANOVA
• Chi-squared
• Chi-squared, Corrected
• Cochran-Mantel-Haenszel
• Fisher Exact
• Kruskal-Wallis
• Log Rank
• Mantel Haenszel
• McNemar
• Mixed Models Analysis
• Regression, Cox
• Regression, Linear
• Regression, Logistic
• Sign Test
• t-Test, 1-sided
• t-Test, 2-sided
• Wilcoxon (Mann-Whitney)
• Other
“Other” Statistical Test Methods
• Percent of ITT subjects
• LL of 97.5% CI of difference
• Kaplan Meier
• Rank-Sum Test
• Student's t-test
• binomial distribution
• Linear mixed model
• t-Test, Paired
• 2-sided Log Rank test
• General Linear Model
• McNemar-Bowker
• Nonparametric ANCOVA
• Longitudinal Data Analysis (LDA)
• One proportion binomial exact test
• Wilcoxon Rank-Sum
Estimated Parameter - Choices
• Cox Proportional Hazard
• Hazard Ratio (HR)
• Hazard Ratio, log
• Mean Difference (Final Values)
• Mean Difference (Net)
• Median Difference (Final Values)
• Median Difference (Net)
• Odds Ratio (OR)
• Odds Ratio, log
• Risk Difference (RD)
• Risk Ratio (RR)
• Risk Ratio, log
• Slope
• Other
“Other” Parameter Names
• Mean
• Mean Response Ratio
• Difference in cumulative probability
• Kaplan Meier Estimate
• Cumulative probability
• LS mean difference of SPID48
• Least Square Mean Change Difference
• Proportion of patients achieving control
• Geometric least-squares mean ratio
• Rate Difference
• Percentage
• Percentage of patients castrated
• Clopper-Pearson
Distribution of Estimation Parameter
Method for All Statistical Analyses
(n=87 records)
• 0 Cox Proportional Hazard
• 2 Hazard Ratio (HR)
• 0 Hazard Ratio, log
• 63 Mean Difference (Final Values)
• 2 Mean Difference (Net)
• 3 Median Difference (Final Values)
• 0 Median Difference (Net)
• 2 Odds Ratio (OR)
• 0 Odds Ratio, log
• 1 Risk Difference (RD)
• 2 Risk Ratio (RR)
• 0 Risk Ratio, log
• 0 Slope
• 99 Other
Statistical Analyses
Characteristics
(n = 87 submitted records with results)
• Total: 305 Analyses
• Per Record
– Mean: 3.5
– Median: 0
– Range: 0 – 40
• Per Outcome Measure
– Mean: 0.8
– Median: 0
– Range: 0 – 20
Distribution of Statistical Test Method
for All Statistical Analyses
(n=87 records)
• 55 ANCOVA
• 2 ANOVA
• 4 Chi-squared
• 0 Chi-squared, Corrected
• 12 Cochran-Mantel-Haenszel
• 12 Fisher Exact
• 2 Kruskal-Wallis
• 9 Log Rank
• 0 Mantel Haenszel
• 1 McNemar
• 10 Mixed Models Analysis
• 0 Regression, Cox
• 0 Regression, Linear
• 2 Regression, Logistic
• 0 Sign Test
• 3 t-Test, 1-sided
• 0 t-Test, 2-sided
• 4 Wilcoxon (Mann-Whitney)
• 64 Other
Posted Basic Results: Certain Agreements
(as of 1/12/09)
• 8 PIs are Employees
• 36 PIs are Not Employees
– 15 No Agreement
– 21 Has Agreement
• 7 Embargo ≤ 60 days
• 7 Embargo > 60 days, but ≤ 180 days
• 7 Other*
* “The PI shall have the right to publish the results of research to include in any
publication. The PI will provide the Sponsor with at least sixty (60) days for review
of a manuscript. No paper that incorporates Sponsor Confidential Information will
be submitted for publication without Sponsor’s prior written agreement.”
104
5. Issues To Be Considered
• Display
• Content
Display
• Tracking and Displaying Changes
• Displaying Scale Information from a “data
bank”
• Links to Systematic Reviews and other
“authoritative information”
• Links to Consumer Health Information
Public View and Changes to
Records—Current Policy
• Default public view is always most recent
entry in registry and results records
• Records are never “frozen”
– Data provider can change entries at any time
• Post “First received” and “last updated”
• Archive site “tracks changes”
• Concerns:
– Not all users understand what they are seeing
– May not be best policy
Issues
• Distinguish between “freezing” data and
default display
• Registry issues
– Different users/uses
• Recruitment of participants
• Scientific integrity
• Results issues
– New data may be added
– Existing data may be improved, changed or
deleted
108
Data Element “Permanence”
• Changes Rarely
– Sponsor
– Unique Protocol ID
– Study Start Date
– Condition & Intervention
– Study Design
• May Change
– Outcome Measures
• Expected to Change
– Recruitment Status, Overall and by Location
– Enrollment (expected → actual)
– Primary Completion Date (expected → actual)
Factors leading to changes in data
elements
• Must keep data up to date, including
location and recruitment status
• Evolving policies, database structure
• QA requests for changes
• Correction of errors
• Added specificity
• On the other hand:
– Requirement to report outcome measures
– Requirement re: completion date
110
Example: Change in
Recruiting Status
History of NCT00239681
(The JUPITER Trial)
• 11-13-08: Changed Overall Status from
“Active, not yet completed” to “Terminated”
• 11-16-08: Changed Overall Status from
“Terminated” to “Completed”
• 02-03-09: Changed Overall Status from
“Completed” to “Terminated”
Last Update
Tracking Outcome Measures
• How to distinguish between additions,
deletions, and modifications to primary
and secondary outcome measures?
• How to handle changes that transform one
general measure to several more specific
measures?
Challenge of Tracking Outcome
Measures
ORIGINAL SUBMISSION → UPDATED SUBMISSION
• Blood Pressure (BP) at 3 mos. → Systolic BP (SBP) at 3 mos.; Diastolic BP (DBP) at 3 mos.; Change in SBP at 3 mos. to baseline; Change in DBP at 3 mos. to baseline; Percent Change in SBP…; Percent Change in DBP…; ETC.
• Blood Pressure (BP) at 3 mos. → Diastolic BP (DBP) at 12 weeks; DBP at 9 mos.; DBP at 12 mos.
• Blood Pressure (BP) at 3 mos. → Quality of Life at 6 mos.
Example from Current Record
BEFORE
• Effectiveness of the SAGB post-operatively (for the follow-up period)
• Rate of device-related adverse events and SAGB malfunctions in subjects implanted with the SAGB throughout the post-operative follow-up period.
• Changes in excess body weight throughout post-operative follow-up period
• Change in body mass index (BMI) throughout the post-operative follow-up period
• Absolute weight loss and percent change in absolute weight throughout the post-operative follow-up period
• Change in Quality of Life measures throughout the post-operative follow-up period
• Changes in specific laboratory test values throughout the post-operative follow-up period
• Rate of all adverse events in subjects implanted with the SAGB throughout the post-operative follow-up period.
AFTER
• Percent Excess Weight Loss
• Percent of Adverse Events With Swedish Adjustable Gastric Band (SAGB)
• Changes in Quality of Life (QOL) Measures
• Changes in Excess Body Weight (EBW)
• Changes in Total Cholesterol [?]
• Changes in Body Mass Index (BMI)
• Change in Absolute Weight
• Number of All Adverse Events of Subjects Implanted With the SAGB
[Callouts ("?") on the slide mark measures whose correspondence between the BEFORE and AFTER lists is unclear]
http://clinicaltrials.gov/archive/NCT00166205/2008_12_10/changes
Other Issues Related to
Outcome Measures
• What is a “primary” outcome measure?
– Can there be 5? 10?
• What is a “secondary” outcome measure?
– Can there be 20? 50?
• Where does “outcome measure” end and
“analytic plan” begin?
• Can the same display be used for the lay
public (recruitment) and to ensure
scientific integrity?
119
Proposed Change
• Add a “box” in tabular view that highlights
key facts about the record
• For each data element, indicate whether
or not it’s been changed
• Provide link to “history of changes” at each
data element
• Make it easier to access your preferred
view
Summary Tracking Information
– First Registered: October 14, 2005
– Study Start Date: February 2, 2003
– First Posted Outcome Measures: October 14, 2005
– Outcome Measures (First Posted):
  • Investigate whether long-term treatment with rosuvastatin compared with placebo will decrease the rate of major cardiovascular events
  • Investigate the safety of long-term treatment with rosuvastatin compared with placebo through comparisons of events
  • Investigate whether therapy with rosuvastatin reduces the incidence of diabetes mellitus, venous thromboembolic events, & the incidence of bone fractures.
– Last Changed Outcome Measures: No changes since initial registration
– Click here to see full history of the record
Bank of Scales
• Create a database of standard outcome
measures, scales and possibly
interventions
• Allow data providers to select from “bank”
when appropriate (e.g., Hamilton Rating
Scale for Depression)
• Option to add new scales
– Data provider specific “bank”
– Suggestions for additions to “general bank”
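One way such a bank entry might be structured, using the Beck Depression Inventory II description quoted on a later slide as the content; the record layout itself is a hypothetical sketch.

# Hypothetical layout for an entry in a "bank" of standard scales; the BDI-II
# content mirrors the description quoted on the "Scale Information" slide.
scale_bank = {
    "Beck Depression Inventory II": {
        "abbreviation": "BDI-II",
        "construct": "Severity of symptoms of depression",
        "items": 21,
        "item_range": (0, 3),
        "total_range": (0, 63),
        "direction": "Higher scores reflect greater severity",
        "reference": "Beck, Steer & Brown, 1996",
    }
}

def describe(scale_name):
    s = scale_bank[scale_name]
    return (f"{scale_name} ({s['abbreviation']}): {s['items']}-item scale measuring "
            f"{s['construct'].lower()}; total score {s['total_range'][0]} to {s['total_range'][1]}; "
            f"{s['direction'].lower()}.")

print(describe("Beck Depression Inventory II"))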
BEFORE Revision (Public View)
[Annotations: Need information about this scale (Full Name; Construct/domain; Range and directionality). Need information about these values (e.g., is “0” better or worse than “2”?). Are these the only possible scores?]
Baseline Measures                        Investigational Drug X
GOG Performance Status [units: Score]
  0                                                48
  1                                                27
  2                                                 4
[Annotation: Need to change units to “participants” – data represent “number of participants” with a particular score]
125
Scale Information
• Beck Depression Inventory II (Beck, Steer &
Brown, 1996) [NCT00056316]
– “21-item self-report instrument to assess
severity of symptoms of depression. There is a
four-point scale for each item ranging from 0 to
3. The total continuous score can range from 0
to 63 points, with higher scores reflective of
greater severity.”
• Source of information
– Always data provider?
– Other authoritative source(s)?
Possible Links
• FDA information
• Publications (use nct#s)
• Systematic reviews
– Same condition
– Same intervention (or intervention class)
– How to choose?
• Medline Plus?
• How to read a study?
127
Key Issues in
Expansion of Results Database
• Expand results reporting to trials of unapproved
products?
• Include narrative summaries? Can it be done
w/out being promotional and misleading?
– Technical
– Lay Language
• Data Quality Verification
– Process (e.g., Pilot Quality Control Project)
– External Sources
• Full protocol versus extract “necessary to help
evaluate the results”
Issues (cont)
• Results Submission
– New Submission Deadline: up to 18 months?
• Other
– Implementation of adverse event
requirements
• Threshold for non-serious?
– Definition of applicable clinical trial
Public Meeting
• “Expansion” of results database
• April 20, 2009
• NIH campus
• Please come!
130
Pilot Quality Control Study
• 12/243 results records appear to have one
or more journal publications
• 19/43 posted results summaries have
some entry at drugs@FDA
• What to use as reference standard for this
study?
131
Applicable Clinical Trials?
• Components involve devices and/or drugs
– ImageChecker DMax computer-aided detection
system, version 8.1 (Hologic/R2 Technology)
Gilbert FJ et al. N Engl J Med 2008;359:1675-84.
Additional Information
• Email LISTSERV and other FDAAA
information:
– http://prsinfo.clinicaltrials.gov/fdaaa.html
• Other general information:
– http://prsinfo.clinicaltrials.gov
• Questions?
– [email protected]
135
Finding Results at ClinicalTrials.gov
• From Homepage
– Go to “Search for Clinical Trials”
– Select “Advanced Search”
– Select “Studies with Results” from the menu
for the Study Results field
– Select study record from results list
– Click “Study Results” tab
136
http://prsinfo.clinicaltrials.gov/fdaaa.html
[Slides 137–141: screenshots]
FDAAA and Prior Publishing
142
ICMJE
“…will not consider results posted in the
same primary clinical trials register in which
the initial registration resides as previous
publication if the results are presented in the
form of a brief, structured (<500 words)
abstract or table.”
143
BMJ
“Firstly, disclosure will be a legal requirement, so
there is nothing editors can do about it if they still
want to publish important trials of drugs and
devices. Moreover, journals will continue to add
value by publishing useful and readable trial reports
that clinicians, the media, and patients can interpret
and use. And, most importantly, the results
disclosed for the FDA will not have been externally
peer reviewed and will be preliminary. Peer review
not only provides a stamp of quality assurance, it
often leads to reanalysis of results.”
144
PLoS
“supports the public disclosure of all clinical
trial results, as mandated for example by the
FDA Amendments Act, 2007. Prior disclosure
of results on a public website such as
clinicaltrials.gov will not affect the decision to
peer review or acceptance of papers in PLoS
journals.”
145