Verification of Tropical
Cyclone Forecasts
Beth Ebert (BOM)
Barb Brown (NCAR)
Laurie Wilson (RPN)
Tony Eckel (ERT)
8th TIGGE Working Group Meeting
22-24 February 2010, Geneva
What kind of TC forecasts?
Deterministic
• TC track
• Intensity
– maximum wind
– central pressure
– temporal trend (rapid intensification)
• Wind field / radii
• Precipitation
• Storm surge
• Temporal consistency
Ensemble
• Track distribution
– strike prob., cone of uncertainty
• Intensity distribution
– mean / median
– spread
– 90th percentile
• Prob (wind > threshold)
• Prob (precip > threshold)
• Prob (surge > threshold)
• Temporal consistency
Different users need different kinds of
verification information
• Public and emergency managers
– Simple, graphical
– Focus on impact
• Forecasters
– Information on how to interpret forecasts
– Timing errors
• Modellers
– Systematic errors
• How to improve the model
• How to improve ensemble distribution / spread
Quality of deterministic forecasts
• What are the track errors (along-track, cross-track)? (see the sketch after this list)
• What are the intensity errors?
• Are temporal intensity trends correctly predicted?
• What is the error in timing of landfall?
• What is the error in forecast maximum wind
(rain)?
– Multi-day total precipitation
• Is the spatial distribution of wind (rain) correct?
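One of the questions above concerns along-track and cross-track errors. The sketch below is a minimal illustration of one way to split a forecast position error into along-track and cross-track components, using a flat-earth approximation around the verifying observed position; the function names, sign conventions and coordinates are invented for illustration and this is not the operational NHC procedure.

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def local_xy(lat, lon, ref_lat, ref_lon):
    """Local east/north coordinates (km) of (lat, lon) relative to a
    reference point, using a simple equirectangular approximation."""
    x = EARTH_RADIUS_KM * np.radians(lon - ref_lon) * np.cos(np.radians(ref_lat))
    y = EARTH_RADIUS_KM * np.radians(lat - ref_lat)
    return np.array([x, y])

def track_error_components(fcst_pos, obs_pos, obs_prev_pos):
    """Split the forecast position error (relative to the verifying observed
    position) into along-track and cross-track parts, defined against the
    observed direction of motion.  Sign conventions are illustrative."""
    err = local_xy(*fcst_pos, *obs_pos)          # obs -> forecast vector (km)
    motion = local_xy(*obs_pos, *obs_prev_pos)   # previous obs -> current obs
    unit = motion / np.linalg.norm(motion)
    along = float(err @ unit)                    # > 0: forecast ahead of the storm
    cross = float(unit[0] * err[1] - unit[1] * err[0])  # > 0: left of observed motion
    return along, cross, float(np.linalg.norm(err))

# Invented positions: forecast vs. best-track fix and the previous best-track fix
along, cross, total = track_error_components(
    fcst_pos=(-15.2, 118.6), obs_pos=(-15.0, 118.0), obs_prev_pos=(-14.5, 119.0))
print(f"along-track {along:+.0f} km, cross-track {cross:+.0f} km, total {total:.0f} km")
```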
Quality of ensemble forecasts
• Does the ensemble enclose the observed track?
• Are the ensemble probabilities skillful and reliable predictions for
– strike probability (track)
– intensity (max wind, central pressure)
– wind
– precipitation
• Does the ensemble produce an appropriate
spread for these variables?
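To make the skill and reliability questions above concrete, here is a small sketch of the Brier score and a binned reliability table for probability forecasts of a binary event such as a strike within 120 km. It is a generic illustration with made-up probabilities and outcomes, not the TIGGE or any centre's verification code; only numpy is assumed.

```python
import numpy as np

def brier_score(prob, obs):
    """Brier score for probability forecasts of a binary event, e.g.
    'storm centre passes within 120 km of this location'."""
    prob, obs = np.asarray(prob, float), np.asarray(obs, float)
    return float(np.mean((prob - obs) ** 2))

def reliability_table(prob, obs, bins=np.linspace(0.0, 1.0, 11)):
    """Mean forecast probability vs. observed relative frequency in each
    probability bin; points near the 1:1 line indicate reliable forecasts."""
    prob, obs = np.asarray(prob, float), np.asarray(obs, float)
    which = np.digitize(prob, bins[1:-1])        # bin index for each forecast
    rows = []
    for k in range(len(bins) - 1):
        in_bin = which == k
        if in_bin.any():
            rows.append((float(prob[in_bin].mean()),
                         float(obs[in_bin].mean()),
                         int(in_bin.sum())))
    return rows   # (mean forecast prob, observed frequency, n cases)

# Made-up strike probabilities and binary outcomes for a handful of cases
p = np.array([0.10, 0.30, 0.80, 0.60, 0.05, 0.90, 0.40, 0.20])
o = np.array([0,    0,    1,    1,    0,    1,    0,    0])
print("Brier score:", round(brier_score(p, o), 3))
print("Reliability:", reliability_table(p, o))
```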
Data issues
• Verification data
– Data sources
• Best track, Dvorak, surface instruments, radar, …
– Problems measuring in extreme conditions
• Forecast data
– Size of ensemble
– Model grid resolution
– Details of cyclone tracker
• Reference forecasts
– Statistical forecast (e.g., CLIPER)
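The reference-forecast point can be illustrated with a simple percentage-improvement skill score of model track errors relative to a reference such as CLIPER. The sketch below uses invented error values and a generic definition; operational skill baselines (e.g. CLIPER5) may be defined slightly differently.

```python
import numpy as np

def percent_improvement(model_err, ref_err):
    """Percentage improvement of the model's mean error over a reference
    forecast such as CLIPER; positive means the model beats the reference."""
    model_err = np.asarray(model_err, float)
    ref_err = np.asarray(ref_err, float)
    return 100.0 * (ref_err.mean() - model_err.mean()) / ref_err.mean()

# Invented, homogeneous 72 h track errors (km) for the same set of cases
model_errors  = [210.0, 150.0, 320.0, 180.0]
cliper_errors = [340.0, 290.0, 410.0, 260.0]
print(f"Skill relative to CLIPER: {percent_improvement(model_errors, cliper_errors):.1f}%")
```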
Verification methods for deterministic
TC forecasts
• Example 1 – visual comparison
[Figure: track/intensity verification – analysis vs. model]
Courtesy Noel
Davidson, BOM
Verification methods for deterministic
TC forecasts
• Example 2 – along-track and cross-track errors
Courtesy James
Franklin, NHC
Verification methods for deterministic
TC forecasts
• Example 3 – cumulative distribution of track
errors
Courtesy James
Franklin, NHC
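A cumulative distribution like the one in Example 3 can be tabulated directly from the matched track errors. The short sketch below, with invented 48 h errors, shows one way to build the empirical CDF.

```python
import numpy as np

def empirical_cdf(errors):
    """Sorted errors and the cumulative fraction of cases at or below each value."""
    e = np.sort(np.asarray(errors, float))
    return e, np.arange(1, e.size + 1) / e.size

# Invented 48 h track errors (km)
errs, frac = empirical_cdf([95, 140, 60, 210, 180, 120, 75, 300])
for e, f in zip(errs, frac):
    print(f"<= {e:5.0f} km : {100 * f:5.1f}% of cases")
```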
Verification methods for deterministic
TC forecasts
• Example 4 – distribution of intensity errors
HFIP High-Resolution
Hurricane Test – DTC
Final Report Sept 2009
Verification methods for deterministic
TC forecasts
• Example 5 – rapid intensification
[Figure: AOML / WRF, 69 cases – count vs. hours since onset of observed RI, for the observed storms and the high- and low-resolution models]
HFIP High-Resolution
Hurricane Test – DTC
Final Report Sept 2009
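Verifying rapid intensification requires a concrete event definition. A commonly used criterion is an increase in maximum wind of at least 30 kt over 24 h; the sketch below applies that threshold to a 6-hourly intensity trace. The threshold, window and data are illustrative and may differ from the definition used in the HFIP test.

```python
import numpy as np

RI_THRESHOLD_KT = 30.0   # intensity increase over the window (a commonly used value)
RI_WINDOW_H = 24.0

def ri_onset_times(times_h, vmax_kt):
    """Times at which the max-wind increase over the following 24 h meets or
    exceeds the rapid-intensification threshold, from 6-hourly intensity data."""
    times_h = np.asarray(times_h, float)
    vmax_kt = np.asarray(vmax_kt, float)
    onsets = []
    for i, t in enumerate(times_h):
        j = np.searchsorted(times_h, t + RI_WINDOW_H)
        if j < times_h.size and vmax_kt[j] - vmax_kt[i] >= RI_THRESHOLD_KT:
            onsets.append(float(t))
    return onsets

# Invented 6-hourly intensity trace (kt); the same check can be run on a forecast trace
t = np.arange(0, 73, 6)
v_obs = np.array([35, 40, 45, 50, 55, 65, 80, 95, 105, 110, 110, 105, 100])
print("Observed RI criterion met at t =", ri_onset_times(t, v_obs), "h")
```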
Verification methods for deterministic
TC forecasts
• Example 6 – rain intensity distribution
Marchok et al. 2008
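Marchok et al. (2008) compare forecast and observed rainfall distributions among other diagnostics; the sketch below is a much-simplified stand-in that only compares percentiles of the wet-point rain amounts, using synthetic gamma-distributed data, and is not their actual method.

```python
import numpy as np

def wet_point_percentiles(rain_mm, percentiles=(50, 75, 90, 95, 99)):
    """Percentiles of rain amounts at wet points, a crude way to compare the
    shapes of the forecast and observed rain-intensity distributions."""
    wet = np.asarray(rain_mm, float)
    wet = wet[wet > 0.0]
    return {p: round(float(v), 1)
            for p, v in zip(percentiles, np.percentile(wet, percentiles))}

# Synthetic storm-total rainfall samples (mm) at matched grid points
rng = np.random.default_rng(1)
obs  = rng.gamma(shape=1.5, scale=30.0, size=5000)
fcst = rng.gamma(shape=1.5, scale=24.0, size=5000)
print("obs :", wet_point_percentiles(obs))
print("fcst:", wet_point_percentiles(fcst))
```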
Verification methods for ensemble TC
forecasts
• Example 1 – visual comparison
Verification methods for ensemble TC
forecasts
• Example 2 – probabilistic scores and methods
Fake example: MOGREPS 120 h forecast for strike probability (within 120 km), TC Anja, November 2009
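For context, a strike probability can be estimated as the fraction of ensemble members whose track passes within a given radius of a location. The sketch below uses that simplified definition with toy member tracks and a hypothetical site; operational definitions may differ in detail.

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def great_circle_km(lat1, lon1, lat2, lon2):
    """Great-circle distance (km) between two points (haversine formula)."""
    phi1, lam1, phi2, lam2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((phi2 - phi1) / 2) ** 2
         + np.cos(phi1) * np.cos(phi2) * np.sin((lam2 - lam1) / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

def strike_probability(member_tracks, site, radius_km=120.0):
    """Fraction of ensemble members whose track passes within radius_km of the
    site at any forecast time (a simplified strike-probability definition)."""
    hits = sum(
        any(great_circle_km(lat, lon, *site) <= radius_km for lat, lon in track)
        for track in member_tracks)
    return hits / len(member_tracks)

# Toy 3-member ensemble of (lat, lon) track points and a hypothetical site
tracks = [[(-16.0, 122.0), (-17.5, 120.5)],
          [(-16.2, 123.0), (-18.5, 121.8)],
          [(-15.8, 121.5), (-17.0, 119.5)]]
print("Strike probability:", strike_probability(tracks, site=(-18.0, 120.0)))
```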
Verification methods for ensemble TC
forecasts
• Example 3 – spread of track and intensity
forecasts
[Figure: track and intensity spread, 20-member FIM ensemble]
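The spread shown in this example can be summarised numerically, for instance as the standard deviation of member intensities and the mean distance of member positions from the ensemble-mean position at each lead time. The sketch below illustrates those two measures with a toy 5-member ensemble and flat-earth distances; it is not the FIM diagnostic itself.

```python
import numpy as np

def intensity_spread_kt(member_vmax_kt):
    """Ensemble intensity spread: sample standard deviation of member max winds."""
    return float(np.std(np.asarray(member_vmax_kt, float), ddof=1))

def track_spread_km(member_positions):
    """Track spread at one lead time: mean distance (km) of member positions
    from the ensemble-mean position, using a flat-earth approximation."""
    pts = np.asarray(member_positions, float)    # shape (n_members, 2) as (lat, lon)
    mean_lat, mean_lon = pts.mean(axis=0)
    dy = (pts[:, 0] - mean_lat) * 111.2                                  # km per deg lat
    dx = (pts[:, 1] - mean_lon) * 111.2 * np.cos(np.radians(mean_lat))   # km per deg lon
    return float(np.mean(np.hypot(dx, dy)))

# Toy 5-member ensemble at a single lead time
vmax = [85, 95, 90, 110, 100]                                            # kt
pos = [(-15.0, 118.0), (-15.4, 117.5), (-14.8, 118.6),
       (-15.9, 117.0), (-15.2, 118.2)]
print(f"intensity spread {intensity_spread_kt(vmax):.1f} kt, "
      f"track spread {track_spread_km(pos):.0f} km")
```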
New approaches for verifying TCs
• Spatial verification methods
– Precipitation and wind fields
– Storm characteristics
• location
• size
• intensity
• shape, etc.
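As a toy illustration of the storm-characteristics idea, the sketch below extracts a location (centroid), size and intensity for the region of a gridded wind field exceeding a gale-force threshold. The threshold, grid and field are invented, and this is not any particular spatial-verification package.

```python
import numpy as np

def storm_object_properties(wind, lat, lon, threshold_ms=17.5):
    """Location (centroid), size (grid points) and intensity (maximum) of the
    region where the wind field exceeds a gale-force threshold (m/s)."""
    wind = np.asarray(wind, float)
    mask = wind >= threshold_ms
    if not mask.any():
        return None
    return {"centroid": (float(lat[mask].mean()), float(lon[mask].mean())),
            "size_gridpoints": int(mask.sum()),
            "max_wind_ms": float(wind[mask].max())}

# Synthetic wind field (m/s) on a small lat/lon grid with a single vortex
lats, lons = np.meshgrid(np.linspace(-20, -10, 50),
                         np.linspace(115, 125, 50), indexing="ij")
wind = 40.0 * np.exp(-((lats + 15.0) ** 2 + (lons - 120.0) ** 2) / 4.0)
print(storm_object_properties(wind, lats, lons))
```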
New approaches for verifying ensemble
TC forecasts
• Minimum spanning tree
– multi-variate rank histogram
• Ensemble of object properties
– ensemble mean object properties
– distributions – use standard methods for ensembles and probability forecasts
– relationship of TC genesis to the number of ensemble members predicting the TC at day 1+?
– correspondence ratio
[Schematic: an observation and an ensemble forecast represented as spatial objects]
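The minimum-spanning-tree rank histogram mentioned above can be built case by case by comparing the MST length of the ensemble alone with the lengths obtained when the observation replaces each member in turn. The sketch below shows one common construction with toy random data; it assumes scipy is available and that the variables have already been scaled to comparable units.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

def mst_length(points):
    """Total edge length of the minimum spanning tree of a point set."""
    dist = squareform(pdist(np.asarray(points, float)))
    return float(minimum_spanning_tree(dist).sum())

def mst_rank(ensemble, obs):
    """Rank (1-based) of the ensemble-only MST length among the lengths obtained
    by substituting the observation for each member in turn.  Tallied over many
    cases, a flat histogram of these ranks suggests the observation behaves like
    one more ensemble member."""
    ensemble = np.asarray(ensemble, float)       # shape (n_members, n_variables)
    lengths = [mst_length(ensemble)]
    for i in range(ensemble.shape[0]):
        swapped = ensemble.copy()
        swapped[i] = obs
        lengths.append(mst_length(swapped))
    return int(np.argsort(np.argsort(lengths))[0]) + 1

# Toy case: 10 members of a 3-variable TC state (e.g. lat, lon, vmax), plus the obs
rng = np.random.default_rng(0)
members = rng.normal(size=(10, 3))
observation = rng.normal(size=3)
print("MST rank for this case:", mst_rank(members, observation))
```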
Reporting guidelines
• Provide all relevant information
– Model(s), grid, range of dates, forecast lead times,
verification data source, etc.
• Aggregation and distribution of results
• Confidence intervals / uncertainty
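For the confidence-interval item, a percentile bootstrap over cases is one widely used approach. The sketch below applies it to a mean track error with invented values; serially correlated cases (e.g. multiple forecasts of the same storm) would call for a block bootstrap or a similar adjustment.

```python
import numpy as np

def bootstrap_ci(values, stat=np.mean, n_boot=10000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for an aggregate verification
    statistic (e.g. mean track error), resampling cases with replacement."""
    values = np.asarray(values, float)
    rng = np.random.default_rng(seed)
    boot = [stat(rng.choice(values, size=values.size, replace=True))
            for _ in range(n_boot)]
    lo, hi = np.percentile(boot, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return float(stat(values)), float(lo), float(hi)

# Invented 72 h track errors (km) for ten cases
errors = [210, 150, 320, 180, 260, 140, 390, 220, 175, 295]
mean, lo, hi = bootstrap_ci(errors)
print(f"mean error {mean:.0f} km, 95% CI [{lo:.0f}, {hi:.0f}] km")
```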
Verification tools for TCs
• US Developmental Testbed Center (DTC)
– Tracker (Marchok)
– Code for verification of TC track and intensity
• Other international tool sets??
Document on TC verification – commented
literature review to be written this year
Contents
1. Introduction
2. Verification strategy
3. Reference data
4. Verification methods
5. Reporting guidelines
6. Summary
References
Appendices:
a. Brief description of scores
b. Guidelines for computing aggregate
statistics
c. Confidence intervals for verification
scores
d. Examples of graphical verification
products