Severe Weather Applications


Severe Weather Applications
David Bright
NOAA/NWS/Storm Prediction Center
[email protected]
AMS Short Course on
Methods and Problems of Downscaling
Weather and Climate Variables
January 29, 2006
Atlanta, GA
Where America's Climate and Weather Services Begin
Outline
• Overview of the Storm Prediction Center (SPC)
• Implicit downscaling and hazardous mesoscale
phenomena
– Parameter evaluation
• SPC ensemble diagnostics
Overview of the SPC: Mission
The Storm Prediction Center (SPC) exists
solely to protect life and property of the American people
through the issuance of timely and accurate watch and
forecast products dealing with hazardous mesoscale
weather phenomena.
Overview of the SPC
HAZARDOUS PHENOMENA
• Hail, Wind, Tornadoes
• Excessive rainfall
• Fire weather
• Winter weather
Overview of the SPC Products
• TORNADO & SEVERE THUNDERSTORM WATCHES
• WATCH STATUS MESSAGE
• CONVECTIVE OUTLOOK
• MESOSCALE DISCUSSION
• FIRE WEATHER OUTLOOK
• OPERATIONAL FORECASTS ARE BOTH DETERMINISTIC AND PROBABILISTIC
75% of all SPC products are valid for a period of < 24 h
Outline
• Overview of the Storm Prediction Center (SPC)
• Implicit downscaling and hazardous mesoscale
phenomena
– Parameter evaluation
• SPC ensemble diagnostics
Implicit Downscaling
• We don’t explicitly downscale at the SPC
• However, SPC forecasters implicitly incorporate
spatial and temporal downscaling
– Models are run at O(10 km) grid spacing
– Model output available at O(hours)
– Minimum grid spacing to resolve explicitly modeled
convection ~3 km
– Even if thunderstorms (and mesocyclones) are explicitly
modeled, severe phenomena (hail, wind, tornadoes)
occur at finer scales
• Idealized example…
(Idealized mesoscale model, ΔX ~ 10 km)
• A trough and associated cold front lie within the domain of the mesoscale model.
• A narrow region of pre-frontal convergence develops; at about 4 ΔX, it is only minimally resolved by the mesoscale model.
• Thunderstorms then develop within the pre-frontal convergence zone; at only 1 to 2 ΔX, they are not resolved by the mesoscale model.
The ability to predict phenomena in an NWP model is scale dependent. A grid point model:
• does not resolve wavelengths of ~1-3 ΔX
• minimally resolves wavelengths of ~4 ΔX
• fully resolves wavelengths of ~10 ΔX
SPC Downscaling and
Parameter Evaluation
• Today’s NWP models do not explicitly predict
most hazardous mesoscale phenomena of
interest to the SPC
• The human forecaster needs to understand interactions between the large-scale (well-resolved) environment and storm-scale (poorly resolved) phenomena
• Parameter evaluation (e.g., Johns and Doswell
1992)
Parameter Evaluation:
CAPE vs. Deep Layer Shear
[Figure: severe weather environments plotted in a parameter space of CAPE vs. deep-layer shear. Adapted from AMS Monograph Vol. 28, No. 50, p. 449]
Refined Parameter Investigations
A simple product of CAPE and shear
[Figure: distribution of the CAPE-shear product by storm class, contoured at the 10%, 25%, 50%, 75%, and 90% percentiles]
Gradual increase between classes, with discrimination between thunder, severe, and significant severe
A complex parameter space is evaluated for modern severe storm forecasting
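As a rough illustration of this kind of product-based parameter evaluation, the sketch below forms the CAPE-shear product and ranks it against a climatological sample for a given storm class (thunder, severe, or significant severe). The function names, units, and the idea of ranking against a user-supplied climatology are illustrative assumptions, not the SPC's operational procedure.

```python
import numpy as np

def cape_shear_product(cape_jkg, shear_kt):
    """Simple product of CAPE (J/kg) and deep-layer (0-6 km) shear (kt)."""
    return np.asarray(cape_jkg) * np.asarray(shear_kt)

def percentile_rank(product_value, climo_products):
    """Where a forecast CAPE-shear product falls (0-100%) within a
    climatological sample of products, e.g. proximity soundings for
    thunder, severe, or significant-severe events."""
    climo = np.sort(np.asarray(climo_products, dtype=float))
    return 100.0 * np.searchsorted(climo, product_value) / climo.size
```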
Outline
• Overview of the Storm Prediction Center (SPC)
• Implicit downscaling and hazardous mesoscale
phenomena
– Parameter evaluation
• SPC ensemble diagnostics
Example 1
• Basic Ensemble CAPE and Shear
Analysis
SREF Parameter Evaluation
CAPE (J/kg)
Green solid = Percent of members >= 1000 J/kg; shading >= 50%
Gold dashed = Ensemble mean (1000 J/kg)
F036: Valid 21 UTC 28 May 2003
• Probability of surface CAPE >= 1000 J/kg
  – Generally low in this case
  – Ensemble mean < 1000 J/kg (no gold dashed line)
SREF Parameter Evaluation
10 m – 6 km Shear (kts)
Green solid = Percent of members >= 30 kts; shading >= 50%
Gold dashed = Ensemble mean (30 kts)
F036: Valid 21 UTC 28 May 2003
• Probability of deep-layer shear >= 30 kts
  – Strong mid-level jet through Iowa
SREF Parameter Evaluation
3 Hour Convective Precipitation >= 0.01 (in)
Green solid = Percent of members >= 0.01 in; shading >= 50%
Gold dashed = Ensemble mean (0.01 in)
F036: Valid 21 UTC 28 May 2003
• Convection likely over WI/IL/IN
  – Will the convection become severe?
SREF Parameter Evaluation
Prob CAPE >= 1000 J/kg X Prob Shear >= 30 kts X Prob Conv Pcpn >= .01"
F036: Valid 21 UTC 28 May 2003
• Combined probabilities very useful
• Quick way to determine juxtaposition of key parameters
• Not a true probability
  – Not independent
  – Different members contribute
SREF Parameter Evaluation
Prob CAPE >= 1000 J/kg X Prob Shear >= 30 kts X Prob Conv Pcpn >= .01"
F036: Valid 21 UTC 28 May 2003
• Combined probabilities are a quick way to determine juxtaposition of key parameters
• Not a true probability
  – Not independent
  – Different members contribute
• Fosters an ingredients-based approach on the fly
Severe Reports: Red = Tor; Blue = Wind; Green = Hail
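As an illustration of how such a combined-probability diagnostic can be built, the sketch below multiplies, at each grid point, the fraction of ensemble members meeting each threshold (CAPE >= 1000 J/kg, deep-layer shear >= 30 kt, 3-h convective precipitation >= 0.01 in). The array names and shapes are assumptions for illustration; as noted above, the result is not a true probability because the exceedances are neither independent nor necessarily produced by the same members.

```python
import numpy as np

def exceedance_fraction(member_fields, threshold):
    """Fraction of ensemble members at or above a threshold at each grid
    point.  member_fields has shape (n_members, ny, nx)."""
    return (np.asarray(member_fields) >= threshold).mean(axis=0)

def combined_parameter_probability(cape, shear, conv_pcpn):
    """Gridded product of the three exceedance fractions; highlights where
    the key severe-weather ingredients are likely to be juxtaposed."""
    p_cape = exceedance_fraction(cape, 1000.0)       # J/kg
    p_shear = exceedance_fraction(shear, 30.0)       # kt
    p_pcpn = exceedance_fraction(conv_pcpn, 0.01)    # inches per 3 h
    return p_cape * p_shear * p_pcpn
```

With the SREF member fields stacked along the first axis, the returned 2-D field mimics the combined-probability chart shown above.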
Example 2
• Calibrated, Probabilistic Severe
Thunderstorm Guidance
Bright and Wandishin (Paper 5.5, 18th Conf. on Prob. and Statistics, 2006)
SVR WX ACTIVITY
12Z 11 May 2005 to 12Z 12 May 2005
a = Hail; w = Wind; t = Tornado
SREF 24h calibrated probability of a severe thunderstorm
F027 Valid 12 UTC 11 May 2005 to 12 UTC 12 May 2005
Example 3
• Calibrated, Probabilistic Cloud-to-Ground Lightning Guidance
Bright et al. (2005), AMS Conf. on Meteor. Appl. of Lightning Data
Essential Ingredients to Cloud
Electrification
• Identify what is most important and readily available
from NWP models
• From: Houze (1993); Zipser and Lutz (1994);
MacGorman and Rust (1998); Van Den Broeke et al.
(2004)
– Super-cooled liquid water and ice must be present
– Cloud top exceeds charge-reversal temperature zone
– Sufficient vertical motion in cloud from mixed-phase region
through the charge-reversal temperature zone
Combine Ingredients into Single Parameter
• Three first-order ingredients (readily available from NWP models):
  – Lifting condensation level temperature > -10 °C
  – Sufficient CAPE in the 0 °C to -20 °C layer
  – Equilibrium level temperature < -20 °C
• Cloud Physics Thunder Parameter (CPTP):
  CPTP = (-19 °C - T_EL)(CAPE_-20 - K) / K
  where K = 100 J kg^-1, T_EL is the equilibrium level temperature, and CAPE_-20 is the MUCAPE in the 0 °C to -20 °C layer
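A minimal sketch of the CPTP computation, assuming the equilibrium level temperature (°C) and CAPE_-20 (J/kg) have already been extracted from a model sounding; the function and argument names are illustrative.

```python
def cptp(t_el_degc, cape_m20_jkg, k=100.0):
    """Cloud Physics Thunder Parameter.

    t_el_degc    : equilibrium level temperature (deg C)
    cape_m20_jkg : MUCAPE in the 0 deg C to -20 deg C layer (J/kg)
    k            : scaling constant, 100 J/kg
    """
    return (-19.0 - t_el_degc) * (cape_m20_jkg - k) / k

# Denver, 00 UTC 4 June 2003:  T_EL ~ -50 C, CAPE_-20 ~ 450 J/kg
print(cptp(-50.0, 450.0))   # ~108 -> lightning favored (CPTP > 1)

# Vandenberg, 00 UTC 3 Jan 2004:  T_EL ~ -17 C, CAPE_-20 ~ 160 J/kg
print(cptp(-17.0, 160.0))   # ~-1.2 -> lightning unlikely (CPTP < 1)
```

These two calls reproduce the Denver and Vandenberg sounding examples that follow.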
Consider this Denver sounding from 00 UTC 4 June 2003
[Skew-T figure with the EL temperature, the 0 °C and -20 °C levels, the CAPE_-20 layer, and the LCL temperature annotated]
CPTP = (-19 °C - T_EL)(CAPE_-20 - K) / K
CAPE_-20 ~ 450 J kg^-1
T_EL ~ -50 °C
K = 100 J kg^-1
=> CPTP = 108
Operational applications are really only interested in CPTP > 1
Now consider this Vandenberg sounding from 00 UTC 3 Jan 2004
[Skew-T figure with the EL temperature, the 0 °C and -20 °C levels, the CAPE_-20 layer, and the LCL temperature annotated]
CPTP = (-19 °C - T_EL)(CAPE_-20 - K) / K
CAPE_-20 ~ 160 J kg^-1
T_EL ~ -17 °C
K = 100 J kg^-1
=> CPTP = -1.2
Although instability exists and models forecast convective pcpn, the warm equilibrium level (-17 °C) implies lightning is unlikely (CPTP < 0)
SREF Probability CPTP > 1
3 hr valid period: 21 UTC 31 Aug to 00 UTC 01 Sept 2004
15h Forecast Ending: 00 UTC 01 Sept 2004
Uncalibrated probability: Solid/Filled; Mean CPTP = 1 (Thick dashed)
SREF Probability Precip > .01”
3 hr valid period: 21 UTC 31 Aug to 00 UTC 01 Sept 2004
15h Forecast Ending: 00 UTC 01 Sept 2004
Uncalibrated probability: Solid/Filled; Mean precip = 0.01” (Thick dashed)
Joint Probability (Assume Independent)
P(CPTP > 1) x P(Precip > .01”)
3 hr valid period: 21 UTC 31 Aug to 00 UTC 01 Sept 2004
15h Forecast Ending: 00 UTC 01 Sept 2004
Uncalibrated probability: Solid/Filled
Uncalibrated Reliability
(5 Aug to 5 Nov 2004)
[Reliability diagram for P(CPTP > 1) x P(P03I > .01"): forecast probability bins (0%, 5%, ..., 100%) vs. observed frequency, with perfect-forecast, no-skill, and climatology reference lines]
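The calibration applied in the next figures amounts to remapping the uncalibrated joint probability onto the observed relative frequency found in each forecast-probability bin over a training period. The sketch below is a minimal version of that idea, assuming a history of past probability/observation pairs is available; the bin width, variable names, and interpolation step are illustrative, not the operational SPC procedure.

```python
import numpy as np

def build_calibration_table(fcst_probs, observed, bin_width=0.05):
    """Observed relative frequency of thunder in each forecast-probability
    bin (i.e., the reliability curve), estimated from a training sample.

    fcst_probs : 1-D array of past uncalibrated probabilities in [0, 1]
    observed   : 1-D array of 0/1 outcomes (e.g., an NLDN CG strike occurred)
    """
    fcst_probs = np.asarray(fcst_probs, dtype=float)
    observed = np.asarray(observed, dtype=float)
    n_bins = int(round(1.0 / bin_width))
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    idx = np.clip(np.digitize(fcst_probs, edges) - 1, 0, n_bins - 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    obs_freq = np.full(n_bins, np.nan)
    for b in range(n_bins):
        in_bin = idx == b
        if in_bin.any():
            obs_freq[b] = observed[in_bin].mean()
    return centers, obs_freq

def calibrate(raw_prob, centers, obs_freq):
    """Map new uncalibrated probabilities onto the observed frequencies."""
    good = ~np.isnan(obs_freq)
    return np.interp(raw_prob, centers[good], obs_freq[good])
```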
Calibrated Ensemble Thunder Probability
3 hr valid period: 21 UTC 31 Aug to 00 UTC 01 Sept 2004
15h Forecast Ending: 00 UTC 01 Sept 2004
Calibrated probability: Solid/Filled
Calibrated Ensemble Thunder Probability
3 hr valid period: 21 UTC 31 Aug to 00 UTC 01 Sept 2004
15h Forecast Ending: 00 UTC 01 Sept 2004
Calibrated probability: Solid/Filled; NLDN CG Strikes (Yellow +)
Calibrated Reliability
(5 Aug to 5 Nov 2004)
[Reliability diagram for the calibrated thunder probability: forecast probability bins (0%, 5%, ..., 100%) vs. observed frequency, with perfect-forecast, no-skill, and climatology reference lines]
Example 4
• Calibrated, Probabilistic Snowfall Accumulation on Roads Guidance
Goal: Examine the parameter space around the lower PBL temperature, ground temperature, and precipitation type, and calibrate using road sensor data.
• SREF probability predictors
  (1) Two precipitation-type algorithms
    • Baldwin algorithm in the NCEP post.
    • Czys algorithm applied in SPC SREF post-processing.
  (2) Two parameters sensitive to lower-tropospheric and ground temperature
    • Snowmelt parameterization: evaluates fluxes to determine if 3" of snow melts over a 3 h period.
    • Simple algorithm: a function of surface conditions, F(T_pbl, T_G, Q_sfc, net rad. flux)
Example: New England Blizzard (F42: 23 January 2005 03Z)
• SREF 32 °F isotherm of 2 m air temperature and SREF 32 °F isotherm of ground temperature, each showing the ensemble mean (dashed), the union (at least one SREF member at or below 32 °F, dots), and the intersection (all members at or below 32 °F, solid)
• 3 h probability of freezing or frozen pcpn (NCEP algorithm; uncalibrated)
• 3 h calibrated probability of snow accumulating on roads
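The mean, union, and intersection contours in these panels reduce to simple operations over the ensemble dimension. Below is a minimal sketch, assuming the member temperature fields are stacked in an (n_members, ny, nx) array; the array and function names are illustrative.

```python
import numpy as np

def freezing_summary(member_temps_f, threshold_f=32.0):
    """Ensemble summaries of sub-freezing conditions at each grid point.

    member_temps_f : array (n_members, ny, nx), temperature in deg F
    Returns the ensemble-mean field plus two boolean masks:
      union        -- at least one member at or below the threshold
      intersection -- all members at or below the threshold
    """
    t = np.asarray(member_temps_f, dtype=float)
    below = t <= threshold_f
    return {
        "mean": t.mean(axis=0),
        "union": below.any(axis=0),
        "intersection": below.all(axis=0),
    }
```

Applying this to both the 2 m air temperature and the ground temperature members gives the dashed, dotted, and solid contours shown for the blizzard and Washington, DC cases.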
Example: Washington, DC Area (F21: 28 February 2005 18Z)
• SREF 32 °F isotherm of 2 m air temperature and SREF 32 °F isotherm of ground temperature, each showing the ensemble mean (dashed), the union (dots), and the intersection (solid)
• 3 h probability of freezing or frozen pcpn (Baldwin algorithm; uncalibrated)
• 3 h calibrated probability of snow accumulating on roads
Verification
[Figures: reliability diagram and economic potential value]
Reliability diagram: all 3 h forecasts (F00 - F63); 35 days (Oct 1 - Apr 30)
Summary
• Downscaling of severe weather forecasts is largely implicit
• Human forecasters downscale by identifying associations between the large-scale environment and storm-scale hazards
• Objective downscaling plays an increasingly important role in providing initial forecast guidance