
Using a perturbed physics ensemble to make probabilistic climate projections for the UK
Isaac Newton workshop, Exeter
David Sexton
Met Office Hadley Centre
September 20th 2010
© Crown copyright Met Office
UKCP09: A 5-dimensional problem
• Variables and months
• Seven different timeframes
• 25km grid, 16 admin regions, 23 river-basins and 9 marine regions
• Three different emission scenarios
• Uncertainty, including information from models other than HadCM3
This is what users requested.
Why we cannot be certain…
• Internal variability (initial condition uncertainty)
• Modelling uncertainty
– Parametric uncertainty (land/atmosphere and ocean perturbed physics ensembles)
– Structural uncertainty (multimodel ensembles)
– Systematic errors common to all current climate models
• Forcing uncertainty
– Different emission scenarios
– Carbon cycle (perturbed physics ensembles)
– Aerosol forcing (perturbed physics ensembles)
Production of UKCP09 predictions
[Flowchart: other models, an equilibrium PPE and observations feed an equilibrium PDF; 4 time-dependent Earth System PPEs (atmosphere, ocean, carbon, aerosol) and a Simple Climate Model turn this into a time-dependent PDF; a 25km regional climate model downscales it to a 25km PDF for UKCP09.]
Stage 1: Uncertainty in equilibrium response
[Same flowchart as above.]
Perturbed physics ensemble
• There are plenty of different variants of the climate model (i.e. different values for the model input parameters) that are as good as, if not better than, the standard tuned version
• But their response can differ from that of the standard version
• Cast the net wide: explore parameter space with a view to finding pockets of good quality, and see what that implies for uncertainty
Perturbed physics ensemble
280 equilibrium runs, 31 parameters, varied within ranges elicited from experts.
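As an illustration only (the slides do not specify the sampling scheme), here is a minimal sketch of drawing parameter combinations within expert-elicited ranges, using Latin hypercube sampling; the bounds are hypothetical placeholders.

# Minimal sketch (not the UKCP09 code): sample 31 model parameters
# within expert-elicited ranges via Latin hypercube sampling.
# The (lo, hi) bounds are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_params, n_runs = 31, 280

lo = np.zeros(n_params)        # elicited lower bounds (placeholders)
hi = np.ones(n_params)         # elicited upper bounds (placeholders)

# One sample per stratum of each parameter, independently shuffled
strata = rng.permuted(np.tile(np.arange(n_runs), (n_params, 1)), axis=1).T
u = (strata + rng.random((n_runs, n_params))) / n_runs
samples = lo + u * (hi - lo)   # shape (280, 31)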
Probabilistic prediction of equilibrium response to doubled CO2
Stage 2: Time Scaling (Glen Harris and Penny Boorman)
[Same flowchart as above.]
Ensembles for other Earth System components
• 17 members of the atmosphere perturbed physics ensemble repeated with a full coupling between the atmosphere and a dynamic ocean
• Use the ocean, sulphur cycle and carbon cycle PPEs and multimodel ensembles to tune different configurations of the Simple Climate Model
Making time-dependent PDFs (see the sketch after this list)
• Sample a point in atmosphere parameter space
• Emulate the equilibrium response in climate sensitivity and the prediction variables, and calculate weights
• Sample ocean, aerosol and carbon cycle configurations of the Simple Climate Model
• Time-scale the prediction variables
• Adjust the weight according to how well the model variant reproduces large-scale temperature trends over the 20th century
• And repeat the sampling…
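A minimal sketch of this sampling loop, assuming hypothetical stubs for the emulator, Simple Climate Model and likelihood components (none of these stand-ins are the real UKCP09 code):

# Sketch of the sampling loop described above.
# Every helper is a hypothetical stub for the real component.
import numpy as np

rng = np.random.default_rng(0)

def sample_atmosphere_parameters():
    return rng.random(31)                      # step 1: point in parameter space

def emulate_equilibrium(x):
    response = x.sum()                         # stub: emulated equilibrium response
    weight = np.exp(-((x - 0.5) ** 2).sum())   # stub: observational weight
    return response, weight

def sample_scm_configuration():
    return rng.random(3)                       # step 3: ocean/aerosol/carbon config

def time_scale(eq_response, cfg):
    return eq_response * cfg.mean()            # step 4: stub time scaling

def trend_likelihood(ts_response):
    return np.exp(-abs(ts_response - 8.0))     # step 5: stub 20th-century trend fit

samples, weights = [], []
for _ in range(100_000):                       # step 6: repeat the sampling
    x = sample_atmosphere_parameters()
    eq, w = emulate_equilibrium(x)
    ts = time_scale(eq, sample_scm_configuration())
    samples.append(ts)
    weights.append(w * trend_likelihood(ts))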
Plume for GCM grid box over Wales
Stage 3: Downscaling (Kate Brown)
[Same flowchart as above.]
Dynamical downscaling
• For 11 of the 17 fully coupled ocean-atmosphere runs, use 6-hourly boundary conditions to drive the 25km regional climate model for 1950-2100
Adding information at 25km scale
• High resolution regional climate model projections are used to account for the local effects of coastlines, mountains, and other regional influences.
• They add skilful detail to large-scale projections from global climate models, but also inherit errors from them.
Stage 1: Uncertainty in equilibrium response
[Same flowchart as above.]
Bayesian prediction – Goldstein and Rougier 2004
• Aim is to construct the joint probability distribution p(X, m_h, m_f, y, o, d) of all uncertain objects in the problem:
– Input parameters (X)
– Historical and future model output (m_h, m_f)
– True climate (y_h, y_f)
– Observations (o)
– Model imperfections (d)
• Bayes Linear assumption, so all objects are represented in terms of means and covariances
Best-input assumption (Goldstein and Rougier 2004)
• The model is not perfect, so there are processes in the real system but not in our model that could alter the model response by an uncertain amount.
• We assume that one choice of values for our model's parameters, x*, is better than all others:

$$y = f(x^*) + d$$

where y is the true climate, f(x*) is the model output at the best choice of parameter values x*, and d is the discrepancy (d = 0 for a perfect model).
• But each combination of parameter values has a prior probability of being x*, so build an emulator (see the sketch below).
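A minimal sketch of the emulator idea, here using Gaussian process regression as an illustrative choice (this slide does not specify UKCP09's emulator form); trained on the ensemble runs, it predicts a model output, with uncertainty, at untried points in parameter space:

# Sketch of an emulator: learn a model output as a function of
# the input parameters from the ensemble runs, then predict
# (with uncertainty) anywhere in parameter space. The GP choice
# and the training data below are illustrative stand-ins.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X_train = rng.random((280, 31))            # parameter settings of the runs
y_train = X_train @ rng.random(31)         # stand-in for one model output

emulator = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                                    normalize_y=True).fit(X_train, y_train)

X_new = rng.random((10, 31))               # candidate parameter combinations
mean, std = emulator.predict(X_new, return_std=True)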
Weighting different model variants
• Use observations to weight towards higher quality parts of parameter space
• No verification or hindcasting is possible, so we are limited to this use of the observations
[Figure: emulated distributions for 10 different samples of combinations of parameter values.]
Large scale patterns of climate variations
[Figure: the first of six eigenvectors of observed climate used in the weighting; a small subset of climate variables is shown.]
Constraining parameters
Second way to constrain predictions with observations

$$\mu_{f|o} = \mu_f(x) + \Sigma_{fh}(x)\,[\Sigma_{hh}(x)]^{-1}\,(o - \mu_h(x))$$
$$\Sigma_{f|o} = \Sigma_{ff}(x) - \Sigma_{fh}(x)\,[\Sigma_{hh}(x)]^{-1}\,\Sigma_{hf}(x)$$

where \mu and \Sigma are the mean and total covariance (discrepancy + emulator + observed), the h and f subscripts denote historical and future, and x is the value of the parameters.
Some uncertainty about the future is related to uncertainty about the past, and is removed when values for the real world are specified.
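A minimal numerical sketch of this adjustment for one value of x; the mean and covariance blocks below are toy stand-ins for the emulated means and the total covariance:

# Sketch of the Bayes linear adjustment above for one value of x.
# mu_h, mu_f and the Sigma blocks are toy stand-ins for the emulated
# means and the total covariance (discrepancy + emulator + observed).
import numpy as np

mu_h = np.array([14.0, 2.5])          # emulated historical mean
mu_f = np.array([17.0])               # emulated future mean
S_hh = np.array([[1.0, 0.2],
                 [0.2, 0.5]])         # historical-historical covariance
S_fh = np.array([[0.6, 0.1]])         # future-historical covariance
S_ff = np.array([[2.0]])              # future-future covariance
o = np.array([14.8, 2.2])             # observations

K = S_fh @ np.linalg.inv(S_hh)
mu_f_given_o = mu_f + K @ (o - mu_h)  # adjusted expectation
S_f_given_o = S_ff - K @ S_fh.T       # adjusted (reduced) covariance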
Specifying the discrepancy
The method does not capture systematic errors that are common to all state-of-the-art climate models.
The largest discrepancy impact for UK temperature changes: Scotland in March
[Figure: example of a large shift in the PDF due to the mean discrepancy, indicating a bias in HadCM3 relative to other models.]
Discrepancy Term: Snow Albedo Feedback in Scotland in March
[Figure legend: black crosses, perturbed physics ensembles (slab models); red asterisks, multimodel slab ensemble; black vertical lines, observations (different data sets).]
But what if…
Should this be captured by the adjustment term, or should the discrepancy be a function of x?
[Same figure and legend as the previous slide.]
Making PDFs for the real world

$$P(y \le Y \mid o) = \int p(y \le Y \mid f)\; p(f \mid o)\; p(o \mid x)\; p(x)\, dx$$

where P(y ≤ Y | o) is the real-world PDF, p(o | x) is the discrepancy-adjusted emulated likelihood, and p(x) is the prior.
• Start with a prior, which comes from model output
• Weighting by large-scale metrics plus adjustment
– But what about local scales, like control March Scotland temperature? ENSEMBLES show similar behaviour for May Sweden temperature, so maybe we need a new metric to capture this behaviour
• Discrepancy – a direct link between model and real world
• Downscaling – statistical or dynamical
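A minimal sketch of how the integral above is estimated in practice: given Monte Carlo samples of a prediction variable and their likelihood weights (toy stand-ins here), the real-world CDF is a weighted empirical CDF, with the discrepancy illustrated as added noise:

# Sketch: Monte Carlo estimate of the real-world CDF above.
# Samples, weights and the discrepancy draw are toy stand-ins.
import numpy as np

rng = np.random.default_rng(0)
samples = rng.normal(3.0, 1.0, 100_000)          # emulated projections (toy)
weights = np.exp(-0.5 * rng.random(100_000))     # likelihood weights (toy)

samples = samples + rng.normal(0.0, 0.5, samples.size)  # discrepancy inflation

order = np.argsort(samples)
cdf = np.cumsum(weights[order]) / weights.sum()  # weighted empirical CDF

def prob_below(Y):
    """Estimate of P(y <= Y | o) from the weighted samples."""
    return np.interp(Y, samples[order], cdf)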
PDFs for the real world (ii)
• Quantile matching (see the sketch after this list)
– Piani et al (WATCH project, submitted)
• A transfer function is applied to model data in the baseline period so that the CDF of transfer(model data) = observed CDF
– Li et al (2010)
• Correct a future percentile by removing the model bias in that percentile in the baseline period. But what if you cross a threshold?
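A minimal sketch in the spirit of the Li et al (2010) percentile correction (my illustration, not their code, and all data are toy stand-ins): each future value is shifted by the baseline model-minus-observed bias at its own percentile.

# Sketch of percentile-based bias correction in the spirit of
# Li et al (2010): remove the baseline bias at each percentile
# from the future model data. All data are toy stand-ins.
import numpy as np

rng = np.random.default_rng(0)
obs_baseline = rng.normal(10.0, 2.0, 3000)   # observed baseline climate
mod_baseline = rng.normal(11.5, 2.5, 3000)   # model baseline (biased)
mod_future = rng.normal(13.5, 2.5, 3000)     # model future projection

# Percentile of each future value within the future model CDF
pct = 100.0 * (np.argsort(np.argsort(mod_future)) + 0.5) / mod_future.size

# Baseline bias at those percentiles, removed from the future data
bias = np.percentile(mod_baseline, pct) - np.percentile(obs_baseline, pct)
mod_future_corrected = mod_future - bias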
Crossing a threshold
(Clark et al GRL 2010)
PDFs for the real world (iii)
• The model soil is too dry, so there is a longer tail in daily temperatures than observed. This tail does not change much under climate change because the soil is still dry
• The observed soil is not too dry, but if climate change dries the soil below the threshold, there will be a big increase in the upper tail of daily temperatures
• Li et al would remove a large baseline bias in the upper tail from the future model CDF (whose tail is similar to the baseline model CDF)
• Another perspective: under climate change, model and real world will both be dry, as they are in the same "soil regime", and the future bias < baseline bias
• The same applies to March Scotland temperature, though there it is the model that crosses the threshold
PDFs for the real world (iv)
• Build physics into the bias correction
• Buser et al (2009) use interannual variability
• Seasonal forecasting
– Clark and Déqué – use analogues to calibrate the bias correction in a seasonal forecast
– Ward and Navarra – SVD of the joint vector of forecasts/observations for several forecasts, to pick out which leading-order model patterns correspond to which leading-order observed patterns
Summary
• IPCC Working Group II scientists use "multiple lines of evidence" to help users make adaptation decisions
• UKCP09 is a transparent synthesis of climate model data from the Met Office and outside, plus observations
• Statistics provides us with a nice way to frame the problem, generate an algorithm, make sure we are not missing any terms, and gives us a language to discuss the problem
• A real challenge, though, is to develop the statistics to better represent complex behaviour that we understand physically, e.g. crossing a "threshold"
Any questions?
Weighting
• Using 6 metrics reduces the risk of rewarding models for the wrong reasons, e.g. fortuitous compensation of errors
[Figure: dots indicate 280 values from the perturbed physics ensemble and 12 values from the multimodel ensemble; the observed value is marked.]
Second eigenvector of observed climate
[Figure: a small subset of climate variables is shown.]
Third eigenvector of observed climate
[Figure: a small subset of climate variables is shown.]
Comparing models with observations
• Use the Bayesian framework of Goldstein and Rougier (2004)
• "Posterior PDF = prior PDF x likelihood"
• Use six "large scale" metrics to define the likelihood
• The skill of a model is the likelihood of the model data given some observations:

$$\log L_o(m) = c - \tfrac{1}{2}\log|V| - \tfrac{1}{2}(m - o)^{T} V^{-1} (m - o)$$

where V = obs uncertainty + emulator error + discrepancy, and c is the normalising constant for the n metrics.
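A minimal numerical sketch of this likelihood; the six metric values and the variance components are invented for illustration:

# Sketch: Gaussian log-likelihood of model output m given
# observations o, with total covariance V = obs uncertainty
# + emulator error + discrepancy. All numbers are toy stand-ins.
import numpy as np

n = 6                                           # six "large scale" metrics
o = np.array([1.0, 0.5, -0.2, 0.8, 0.1, -0.4])  # observed metric values
m = np.array([1.2, 0.4, -0.1, 1.0, 0.0, -0.6])  # model metric values

V = (0.10 * np.eye(n)      # observational uncertainty
     + 0.05 * np.eye(n)    # emulator error
     + 0.20 * np.eye(n))   # discrepancy

resid = m - o
log_L = (-0.5 * n * np.log(2 * np.pi)
         - 0.5 * np.linalg.slogdet(V)[1]
         - 0.5 * resid @ np.linalg.solve(V, resid))
weight = np.exp(log_L)     # unnormalised weight for this model variant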
Testing robustness
• Projections inevitably depend on expert assumptions and choices
• However, sensitivities to some key choices can be tested
[Figure: changes for Wales, 2080s relative to 1961-90.]
Reducing different sources of uncertainty?
[Figure: uncertainties in winter precipitation changes for the 2080s relative to 1961-90, at a 25km box in SE England.]
New information, methods and experimental design can reduce uncertainty, so projections will change in future and decision makers need to consider this.
Discrepancy Term: Snow Albedo Feedback in Scotland in March
[Same figure and legend as before.]
Adjusting future temperatures
Consider the surface energy balance,

$$\sigma T^4 = S(1 - \alpha) + LW + SH + LE$$

where LW, SH and LE are the (signed) longwave, sensible heat and latent heat flux terms. Differencing the future (f) and control (c) climates:

$$\sigma (T_f^4 - T_c^4) = S\,\big(\alpha(T_c) - \alpha(T_f)\big) + \Delta_{LW,\,SH,\,LE}$$

Now adjust the control temperature by the mean discrepancy, $T_c^{adj} = T_c + \delta_c$, and solve for $T_f^{adj}$:

$$\sigma \big((T_f^{adj})^4 - (T_c^{adj})^4\big) = S\,\big(\alpha(T_c + \delta_c) - \alpha(T_f^{adj})\big) + \Delta_{LW,\,SH,\,LE}$$
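A minimal sketch of solving the last equation for T_f_adj numerically; the albedo ramp and flux numbers are invented for illustration:

# Sketch: solve the adjusted energy balance above for T_f_adj.
# The albedo function and the flux values are illustrative toys.
import numpy as np
from scipy.optimize import brentq

SIGMA = 5.67e-8          # Stefan-Boltzmann constant (W m^-2 K^-4)
S = 180.0                # solar term (toy value, W m^-2)
T_c, delta_c = 268.0, 2.0
T_c_adj = T_c + delta_c  # control temperature shifted by mean discrepancy

def albedo(T):
    """Toy snow-albedo ramp: high when cold, low when warm."""
    return np.clip(0.6 - 0.01 * (T - 263.0), 0.2, 0.6)

delta_fluxes = 10.0      # toy lumped change in LW, SH, LE (W m^-2)

def residual(T_f_adj):
    lhs = SIGMA * (T_f_adj**4 - T_c_adj**4)
    rhs = S * (albedo(T_c_adj) - albedo(T_f_adj)) + delta_fluxes
    return lhs - rhs

T_f_adj = brentq(residual, 200.0, 350.0)   # root of the energy balance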
Comparison of methods
[Figure legend: raw QUMP data (+); the UKCP09 method; the new energy-balance-based method.]
Interannual results v. 30-year mean results
Predictions are for 30-year means, so should not be compared to annual climate anomalies.
[Figure: summer % rainfall change: a) interannual over SE England from 17 runs; b) time-dependent percentiles of the 30-year mean at DEFRA.]
Effect of historical discrepancy on weighting
[Figure: weight distributions with the discrepancy included and excluded, estimated from a sample size of 50000.]
Discrepancy – a schematic of
what it does
• Avoids contradictions from subsequent
analyses when some observations have been
allowed to constrain the problem too strongly.
UKCP09 aerosol forcing uncertainty
Aerosol forcing is found to be inversely proportional to climate sensitivity, and this, along with perturbations to the sulphur cycle, implies a distribution of aerosol forcing uncertainty in UKCP09.
[Figure: sample of UKCP09 A1B-GHG forcing, Q (W m-2), 2000-2010, aerosol + solar + volcanic + ozone, compared against the IPCC Fourth Assessment Report.]
Different scales
[Figure: Fig. 2.20, AR4, IPCC: total aerosol forcing in 2005, relative to 1750.]
Model imperfections in Bayesian prediction (Goldstein and Rougier 2004)
• Define the discrepancy as a measure of the extent to which model imperfections could affect the response.
• Assumes there exists a best choice of parameter values
• Discrepancy is a variance, and it measures how informative the climate model is. A perfect model has zero discrepancy.
• Discrepancy inflates the PDFs of the prediction variables
• Discrepancy makes it more difficult to discern a good quality model from a poor quality one, and so avoids over-confidence in weighting out poor parts of parameter space
• But how do we specify it?
Bayesian prediction – Goldstein and Rougier
• Aim is to construct the joint probability distribution p(X, m_h, m_f, y, o, d) of all uncertain objects in the problem:
– Input parameters (X)
– Historical and future model output (m_h, m_f)
– True climate (y_h, y_f)
– Observations (o)
– Model imperfections (d)
• Probability here is a measure of how strongly a given value of climate change is supported by the evidence (model projections, observations, expert judgements informed by understanding)
Constraining predictions
• Weighting is particularly effective if there exists a strong relationship between a historical climate variable and a parameter AND between that parameter and a future climate variable. So weighting can still have a different effect on different prediction variables.
[Figure: the observed value is marked.]
We use 6 eigenvectors
Weights come from the likelihood function.
[Figure: comparison of weight distributions, varying the dimensionality of the likelihood function, for Monte Carlo samples of 1 million points.]