15. Kaack - Center for Climate and Energy Decision Making
CEDM Annual Meeting 2016
5/23/2016
An Application of Empirical Prediction Intervals to Energy Forecasting
Presenter
Lynn Kaack
Carnegie Mellon University, Department of Engineering and Public Policy
Based on ongoing work with Jay Apt (a), Granger Morgan (a), and Patrick McSharry (b, c)
(a) Department of Engineering and Public Policy, Carnegie Mellon University
(b) Smith School of Enterprise & the Environment, Oxford University
(c) ICT Center of Excellence, Carnegie Mellon University, Kigali, Rwanda
How Can We Estimate the Probabilistic Uncertainty?
U.S. Energy Information Administration’s Annual Energy Outlook (AEO)
• Prices, consumption, and production rates of relevant energy quantities
• Released annually; projects up to 25 years into the future
Scenarios tell a storyline of an outcome; density forecasts give the probability of an outcome. The AEO publishes a Reference Case and scenarios, not probabilities.
Research Question
Making a Case for Probabilistic Long-Term Forecasts
Examples from the literature:
• Shlyakhter, Kammen, Broido, Wilson (1994): Quantifying the credibility of energy projections from trends in past data: The US energy sector.
• Morgan, Keith (2008): Improving the way we think about projecting future energy use and emissions of carbon dioxide.
• Vahey, Wakerly (2013): Moving towards probability forecasting.
Use of probabilistic projections:
• Decision theory needs probability of alternatives
• Modeling input, Monte Carlo Analysis
• Value at Risk (VaR); see the sketch after this list
• Real-Options Analysis (ROA)
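Where these uses are quantitative, a density forecast feeds them directly. Below is a minimal Python sketch of one such use, computing VaR from Monte Carlo samples of a forecast density; the distribution, the quantity, and all variable names are illustrative assumptions, not from the talk.

```python
# Hypothetical example: Value at Risk from samples of a forecast density.
import numpy as np

rng = np.random.default_rng(0)

# Assumed stand-in for a forecast density: 10,000 sampled outcomes
# for a future fuel price (arbitrary units).
price_samples = rng.lognormal(mean=1.0, sigma=0.25, size=10_000)

# Loss relative to the median forecast, for a buyer of one unit.
losses = price_samples - np.median(price_samples)

# 95% Value at Risk: the loss exceeded with only 5% probability.
var_95 = np.quantile(losses, 0.95)
print(f"95% VaR: {var_95:.2f} above the median price")
```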
Examples of Probabilistic Forecasting
Examples:
• EIA Short Term Energy Outlook
• Meteorology, demographics
• Electric load forecasting
• IPCC
Based on retrospective errors:
• Hurricane Track Forecasting (http://www.nhc.noaa.gov/outreach/presentations/2013_04nhcL311_verification.pdf)
• Bank of England Inflation Rate Forecasting [Britton et al., 1998, Bank of England]
The Race for the Best Uncertainty Estimate
As accurate as possible on unknown future data:
• Density forecast evaluation is a key component of the analysis
• Scores: CRPS (also consider MAPE and PIT)
Keeping the AEO use in mind:
• From a survey of senior advisors to the CMU Electricity Industry Center: policy analysis, in-house modeling, operational and technology strategy, revenue projections and budgeting, research, teaching
Example: ERCOT Transmission Planning (Independent System Operator)
• Use AEO for scenario planning of transmission and generation
• Uncertainty by stakeholder elicitation (often intuitive)
• Intention to move to probabilistic scenarios
Methods to retrospectively estimate the uncertainty around the forecast
Expert judgment
• Stakeholder involvement and approval
AEO Scenarios
• Choice of quantile (e.g., treating the highest and lowest scenarios as the 10th and 90th percentiles)
Plus/minus X% heuristic
• E.g., a Gaussian with the 10th and 90th percentiles at +/-20% of the forecast value
Empirical prediction intervals based on retrospective errors [Williams and Goodman, 1971]; a minimal sketch follows this list
• Simple error distribution
• Weighted error distribution
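A minimal sketch of how such an interval could be constructed, assuming past forecasts and realized values at one fixed horizon are available as arrays; the function name and the weighting scheme are illustrative, not the talk's exact implementation.

```python
# Sketch: empirical prediction intervals from retrospective errors,
# in the spirit of Williams and Goodman (1971). Names are illustrative.
import numpy as np

def empirical_interval(past_forecasts, past_actuals, new_forecast,
                       quantiles=(0.1, 0.5, 0.9), weights=None):
    """Interval for one fixed horizon from past relative errors."""
    errors = (past_actuals - past_forecasts) / past_forecasts
    if weights is None:
        # "Simple" error distribution: unweighted empirical quantiles.
        q = np.quantile(errors, quantiles)
    else:
        # "Weighted" error distribution: e.g., recent errors count more.
        order = np.argsort(errors)
        cdf = np.cumsum(weights[order]) / np.sum(weights)
        q = np.interp(quantiles, cdf, errors[order])
    # Attach the error quantiles to the new point forecast.
    return new_forecast * (1.0 + q)
```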
Empirical Prediction Intervals: Simple Distribution of Past Errors
What if we used the past projection errors?
[Figure: natural gas consumption (trillion cubic ft.) showing historical values, the AEO 2015 reference case, the AEO 2015 scenario range, and its median.]
Density Forecast Evaluation Scores
Concept:
• Observed coverage probability equal to the nominal for all quantiles (the 90th percentile covers 90% of observed actual values)
• Calibration and sharpness
Three scores:
• Mean absolute percentage error (MAPE), for point forecasts
• Probability Integral Transform (PIT)
• Continuous Ranked Probability Score (CRPS)
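As a minimal sketch of how the two density scores can be computed for a forecast given as samples from the predictive distribution (the kernel form of the CRPS follows Gneiting and Raftery, 2007); function names are illustrative:

```python
# Sketch: density forecast scores for a sample-based (ensemble) forecast.
import numpy as np

def pit(samples, actual):
    # Probability Integral Transform: the predictive CDF evaluated at
    # the realized value; uniform on [0, 1] if the forecast is calibrated.
    return np.mean(samples <= actual)

def crps_ensemble(samples, actual):
    # CRPS kernel form E|X - y| - 0.5 E|X - X'|; lower is better, and it
    # reduces to the absolute error when the forecast is a point mass.
    term1 = np.mean(np.abs(samples - actual))
    term2 = 0.5 * np.mean(np.abs(samples[:, None] - samples[None, :]))
    return term1 - term2
```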
References:
• Smith et al. (2015)
• Gneiting and Katzfuss (2014)
• Gneiting and Raftery (2007)
• Hersbach (2000)
Evaluation: We compare forecasting methods
Point estimate comparison (MAPE):
• AEO Reference Case
• Median of empirical forecasts
• Persistence (last observation)
• Simple linear regression (moving window of the last 10 observations)
Density forecast comparison (CRPS):
Empirical (non-parametric) methods:
• Simple error distribution
• Weighted error distribution
• Median-centered distribution
Parametric methods:
• Gaussian with a standard deviation that grows with horizon H: σ(H) = 0.05 × (AEO reference case) × √(H + 1)
• Gaussian with σ(H) = sd(errors(H))
• Gaussian with σ equal to the standard deviation of differences in actual values
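For the parametric benchmarks, the Gaussian CRPS has a closed form (Gneiting and Raftery, 2007), so each method only needs its own σ(H). A minimal sketch, with illustrative names and the 5% heuristic from above hard-coded:

```python
# Sketch: Gaussian benchmark densities and their closed-form CRPS.
import numpy as np
from scipy.stats import norm

def crps_gaussian(mu, sigma, actual):
    # Closed-form CRPS of N(mu, sigma^2) at the realized value.
    z = (actual - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1)
                    + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

def sigma_heuristic(reference_case, horizon):
    # sigma(H) = 5% of the AEO reference case, growing with sqrt(H + 1).
    return 0.05 * reference_case * np.sqrt(horizon + 1)

def sigma_from_errors(errors_at_horizon):
    # sigma(H) = standard deviation of retrospective errors at horizon H.
    return np.std(errors_at_horizon, ddof=1)
```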
Illustrating retrospective evaluation
Divide the data:
• Train on the early part of the data set
• Test on the later part of the data set
The prediction interval should capture the actual values. The vertical line marking the train/test split moves from 2003 to 2014; a sketch of this rolling procedure follows.
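A minimal sketch of this rolling out-of-sample procedure, assuming yearly actuals keyed by year, with generic `make_forecast` and `score` callables standing in for any of the methods and scores above:

```python
# Sketch: rolling-origin evaluation with a moving train/test split year.
def rolling_evaluation(actuals_by_year, make_forecast, score,
                       first_split=2003, last_split=2014):
    results = {}
    for split_year in range(first_split, last_split + 1):
        # Train on data up to the split, test on what comes after.
        train = {y: v for y, v in actuals_by_year.items() if y <= split_year}
        test = {y: v for y, v in actuals_by_year.items() if y > split_year}
        forecast = make_forecast(train)  # maps year -> point/density forecast
        results[split_year] = {y: score(forecast[y], v)
                               for y, v in test.items() if y in forecast}
    return results
```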
Evaluation of Point Forecast Performance: MAPE
Mean Absolute Percentage Error (MAPE): the average of |forecast - actual| / actual, computed here by forecast horizon.
[Figure: example for natural gas consumption, test range 2003-2014; MAPE (0.00 to 0.20) versus horizon (0 to 10) for the AEO reference case, persistence, simple linear regression, the median of the simple error distribution (blue), and the median of the weighted error distribution (red).]
Across the 19 quantities compared with the AEO:
• The median of the error distribution differs from the AEO forecast; does it give a more accurate prediction?
• Persistence is better for many quantities
• Linear regression is better for 5 energy consumption quantities and the coal price
• The median of the empirical prediction interval is better for 3 quantities
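A minimal sketch of MAPE as a function of horizon, matching the axes of the figure above; the 2-D array layout is an assumption for illustration:

```python
# Sketch: MAPE by forecast horizon, averaged over forecast vintages.
import numpy as np

def mape_by_horizon(forecasts, actuals):
    # forecasts, actuals: arrays with rows = AEO vintages and
    # columns = horizons 0..H; NaN marks years with no actual yet.
    ape = np.abs((forecasts - actuals) / actuals)
    return np.nanmean(ape, axis=0)
```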
Evaluation of Forecast Performance: CRPS
Continuous Ranked Probability Score (CRPS):
• Rewards calibration and sharpness
• Reduces to the absolute error for point forecasts (and is thus comparable to MAPE)
• Applicable to ensemble forecasts
[Figure: example for natural gas consumption, test range 2003-2014; CRPS (0.00 to 0.14) versus horizon (0 to 10) for the reference case MAPE, the empirical prediction intervals (simple and weighted), the AEO scenarios, the median-centered error distribution, and the Gaussians with σ = 5%·√(1+H), σ = sd(errors), and σ = sd(actual values).]
Across the 19 quantities:
• CRPS performance is strongly coupled to MAPE performance
• The Gaussians do well, especially the Gaussian with the SD of errors
• The heuristic Gaussian is better at larger horizons (it is too narrow in the short run)
• The median-centered empirical prediction interval does well
Four main results (as of now)
1. AEO point projections perform well in many cases
• Especially for price forecasts
• Sometimes outperformed by simple benchmarks
2. We are able to construct uncertainty intervals that improve the forecasts
• A variety of methods work
• Retrospective errors give a good uncertainty estimate
• The AEO scenario range is narrow
3. There is no straightforward answer to the question "which is the best method for uncertainty?"
4. Out-of-sample forecast evaluation is powerful.
Acknowledgements
Many thanks to
• Evan Sherwin, Inês Azevedo, Cosma Shalizi, Alex Davis, Max
Henrion and Stephen Fienberg from Carnegie Mellon University
• Faouzi Aloulou, David Daniels and John Staub from the EIA
• Warren Lasher from ERCOT
Funded by the Electric Power Research Institute (EPRI)
References
• Gneiting, Katzfuss (2014) Probabilistic forecasting. Annual Review of Statistics and Its Application 1:125–151.
• Gneiting, Raftery (2007) Strictly proper scoring rules, prediction, and estimation. Journal of the American Statistical Association 102(477):359–378.
• Hersbach H (2000) Decomposition of the continuous ranked probability score for ensemble prediction systems. Weather and Forecasting 15(5):559–570.
• Lee YS, Scholtes S (2014) Empirical prediction intervals revisited. International Journal of Forecasting 30(2):217–234.
• Morgan MG, Keith DW (2008) Improving the way we think about projecting future energy use and emissions of carbon dioxide. Climatic Change 90(3):189–215.
• Shlyakhter AI, Kammen DM, Broido CL, Wilson R (1994) Quantifying the credibility of energy projections from trends in past data: The US energy sector. Energy Policy 22(2):119–130.
• Smith et al. (2015) Towards improving the framework for probabilistic forecast evaluation. Climatic Change 132(1):31–45.
• Vahey SP, Wakerly L (2013) Moving towards probability forecasting. BIS Paper (70b).
• Williams WH, Goodman ML (1971) A simple method for the construction of empirical confidence limits for economic forecasts. Journal of the American Statistical Association 66(336):752–754.