Transcript: Splinter Session - Space Weather Metrics, Verification & Validation
Splinter Session – ‘Space Weather Metrics, Verification & Validation’
Thursday 20th Nov., 16:30 - 18:00
Agenda
16:30 Introduction
16:35 Towards a verification framework for forecasts of different centres, Andy Devos, J. Andries, C. Verbeeck, D. Berghmans (STCE-ROB), Belgium.
16:45 Verification of extreme event forecasts, Peter Wintoft, Swedish Institute of Space Physics (IRF), Sweden.
16:50 SWENET Index Quality Statistics & Database Assessment, Alexi Glover, ESA.
16:55 Performance Verification of Solar Flare Prediction Techniques: Present Status, Caveats, and Expectations, Manolis Georgoulis, Academy of Athens, Greece.
17:00 The use of modified Taylor diagrams for comparing geophysical models, Matthew Angling, University of Birmingham, UK.
17:10 Translating verification experience from Meteorology to Space Weather, Suzy Bingham, Met Office, UK.
17:20 Lessons learned from CCMC-led community-wide model validation challenges. Outlook on international coordination of M&V activities, Maria Kuznetsova, CCMC, USA.
17:30 Discussion.
18:00 Finish.
Aims
• To understand current techniques used in space weather validation, verification & metrics.
• To discuss key physical parameters & products which require validation &/or verification.
• To ascertain the validation, verification & metric techniques required to move forward.
• To determine the support required by the community.
Definitions
• Validation: the process of determining the degree to which a product or service (including potentially software & associated data) accurately represents the real world from the perspective of the intended use(s).
E.g. the accuracy of the output of a model compared with ‘truth’ data (a toy sketch of such a comparison follows after these definitions).
• Verification: the process of determining that a system or service (including potentially software, implementation & associated data) performs as expected.
E.g. when redeploying a model in a different location, confirming that the model accepts the full range of inputs, etc.
• Metrics: statistical parameters used to quantify performance. Scientific metrics are related to specific key parameters, e.g. skill scores, Probability Of Detection (POD), False Alarm Ratio (FAR); a second sketch below computes these.
Application metrics include scientific metrics, extend to overall service performance & are essentially KPIs (Key Performance Indicators), e.g. accuracy & confidence in a service.
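
As a toy illustration of the validation comparison above (an assumed Python sketch, not material from the session; the model and ‘truth’ values are invented), a simple accuracy measure such as root-mean-square error can be computed between model output and observed data:

import math

# Hypothetical example: validating model output against observed 'truth'
# values using root-mean-square error. All numbers are invented.
model_output = [310.0, 295.5, 402.3, 350.1]   # e.g. forecast solar wind speed (km/s)
truth_data   = [305.2, 300.0, 410.8, 342.7]   # corresponding observed values

rmse = math.sqrt(sum((m - t) ** 2 for m, t in zip(model_output, truth_data))
                 / len(truth_data))
print(f"RMSE = {rmse:.1f} km/s")   # prints: RMSE = 6.5 km/s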
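
The scientific metrics named above (POD, FAR, skill scores) all derive from a 2x2 contingency table of yes/no forecasts against observed events. The following sketch (again illustrative only; the function names and toy data are assumptions, not session material) computes POD, FAR and the Heidke Skill Score:

def contingency_table(forecasts, observations):
    """Count hits, false alarms, misses & correct negatives for paired
    boolean forecast/observation series."""
    hits = false_alarms = misses = correct_negatives = 0
    for f, o in zip(forecasts, observations):
        if f and o:
            hits += 1
        elif f and not o:
            false_alarms += 1
        elif not f and o:
            misses += 1
        else:
            correct_negatives += 1
    return hits, false_alarms, misses, correct_negatives

def pod(hits, misses):
    """Probability Of Detection: fraction of observed events that were forecast."""
    return hits / (hits + misses)

def far(hits, false_alarms):
    """False Alarm Ratio: fraction of 'yes' forecasts that did not verify."""
    return false_alarms / (hits + false_alarms)

def heidke_skill_score(h, fa, m, cn):
    """Heidke Skill Score: fraction correct relative to random chance
    (1 = perfect forecast, 0 = no skill beyond chance)."""
    n = h + fa + m + cn
    expected = ((h + m) * (h + fa) + (fa + cn) * (m + cn)) / n
    return (h + cn - expected) / (n - expected)

# Toy example: ten daily yes/no flare forecasts vs. observed events.
forecasts    = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
observations = [1, 0, 0, 1, 1, 0, 1, 0, 0, 0]
h, fa, m, cn = contingency_table(forecasts, observations)
print(f"POD = {pod(h, m):.2f}, FAR = {far(h, fa):.2f}, "
      f"HSS = {heidke_skill_score(h, fa, m, cn):.2f}")

On this toy data the sketch prints POD = 0.75, FAR = 0.40, HSS = 0.40; an operational verification would use archived forecasts and event lists instead.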
Topics covered in last year’s plenary session on ‘Forecast Verification’
• Forecast verification applied by forecast providers;
• Future development of forecast verification; communicating verification results and requirements between users and providers;
• Metaverification: evaluating performance measures;
• Magnetic and other space weather indices;
• Ground effects of space weather;
• Ionospheric and magnetospheric processes;
• Solar and interplanetary data.
Presentations...
Discussion topics
(1) What metrics and validation techniques are required in the current space weather landscape?
(2) What are the key challenges currently in model and forecast benchmarking?
(3) What direction should the space weather community be taking?
(4) What actions can agencies and organisations take in order to support a wider space weather validation effort?
(5) How can agreed, realistic model/service targets be established to encourage targeted development and prototyping?
(6) What targeted actions would encourage groups not currently involved to further participate in space weather validation activities?