
Data Quality Monitoring in RA I
Nairobi Regional Meteorological
Center
Eng. Henry Karanja
Email ([email protected], [email protected])
Data Quality Monitoring in RA I
• The Regional Meteorological Center (RMC) Nairobi is the regional lead center for data quality monitoring in Region I (RA I).
• RMC Nairobi has been responsible for monitoring surface pressure since 1995.
• In assessing data quality, the center compares the surface pressure reports received from the different stations with the first-guess short-term numerical forecast.
Possible sources of errors
• Coding errors.
• Incorrect sea-level adjustment for the height of the barometer.
• Corruption during transmission.
• Position errors.
• Barometric errors:
  – Wrong calibration.
  – Faulty barometer, etc.
Method used for quality monitoring
• Monitoring of surface pressure for RA I, carried out at RMC Nairobi, is based on the results of the UK Met Office Numerical Weather Prediction (NWP) model.
• The UK Met Office runs a six-day forecast twice a day, together with two-day forecasts, on a grid-point model with a horizontal resolution of 25 km in mid-latitudes and 70 levels in the vertical.
• The basis of the data quality monitoring is the observation minus background (first guess) difference (O-B). Systematic errors in the observations are identified by averaging the (O-B) values over a sufficiently long period (e.g. one month).
• Using this method, persistently poor-quality observations are detected.
• In addition to the differences themselves, the means and root mean squares (RMS) are analyzed over a six-month period to detect stations giving persistently erroneous reports, as illustrated in the sketch below.
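
The short Python sketch below illustrates the statistics described above. It is not RMC Nairobi's operational software; the function name and the sample departure values are hypothetical, chosen only to show how the mean, RMS, standard deviation and percentage of gross errors could be derived from one station's (O-B) departures.

from math import sqrt

def ob_statistics(departures_hpa, gross_error_threshold_hpa=15.0):
    """Summary statistics for one station's (O-B) surface-pressure
    departures: count, mean, RMS, standard deviation and percentage
    of gross errors (|O-B| >= 15 hPa, as defined in the criteria)."""
    n = len(departures_hpa)
    mean = sum(departures_hpa) / n
    rms = sqrt(sum(d * d for d in departures_hpa) / n)
    std_dev = sqrt(max(rms * rms - mean * mean, 0.0))
    gross = sum(1 for d in departures_hpa if abs(d) >= gross_error_threshold_hpa)
    return {
        "n_obs": n,
        "mean_ob": mean,
        "rms_ob": rms,
        "std_dev": std_dev,
        "gross_error_pct": 100.0 * gross / n,
    }

# Hypothetical (O-B) departures for one station over a monitoring period, in hPa.
sample = [1.2, -0.8, 2.5, 16.0, 0.3, -1.1, 3.0, 0.6]
print(ob_statistics(sample))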
Error detection criteria
• Once the (O-B) statistics, i.e. the mean, RMS, number of observations and percentage of gross errors, have been obtained for all reporting stations for the six-month period, the criteria for error detection are applied as set out in the Global Data Processing System (GDPS) manual.
• The cut-off values for error detection depend on the availability of data and the length of the period being monitored.
• The monitoring is done over monthly and six-monthly periods, each of which has its own error detection criteria.
For six-monthly monitoring, a station is flagged when:
Number of observations = or > 40, and at least one of the following:
(i) Mean (O-B) = or > 3.5 hPa, or
(ii) (a) Standard deviation = or > 5 hPa, or
     (b) Percentage gross errors at least 25%.
NB: A gross error is defined as an observation that departs from the background by at least 15 hPa.
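
As a worked illustration of these cut-off values, the sketch below (hypothetical, not operational code) applies the six-monthly criteria to the statistics produced in the previous sketch; treating the mean criterion as an absolute value is an assumption made here for illustration.

def flag_station_six_monthly(stats):
    """Apply the six-monthly error-detection criteria listed above:
    at least 40 observations AND one of
      (i)   mean (O-B) of 3.5 hPa or more (taken here as an absolute value),
      (ii)  standard deviation of 5 hPa or more, or
      (iii) percentage gross errors of at least 25%."""
    if stats["n_obs"] < 40:
        return False  # too few reports over the period to judge the station
    return (
        abs(stats["mean_ob"]) >= 3.5
        or stats["std_dev"] >= 5.0
        or stats["gross_error_pct"] >= 25.0
    )

# Example with hypothetical six-month statistics for one station.
print(flag_station_six_monthly(
    {"n_obs": 52, "mean_ob": 4.1, "std_dev": 2.0, "gross_error_pct": 3.8}))  # flags the station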
Dissemination of monitoring report and feedback mechanism
• The designated focal point generates six-monthly reports on suspected low-quality stations and forwards them to the WMO Secretariat through the Permanent Representative of Kenya with WMO.
• These reports are distributed by WMO to Members so that
they can take appropriate remedial action with the assistance
of the national designated focal point.
• These Members/agencies then report to lead centers and the
WMO Secretariat on their remedial efforts.
Effect of Monitoring
• From the experience of the regional center, the method of monitoring is effective and has in some cases led to improvements in data quality in the region.
• However, due to the terrain at some of the affected stations, some stations have not been able to take remedial action to rectify the situation. In such cases, Members have been advised to follow the available guidelines in the CIMO Guide and other relevant manuals and guides on the siting of their stations.
• monitoring report 2011.doc