Ground segment:
PLATO Data Analysis System
Laurent Gizon
and PDAS Assessment Study Team
Lead: MPS (Germany) and IAS (France); with contributions from
Aarhus (Denmark), AIP (Germany), DLR (Germany), Cambridge (UK),
LAM (France), Leicester (UK), Leuven (Belgium), MPIA (Germany)
Rome, 5 May 2009
Introduction
The PLATO Data Analysis System (PDAS) on the
ground is in charge of the validation, calibration, and
analysis of the PLATO observations. It delivers the
final science data products.
The PDAS comprises
a Mission Operations Center (MOC, flight-critical)
a Science Operations Center (SOC, mission-critical)
a PLATO Data Center (PDC, science-critical)
The assessment study of the PDAS benefits from
extensive experience in the design of data centers for
CoRoT, Gaia, Kepler, SDO, and WASP.
PDAS Data Products
Baseline: Light curves downloaded for all stars and all 40+2
telescopes, and ~1000 imagettes at high cadence
Validated light curves (Level 0)
For all stars
Validated light curves and centroid curves for the 40+2 telescopes
Flux calibrated light curves (Level 1)
For all stars.
NT (normal-telescope) flux-calibrated light curve and centroid curve for each star, averaged over all 40 telescopes, with the associated errors
Two FT (fast-telescope) calibrated light curves and centroid curves for each star
Data quality parameters; improved and star-specific for the stars for which imagettes are available (a minimal sketch of a Level 1 record follows this list)
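As an illustration only, a minimal sketch of what a per-star Level 1 record could contain is given below; all field names are assumptions for this sketch, not the actual PDAS data model.

```python
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class Level1Record:
    """Hypothetical per-star Level 1 product; field names are illustrative only."""
    star_id: str
    time: np.ndarray                      # cadence time stamps [days]
    flux_nt: np.ndarray                   # flux-calibrated light curve, averaged over the normal telescopes
    flux_nt_err: np.ndarray               # error derived from the telescope-to-telescope scatter
    centroid_x: np.ndarray                # averaged centroid curve, x component
    centroid_y: np.ndarray                # averaged centroid curve, y component
    quality: np.ndarray                   # per-cadence data-quality flags
    flux_ft: Optional[np.ndarray] = None  # the two fast-telescope light curves, where available
```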
PDAS Data Products (cont.)
Asteroseismic mode parameters
For most stars
Frequencies, amplitudes, and lifetimes of the modes of oscillation, from fits to the spectra of stellar oscillations (see the sketch after this list).
Stellar rotation and stellar activity
Rotation periods from activity-induced periodicities.
Whenever possible, characterization of stellar activity: activity level from the low-frequency power spectrum, starspot models.
Stellar masses and ages
For cool stars with magnitude less than 11.
Stellar parameters are obtained from stellar-model fits to the frequencies of oscillation.
Also chemical composition, etc. (seismology + spectroscopy)
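To illustrate the mode-fitting step, here is a minimal Python sketch that fits a single Lorentzian profile to a patch of an oscillation power spectrum; the 10-muHz fitting window, the least-squares fit, and the function names are simplifying assumptions (a real pipeline would fit many modes simultaneously with a maximum-likelihood approach suited to the chi-squared statistics of the spectrum).

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(nu, nu0, height, gamma, background):
    """Single Lorentzian mode profile plus a flat background (nu, gamma in muHz)."""
    return height / (1.0 + (2.0 * (nu - nu0) / gamma) ** 2) + background

def fit_mode(nu, power, nu_guess, half_window=10.0):
    """Fit one mode near nu_guess; returns frequency [muHz], rms amplitude, lifetime [days]."""
    sel = np.abs(nu - nu_guess) < half_window
    p0 = [nu_guess, power[sel].max(), 1.0, np.median(power[sel])]
    popt, _ = curve_fit(lorentzian, nu[sel], power[sel], p0=p0)
    nu0, height, gamma, _ = popt
    amplitude = np.sqrt(np.pi * height * abs(gamma) / 2.0)       # integrated mode power -> rms amplitude
    lifetime_days = 1.0 / (np.pi * abs(gamma) * 1e-6) / 86400.0  # linewidth gamma [muHz] -> lifetime [days]
    return nu0, amplitude, lifetime_days
```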
PDAS Data Products (cont.)
Transit candidates and their parameters
List of transit candidates, including candidates from the centroid curves (astrometry)
Ranking of candidates according to planetary likelihood
Basic characteristics of the transits: depth, duration, period, and ephemerides (a simplified search sketch follows this list)
Planetary systems and their characteristics
The most important PLATO deliverable
List of confirmed planets, using follow-up observations
Assessment of false alarm probability
Potentially several hundred planetary systems for which seismology of the central stars is possible.
Determination of the planet parameters: orbital parameters,
planet size, mass, density (average composition), age (from
central stars)
Any additional characterization of planet properties from follow-up observations and light-curve analysis, e.g. planetary atmospheres.
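As an illustration of how transit candidates and their basic characteristics might be extracted, the Python sketch below performs a very simplified box-shaped search by phase folding. It is not the actual PDAS algorithm; real searches (e.g. BLS-type algorithms) optimize epoch and duration explicitly and use a proper detection statistic.

```python
import numpy as np

def box_search(time, flux, trial_periods, n_bins=200):
    """Very simplified box-shaped transit search (illustrative sketch only).

    For each trial period the light curve is phase-folded into n_bins bins;
    the deepest bin gives a candidate depth, epoch, and a crude duration.
    """
    rel_flux = flux / np.median(flux) - 1.0
    best = {"snr": 0.0}
    for period in trial_periods:
        phase_bin = np.minimum(((time % period) / period * n_bins).astype(int), n_bins - 1)
        means = np.array([rel_flux[phase_bin == b].mean() if np.any(phase_bin == b) else 0.0
                          for b in range(n_bins)])
        deepest = int(means.argmin())
        in_transit = phase_bin == deepest
        depth = -means[deepest]
        noise = rel_flux.std() / np.sqrt(max(in_transit.sum(), 1))
        snr = depth / noise if noise > 0 else 0.0
        if snr > best["snr"]:
            best = {"snr": snr, "period": period, "depth": depth,
                    "duration": period / n_bins,          # crude: one phase bin
                    "epoch": time[in_transit].min()}
    return best
```

Candidates found this way would then be ranked (here by the crude signal-to-noise ratio) before vetting and follow-up.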
Data Products
Ancillary observations: star catalog, stellar parameters, follow-up observations, spectroscopy, radial velocities, interferometry, astrometry (Gaia)
Work Packages
WP1. Project office, system architecture, archives, database,
system management
WP2. Science data releases, export system, data access and
distribution
WP3. Pipeline, workflow management system (a toy workflow sketch follows this list)
WP4. Data flow management, network
WP5. Simulation of data stream (uses simul. of telemetry as input)
WP6. Development of software: validation of L0 data
WP7. Validation of L0 data (operational task).
WP8. Software development: processing of L1 data
WP9. L1 data processing (operational task).
WP10. Set-up and maintain ancillary data base
WP11-15 Scientific software development:
Determination of asteroseismic mode parameters (WP11), Stellar
rotation and stellar activity (WP12), Masses and ages of stars (WP13),
Transit candidates and their parameters (WP14), Planetary systems
and their characteristics (WP15)
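To make the pipeline and workflow-management idea of WP3 concrete, here is a toy Python sketch of a workflow manager chaining a validation stage and a processing stage; the stage names and the simple provenance record are purely illustrative assumptions, not the PDAS design.

```python
from typing import Callable, Dict, List

Stage = Callable[[Dict], Dict]   # a pipeline stage maps data products to data products

def run_pipeline(products: Dict, stages: List[Stage]) -> Dict:
    """Run the stages in order and keep a simple processing history (provenance)."""
    for stage in stages:
        products = stage(products)
        products.setdefault("history", []).append(stage.__name__)
    return products

def validate_l0(products: Dict) -> Dict:
    # placeholder for WP6-type checks (onboard-processing cross-checks, quality flags)
    products["l0_validated"] = True
    return products

def process_l1(products: Dict) -> Dict:
    # placeholder for WP8-type corrections and averaging over the telescopes
    products["l1"] = products.get("l0")
    return products

result = run_pipeline({"l0": "per-telescope light curves"}, [validate_l0, process_l1])
print(result["history"])   # ['validate_l0', 'process_l1']
```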
WP6. Development of software for
validation of Level 0 data
Validate the onboard software and the onboard setup:
Check the onboard processing using a ground copy of the onboard software and the imagettes of ~1000 stars (see the sketch after this list)
Validate distortion matrix model
Validate 2D sky background model
Validate PSF model fits
Validate computation of masks and windows
Fine-tune the onboard software algorithms, for example the number of parameters needed to describe the PSF, especially during configuration mode.
Monitor health of each telescope and assess
quality of the data (information may have to be
reported to the PDAS Steering Committee if a
problem is identified)
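As an illustrative sketch of the onboard-processing cross-check, the Python fragment below re-applies an onboard pixel mask to the downloaded imagettes on the ground and flags cadences that disagree with the onboard flux. The function name, the tolerance, and the bare mask-sum photometry are assumptions; the real check would run the full ground copy of the onboard software (distortion matrix, sky background, and PSF models).

```python
import numpy as np

def check_onboard_photometry(imagettes, mask, onboard_flux, tol=1e-3):
    """Cross-check onboard mask photometry against a ground recomputation (sketch).

    imagettes:    (n_cadences, ny, nx) imagette stack for one star
    mask:         (ny, nx) onboard pixel mask (weights)
    onboard_flux: (n_cadences,) flux reported by the onboard software
    Returns the indices of discrepant cadences and the worst relative difference.
    """
    ground_flux = np.sum(imagettes * mask, axis=(1, 2))   # re-apply the onboard mask on the ground
    rel_diff = np.abs(ground_flux - onboard_flux) / np.maximum(np.abs(onboard_flux), 1.0)
    discrepant = np.flatnonzero(rel_diff > tol)           # cadences to flag in the health report
    return discrepant, float(rel_diff.max())
```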
WP8. Development of software for
processing of Level 1 data
If required, gain corrections
Correction for jitter and residual differential aberration.
Performed independently for each telescope; requires
PSF knowledge, stellar catalog, and distortion matrix.
Integration time correction, sampling time correction
Statistical analysis over the 40 telescopes to identify cosmic-ray hits, hot pixels, and possibly defective telescopes
Average the light curves and centroid curves over all telescopes (weighted average); compute the error based on the scatter (a schematic averaging sketch follows this list)
The ~1000 stars for which imagettes are available receive a more sophisticated treatment: PSF fits to improve the photometry (contamination from neighboring sources taken into account). Imagettes are downloaded for all stars for which a serious planetary candidate has been identified.
Long-term detrending
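A schematic Python sketch of the telescope averaging and error estimate is given below; the uniform weighting and the 5-sigma clipping threshold are simplifying assumptions (a real pipeline would weight each telescope by its noise level, as implied by the weighted average above).

```python
import numpy as np

def average_over_telescopes(fluxes, clip=5.0):
    """Average one star's light curves over the normal telescopes (sketch).

    fluxes: (n_telescopes, n_cadences) array. Per cadence, values deviating
    by more than `clip` times the telescope-to-telescope scatter (cosmic-ray
    hits, hot pixels, misbehaving telescopes) are rejected, the rest are
    averaged, and the error is estimated from the remaining scatter.
    """
    median = np.nanmedian(fluxes, axis=0)
    scatter = np.nanstd(fluxes, axis=0)
    cleaned = np.where(np.abs(fluxes - median) > clip * scatter, np.nan, fluxes)
    n_good = np.sum(np.isfinite(cleaned), axis=0)
    mean = np.nanmean(cleaned, axis=0)                                   # averaged light curve
    error = np.nanstd(cleaned, axis=0) / np.sqrt(np.maximum(n_good, 1))  # error from the scatter
    return mean, error, n_good
```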
PDAS Work Packages
[Diagram: allocation of the work packages — mission-critical (SOC) or science-critical (PDC); costed in the payload study, ESA responsibility, or partially funded through research institutions]
Distribution of work: TBD, with essential contributions from MPS (Germany) and IAS (France)
[Diagram: PDAS organization — PLATO Consortium Council, PLATO, MOC, SOC, PDC, PSC, PDAS, and the PDAS Steering Committee; data flows: validated light curves (L0), calibrated light curves (L1), science data products, ancillary data base; users and observatories. The PDC is likely to be distributed among a few centers.]
Steering Committee: takes decisions at PDAS level. One representative from each work package, the instrument scientists, a representative for follow-up observations, and a few specialized scientists.
Data volumes
Telemetry rate: 109 Gbit/day uncompressed
Over a 6-yr mission: ~30 TB uncompressed
The volume of archived L0, L1, and HK data is expected to be 10-50 times this amount (reformatting and calibration history), i.e. 300-1500 TB (a quick arithmetic check follows this list)
The volume of the science data products is likely to
be negligible in comparison (although the complexity
of the data may be high).
Ancillary data base: basic stellar observations and
parameters, spectra, Gaia observations, etc.
The overall data volume will not exceed a few PB,
which is not problematic.
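A quick back-of-the-envelope check of the quoted volumes, in decimal units:

```python
# Back-of-the-envelope check of the quoted data volumes.
rate_gbit_per_day = 109                                 # uncompressed telemetry rate
mission_days = 6 * 365                                  # 6-year mission
raw_tb = rate_gbit_per_day * mission_days / 8 / 1000    # Gbit -> GB -> TB
print(f"raw telemetry: ~{raw_tb:.0f} TB")                             # ~30 TB
print(f"archived L0/L1/HK: ~{10 * raw_tb:.0f}-{50 * raw_tb:.0f} TB")  # ~300-1500 TB
```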
Time schedule
January 2012. Setup of project office and start of studies
June 2012. PDAS System Requirements Review
June 2013. PDAS Preliminary Design Review
June 2016. PDAS Critical Design Review
June 2017. PDAS Flight Acceptance Review
December 2017: Launch of PLATO
3+2+1 years in space
Several releases of science data products during and after space
mission
After the end of the mission in space: several years of follow-up observations to confirm a planet with, e.g., T = 3 yr. During this time the PDC must remain operational.
Cost
9 WPs are costed in the current study: WP1-6 and WP8-10
~25 FTEs over 15 years (2012-2023 + 3 yr)
Hardware (~100 cores, PB-scale storage), network, software: ~1 MEUR
Total cost is around 30 MEUR (a rough consistency check follows this list)
Not costed:
MOC
WP7 operational task under ESA responsibility
WP11-15 Scientific software development (research institutions).
Resources for stellar model grid computations
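A rough consistency check of the ~30 MEUR figure, taking the 25 FTE, 15 years, and ~1 MEUR hardware numbers above at face value; the implied personnel rate is only an illustrative back-calculation, since the actual cost breakdown is not given here.

```python
# Rough consistency check of the total cost quoted above.
fte, years = 25, 15
hardware_meur, total_meur = 1.0, 30.0
fte_years = fte * years                                            # 375 FTE-years
implied_keur_per_fte_year = (total_meur - hardware_meur) / fte_years * 1000
print(f"{fte_years} FTE-years -> ~{implied_keur_per_fte_year:.0f} kEUR per FTE-year")  # ~77
```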
Conclusion
The function of the PDAS is to validate and calibrate the
PLATO light curves and to deliver the science data
products
Data volume: ~PB
Compute power (L0 and L1 processing): ~100 cores
The system software and hardware technologies exist
today.
Cost: ~30 MEUR over 2012-26
The development of the scientific software will benefit a
lot from the CoRoT, Gaia, and Kepler experiences.
Overall, the PDAS is a relatively low-risk enterprise.