VLBI: connecting national radio telescopes into a global array


Research Networks and Astronomy
Richard Schilizzi
Joint Institute for VLBI in Europe
([email protected])
Three types of use
• transport of raw data from telescope(s) to data processing facility or database
• distribution of data from processing facility or database to users
• "mining" of databases
Transport of raw data
Required capacity is driven by radio telescope arrays:
• national scale: e-MERLIN, LOFAR
• European scale: eEVN
• global scale: SKA
[Figure: Hubble Deep Field (zoom factor = 1000) — 2500 galaxies in a small area of sky; a radio source in the centre of one galaxy; deep in the heart of the galaxy, a black hole?]
Data mining in astronomy
Current: ASTROVIRTEL
Future: Astrophysical Virtual Observatory
• dataset sizes: 100s of gigabytes to 100s of terabytes at multiple sites
• database access needs several gigabit/sec pan-European connectivity
• distributed computing via the Grid, e.g. for the astrometric satellite GAIA
eEVN: a real-time connected radio telescope as large as Europe
plans to use the "Grid infrastructure" for
- transporting raw data streams from the telescopes to the central data processor at JIVE (via GÉANT)
- real-time control of the distributed observing process
- distributing processed data to scientists
- data mining of archives
to provide
– new astronomical capabilities
– operational reliability and flexibility
Science impact
• wide bandwidth that is always available → major increase in sensitivity for sources at the edge of the universe
• wide bandwidth → very high quality imaging
• flexible, dynamic scheduling → essential for making movies of variable sources like exploding stars
[Figure: supernova in M81 in 1993 (Bietenholz et al.)]
Operational impact
• improved reliability
• easier data logistics
• flexible scheduling
• lower operating costs
• more effective network monitoring
eEVN pilot project: 2001 to 2004
• link 4 of the 14 EVN telescopes (and possibly 1 US telescope) to JIVE using as much off-the-shelf technology as possible
- bit rates up to 1 Gbps with latency < 1 second
- network monitoring and astronomical end-to-end testing for several periods of weeks at a time
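As a rough sanity check on the volumes such a pilot link would carry (the arithmetic below is illustrative, not from the talk), a sustained 1 Gbps stream works out to about 10.8 TB per day per telescope:

```python
# Back-of-envelope data volume for a sustained eEVN pilot stream.
# Illustrative arithmetic only; the 1 Gbps figure is from the slide above.
GBPS = 1e9  # bits per second

def volume_terabytes(rate_bps: float, seconds: float) -> float:
    """Data volume in terabytes for a sustained stream at rate_bps."""
    return rate_bps * seconds / 8 / 1e12  # bits -> bytes -> TB

per_day = volume_terabytes(1 * GBPS, 86_400)
print(f"1 Gbps sustained ≈ {per_day:.1f} TB/day per telescope")
```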
Technology challenges
• networking at multiple gigabit/sec via concatenated national, regional and pan-European research networks (including GÉANT++), with Quality of Service
• last mile connections to remote telescope sites
• will Grid architectures and middleware be adequate
for the expected data traffic and distributed
computing?
• networking at terabit/sec rates on the longer term
Summary
• radio astronomy interferometry is a novel application of Grid capabilities for sustained data transfer at high bit rates from the telescopes to the central data processor
• VLBI network characteristics: heterogeneous, multipoint to point, asymmetric
• Challenges to be met:
- international connectivity at 1 Gbps
- latency
- cost (including last mile connections)
Radio telescope arrays
• networks of radio telescopes spread over 100s to 1000s of km provide zoom lenses for astronomers
• giving them the most detailed pictures of distant stars and galaxies available to mankind
• the technique is called Interferometry
e-MERLIN
• Dedicated optical fibres to connect telescopes to Jodrell Bank Observatory near Manchester
• Sustained data rates of 30 Gbps/telescope to new data processor
LOFAR Configuration
Log-spiral distribution, 300 km
Total data rate to centre ~ 20 terabit/sec
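A hedged back-of-envelope for an aggregate rate of this order; the antenna count, bandwidth and bit depth below are illustrative assumptions, not figures from the talk:

```python
# Aggregate raw data rate of a many-antenna array, Nyquist-sampled.
# All parameter values in the example call are illustrative guesses.
def raw_rate_tbps(n_antennas: int, bandwidth_hz: float,
                  n_pol: int, bits_per_sample: int) -> float:
    """Aggregate raw data rate in terabit/s for real-valued signals."""
    samples_per_sec = 2 * bandwidth_hz  # Nyquist rate for a real signal
    return n_antennas * n_pol * samples_per_sec * bits_per_sample / 1e12

# e.g. ~13,000 antennas, 32 MHz bandwidth, 2 polarisations, 12-bit samples
print(f"{raw_rate_tbps(13_000, 32e6, 2, 12):.0f} Tbit/s")
```

With these assumed parameters the result lands near the ~20 terabit/sec quoted above, which shows how quickly per-antenna rates multiply up.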
eEVN: European VLBI Network
[Figure: map of telescopes in Europe, China, the USA and South Africa feeding the data processing centre; 1-8 Gbps per link, 32-256 Gbps into the centre]
Network characteristics
• multi-point to point
• asymmetric
• heterogeneous
How do we currently do this?
VLBI configuration
[Diagram: telescopes in different countries, each with a recorder + atomic clock; the difference in time of arrival of the signal is what is measured]
Astronomical data at < 128 MB/s per telescope are recorded on "standard" tape at 1 Gbps and transported to a central location (300 terabytes/day).
Cross multiplication = signal detection: the data processor multiplies and adds at a rate of 10^14 ops/sec.
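The "cross multiplication = signal detection" step can be sketched in miniature: cross-correlating the streams from two telescopes recovers the difference in arrival time. The signal, noise level and lag below are synthetic, for illustration only; this is not the actual JIVE processor implementation.

```python
import numpy as np

# Two telescopes see the same source, offset by an arrival-time lag
# and buried in independent receiver noise (all values synthetic).
rng = np.random.default_rng(0)
source = rng.normal(size=4096)           # common astronomical signal
lag = 7                                  # true arrival-time offset (samples)
a = source + 0.5 * rng.normal(size=4096)                # telescope A stream
b = np.roll(source, lag) + 0.5 * rng.normal(size=4096)  # telescope B stream

# "Cross multiplication": multiply-and-add at every trial delay,
# then pick the delay where the correlation peaks.
xcorr = np.correlate(b, a, mode="full")
best = int(np.argmax(xcorr)) - (len(a) - 1)
print("recovered lag:", best)
```

The peak of the cross-correlation sits at the true offset; the real data processor does the same multiply-and-add, but over many telescope pairs at ~10^14 ops/sec.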
SKA: Square Kilometer Array
• global collaboration
• technical concepts under evaluation
• operational in 2015
• data rates up to terabit/sec
ESO Very Large Telescope
• 4 x 8-m optical telescopes on Paranal in Chile
• adaptive optics
• IR spatial interferometry
• all four elements in operation
• HQ in Germany
GAIA Satellite and System
A high precision census of more than a billion stars in our Galaxy
• Launch date: 2010
• Data rate: 1 Mbit/s sustained; 3 Mbit/s downlink (1 ground station)
• Design lifetime: 5 years
• ESA-only mission
GAIA Data Analysis: Concept and Requirements
• Capacity: ~100-500 terabytes (20 TB of raw data)
• Overall system: centralised global iterative reduction approach
• Accessibility: quasi-random, in both temporal and object domains
• Processing requirements: entire task is ~10^19 flop
• Database structure: e.g. Objectivity (cf. Sloan, CERN, etc.)
• Time critical: some results available in minutes (near-Earth asteroids, supernovae, etc.)
• Challenge: complexity of algorithms; inter-dependence of data; volume of data on-line
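The figures above can be cross-checked with simple arithmetic (illustrative only; the 1 Tflop/s sustained facility in the second estimate is an assumption, not something stated in the talk):

```python
# Sanity checks on the GAIA numbers: raw data volume and reduction time.
SECONDS_PER_YEAR = 3.156e7

# 1 Mbit/s sustained over the 5-year design lifetime -> terabytes
raw_tb = 1e6 * 5 * SECONDS_PER_YEAR / 8 / 1e12
print(f"raw data over 5 years ≈ {raw_tb:.0f} TB")  # consistent with ~20 TB

# ~1e19 flop on a hypothetical facility sustaining 1 Tflop/s
years = 1e19 / 1e12 / SECONDS_PER_YEAR
print(f"processing time ≈ {years:.2f} machine-years at 1 Tflop/s")
```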