Transcript PPT
3 Gpix Camera: Camera DAQ/Control System
SLAC Program Review
T. Schalk, CCS team
SLAC, June 7 2006
LSST Control Systems
Observatory Control System
Time/Date Distribution
Target/Mode Request/Ack. to/from scheduler
Primary Command Bus
Camera
Control System
Telescope
Control System
Aux. Equip. / Calibration
Control System
Data Mgmt.
Control System
Facility
Database
Status/Data Bus
Data transport
Scheduling activities within camera
Camera Assembly
Cold Plates
Utility Trunk
BEE Module
Cryostat outer cylinder
Focal Plane fast actuators
Raft Tower (Raft with Sensors + FEE)
L3 Lens in Cryostat front-end flange
Filter Changer rail paths
Shutter
L1/L2 Housing
Camera Base Ring
Filter Carousel main bearing
Filter in stored location
L1 Lens
L2 Lens
Camera Housing
Filter in light path
The LSST Focal Plane
3.5 deg FOV
Guider Sensors (yellow)
Wavefront Sensors (red)
3.2 Gpixels
Illumination
Limit
Science Data acquisition begins here
Read out from the CCDs (16 × 2 × 9 CCDs = 288 A-to-D converters)
Full CCD showing segmentation.
Design strategy for this system
• Control is distributed to the local subsystem level where possible, with time-critical loops closed at the local level.
• Subsystem control is embedded in each subsystem and communicates with the CCS via a master/slave protocol.
• One camera control system (CCS) module (CCM) is the master, responsible for scheduling tasks and for communication with the OCS.
• Coordination via messages between the CCS and its subsystems; no direct subsystem-to-subsystem communication.
• Publish/subscribe model.
• Separate command/control bus and data buses.
• Extensive logging capabilities.
• Assume the need to support engineering and maintenance modes.
• Accommodations made for test-stand(s) support.
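The master/slave, publish/subscribe coordination described above can be sketched in a few lines; a minimal sketch only, assuming hypothetical names (`Bus`, `cmd/SCU`, `status/SCU`), not the actual CCS message API:

```python
# Minimal publish/subscribe sketch of the CCS coordination model.
# All class, topic, and handler names are illustrative, not real CCS code.
from collections import defaultdict

class Bus:
    """A message bus: subscribers register per topic; no peer-to-peer links."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, message):
        for cb in self._subs[topic]:
            cb(message)

# Separate command and status buses, as in the design strategy.
command_bus = Bus()
status_bus = Bus()
log = []

def shutter_handler(msg):
    # The subsystem (slave) executes the command locally and reports status;
    # it never talks to another subsystem directly.
    log.append(("SCU", msg))
    status_bus.publish("status/SCU", {"state": "done", "cmd": msg})

command_bus.subscribe("cmd/SCU", shutter_handler)
status_bus.subscribe("status/SCU", lambda m: log.append(("CCM", m)))

# The CCM (master) schedules tasks by publishing commands.
command_bus.publish("cmd/SCU", "open_shutter")
```

Because all traffic goes through the buses, the master can log every command and status message centrally, matching the "extensive logging" requirement.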
Camera Control Architecture
Auxiliary systems
Camera Body
Camera buses
Cryostat
Thermal (T5U)
Thermal (T3U,T4U)
Science DAQ (SDS)
Science array (SAS)
Vacuum (VCS)
Shutter (SCU)
WF DAQ (WDS)
Wave Front (WFS)
Filters (FCS)
Guide Analysis (GAS)
Guide array (GSS)
Power/Signal (PSU)
Lens (L2U)
FP actuation (FPU)
Thermal (T1U,T2U)
Raft Alignment (RAS)
Command
Status
Camera Control (CCS)
Observatory buses
Control Room
Subsystems mapping to Managers
Every arrow has an interface at each end.
OCS
Command
Red means it is a CCS group responsibility.
Response
CCS
Subsystem
managers
SAS
SDS
FCS
Data
Similar for WFS/WDS and GSS/GAS.
DM
Similar for TSS, RAS, SCU, VCS, and L2U (see next slide).
Subsystems that produce data.
Subsystems that do not produce data (only status info).
CCD transport design assumptions
• The camera’s data are carried on 25 (21?) optical fibers (one per raft).
• Data are delivered by the camera to the SDS in 2 seconds.
• These fibers carry only data.
• Data flows only from camera to SDS on these fibers (half duplex).
• The fiber protocol is TBD.
• The data rate from a (fully populated) raft is 281.25 Mbytes/sec (2.25 Gbits/sec).
• Total aggregate (201 CCDs) output data rate is 6.432 Gbytes/sec.
• Data must be carried from the camera and delivered to its client (software) interface with a latency of not more than one (1) second.
• Interfaces define commodity networking as a MAC layer => trade study.
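The per-raft figures above are internally consistent; a quick sanity check of the quoted numbers (pure arithmetic, decimal units, no LSST-specific code):

```python
# Sanity-check the quoted transport numbers (decimal units: 1 Gbit = 1e9 bits).
raft_rate_MBps = 281.25           # Mbytes/sec per fully populated raft (from the slide)
raft_rate_Gbps = raft_rate_MBps * 8 / 1000
print(raft_rate_Gbps)             # 2.25 Gbit/s, matching the slide

# Total data moved off the camera in one 2-second readout at the
# quoted aggregate rate:
aggregate_GBps = 6.432            # Gbytes/sec for 201 CCDs (from the slide)
readout_s = 2.0
total_GB = aggregate_GBps * readout_s
print(total_GB)                   # 12.864 Gbytes per readout
```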
CDS Architecture
Camera specific => Standard I/O
First detailed designs are for DAQ
RNA Hardware layout
a pizza box
Simultaneous DMA to memory for speed
Infrastructure Layer
The data archive will grow at a rate of roughly 7 PB/yr.

Long-Haul Communications (Base to Archive and Archive to Data Centers): networks are 10 gigabit/second protected clear-channel fiber optics, with protocols optimized for bulk data transfer.

Archive/Data Access Centers: in the United States. Nightly Data Pipelines, Data Products, and the Science Data Archive are hosted here. Supercomputers capable of 60 teraflops provide analytical processing, re-processing, and community data access via Virtual Observatory interfaces to a 7 petabytes/year archive.

Base Facility: in Chile. Nightly Data Pipelines and Products are hosted here on 25-teraflops-class supercomputers to provide primary data reduction and transient alert generation in under 60 seconds.

Mountain Site: in Chile. Data acquisition from the Camera Subsystem and the Observatory Control System, with read-out in 2 seconds and data transfer to the Base at 10 gigabits/second.
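As a rough check on the mountain-to-base link, one can estimate the minimum transfer time for a single visit's raw data; the 12.864-GB figure is derived here from the 6.432 GB/s aggregate readout rate over the 2-second readout, and decimal units are assumed throughout:

```python
# How long does one visit's raw data occupy the 10 Gbit/s mountain-to-base link?
link_Gbps = 10.0                      # quoted link speed (decimal gigabits)
visit_GB = 6.432 * 2.0                # aggregate readout rate x 2-second readout
transfer_s = visit_GB * 8 / link_Gbps
print(round(transfer_s, 2))           # ~10.29 s, well inside the 60 s alert budget
```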
Application Layer

2.5.1.1 Nightly Pipelines and Data Products
Data Acquisition
Infrastructure
Common Pipeline Components
Image Processing Pipeline
Detection Pipeline
Association Pipeline
Moving Object Pipeline
Alert Processing
Classification Pipeline
Nightly Pipelines are executed and Alert Data Products are produced within 60 seconds of the second exposure of each visit.

2.5.1.2 Science Data Archive
Eng/Fac Data Archive
Image Archive
Source Catalog
Object Catalog
Deep Object Catalog
Calibration Pipeline
Deep Detect Pipeline
Middleware
VO Compliant Interface
End User Tools
These pipelines are executed on a slower cadence, and the corresponding data products are those that require extensive computation and many observations for their production.
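The nightly chain (image processing → detection → association → alert) can be sketched as a simple sequential pipeline; the stage functions and the data passed between them below are purely illustrative stand-ins, not real DM code:

```python
# Illustrative sequential composition of the nightly pipelines.
def image_processing(raw):
    return {"calibrated": raw}

def detection(img):
    return {"sources": [img["calibrated"]]}

def association(det):
    return {"objects": det["sources"]}

def alert_processing(assoc):
    # Alerts must go out within 60 s of the second exposure of each visit.
    return [f"alert:{obj}" for obj in assoc["objects"]]

NIGHTLY = [image_processing, detection, association, alert_processing]

def run_nightly(raw_exposure):
    data = raw_exposure
    for stage in NIGHTLY:
        data = stage(data)
    return data

print(run_nightly("visit-001"))   # ['alert:visit-001']
```

Composing the stages as a list makes the 60-second budget a property of the whole chain, which is why the slower-cadence pipelines (calibration, deep detection) are kept out of it.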
End User
Tools
15
SLAC
June 7 2006
Data Management Organization
• Team is headquartered at LSST Corporation, Tucson
– Project Manager, Project Scientist, Software Engineers
• R&D Team is creating the MREFC, DOE proposals
Application Layer
• Caltech IPAC - Application architecture
• GMU, LLNL - Community Science scenarios
• NOAO - Lensed Supernovae, Non-moving transients, Photometry
• Princeton U - Image Processing, Galaxy Photometry
• U Arizona - Image Processing, Moving Objects, Association, Photometry
• UC Davis - Deep Detection, Shape Parameters
• U Pittsburgh/CMU - Photo Z, Moving Objects
• U Washington - Image Processing, Detection, Classification
• USNO - Astrometry

Middleware Layer
• SLAC, JHU - Database Schema/Indexing, Provenance, Performance/Scalability (ingest/query)
• LLNL, UCB - Database/Pipeline integration, Pipeline Construction, Alerting
• NCSA - Archive Data Access, Pipeline Control & Management, Security
• NOAO - Community Data Access/Virtual Observatory
• SDSC - Data Product Preservation

Infrastructure Layer
• SLAC - Data Acquisition, Mountain/Base Communications
• LLNL - Base Pipeline Server, Data Base Server
• NCSA, BNL - Archive Center/Data Center Pipeline Servers, File Servers, Data Access Servers, Storage, Communications
• NOAO - Base to Archive Communications
• Construction Team will be a Tucson-based management/functional team, with a small number of single-location outsourced implementation teams (e.g. NCSA, IPAC)
ACRONYMS !!
• CCS - camera control system
• CCM - camera control master/module
• OCS - observatory control system
• TCS - telescope control system
• DM - LSST data management system
• SAS - science array system
• SDS - science array DAQ system
• RNA - raft network adapter
• SCU - shutter control unit
• WFS - wave front system
• WDS - wave front data system
• GSS - guide sensor system
• GAS - guide sensor acquisition system
• DSP - digital signal processor
• FPU - focal plane actuation
• TSS - thermal control system
• RAS - raft alignment system
• FCS - filter control system
• VCS - vacuum control system
• L2U - L2 actuation system
• UML - Unified Modeling Language
• MAC layer - medium access control layer, which provides a variety of functions that support the operation of local area networking
• FPGA - field-programmable gate array
• DMA - direct memory access
• MGT - multi-gigabit transceivers
• IBA - InfiniBand Architecture
• SDR - single data rate