Controls & Monitoring Status Update
J. Leaver
05/11/2009
Infrastructure
Infrastructure Issues
• Control PCs & servers
• Application management
– Client-side application launcher
– Soft IOC run control
• Configuration Database (CDB)
– EPICS Interface
– Ensuring the validity of recorded run parameters
• Protocols for updating & maintaining software on control
PCs (see PH’s talk)
• Alarm Handler (see PH’s talk)
• Channel Archiver (see PH’s talk)
• Remote Access (see PH’s talk)
Control PCs & Servers: Current Status
[Diagram: MICE control network – miceiocpc1, target1ctl, miceioc1, miceioc2, miceioc4, miceioc5, miceecserv, miceserv1, miceopipc1, micecon1 & target2]
Control PCs & Servers: Current Status
[Diagram: control PC & server network]
• General purpose server PC (miceiocpc1)
– Currently runs EPICS servers for:
• FNAL BPMs
• DATE Status
• Config. DB User Entry Data
• CKOV Temp. & Humidity Monitor
• DSA Neutron Monitor
– Will also run EPICS servers for:
• Network Status
• CKOV + TOF CAEN HV Supplies
Control PCs & Servers: Current Status
[Diagram: control PC & server network]
• Target DAQ & Control PC (target1ctl)
– Currently runs:
• Target / Beam Loss Monitor DAQ
– Will run EPICS servers for:
• Beam Loss Monitor
• Target Controller
Control PCs & Servers: Current Status
[Diagram: control PC & server network]
• Target Drive IOC (vxWorks)
– EPICS server for Target PSU & extraction motor
Control PCs & Servers: Current Status
[Diagram: control PC & server network]
• Beamline Magnets IOC (vxWorks)
– EPICS server for Q1-9, D1-2
Control PCs & Servers: Current Status
[Diagram: control PC & server network]
• Decay Solenoid IOC (vxWorks)
Control PCs & Servers: Current Status
[Diagram: control PC & server network]
• Linde Refrigerator IOC (PC)
Control PCs & Servers: Current Status
[Diagram: control PC & server network]
• ‘EPICS Client’ server PC (miceecserv)
– Runs all client-side control & monitoring applications
– Runs infrastructure services:
• Alarm Handler
• Channel Archiver
• Large wall mount display shows:
– Alarm Handler panel
– Log message viewer
• Display may also be used to show any (non-interactive) panel containing information that must be monitored for the duration of a specific run
Control PCs & Servers: Current Status
[Diagram: control PC & server network]
• Gateway / Archiver web server PC (miceserv1)
– Runs Channel Access Gateway, providing read-only access to PVs between MICE Network & heplnw17
– Runs web server enabling read-only access to Channel Archiver data
– Currently running non-standard OS for control PCs
• Will reformat after current November/December run period
– See PH’s talk
Control PCs & Servers: Current Status
[Diagram: control PC & server network]
• General purpose Operator Interface PC (miceopipc1)
– Primary access point for users to interact with control & monitoring panels
– Essentially a ‘dumb’ X server – runs all applications via SSH from miceecserv
Control PCs & Servers: Current Status
[Diagram: control PC & server network]
• Additional General purpose Operator Interface PCs
– Currently running non-standard OS for control PCs
– Useable, but not optimally configured…
– Cannot disturb at present – will reformat after current November/December run period
• See PH’s talk
– Shall be renamed miceopipc2 & miceopipc3
Application Launcher
• New application launcher replaces DL TCL script
– XML configuration file, easy to add items (see the sketch below)
– Unlimited subcategory levels
• Provides real-time display of application status
• Configurable response to existing application instances
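A minimal sketch of the kind of XML configuration such a launcher could read, showing nested subcategories and a per-application launch policy. Element and attribute names, and the parsing code, are illustrative assumptions rather than the actual MICE launcher schema:

```python
# Hypothetical launcher configuration; element/attribute names are assumptions.
import xml.etree.ElementTree as ET

CONFIG = """
<launcher>
  <category name="Target">
    <category name="Controller">
      <!-- policy: ignore | inhibit | kill -->
      <app name="Target Control Panel" command="target_control_panel" policy="inhibit"/>
    </category>
  </category>
  <category name="Monitoring">
    <app name="Beam Loss Viewer" command="blm_viewer" policy="ignore"/>
  </category>
</launcher>
"""

def walk(element, path=()):
    """Print every launchable application with its (arbitrarily deep) category path."""
    for cat in element.findall("category"):
        walk(cat, path + (cat.get("name"),))
    for app in element.findall("app"):
        print(" / ".join(path) or "(top level)", "->", app.get("name"),
              "| command:", app.get("command"), "| policy:", app.get("policy"))

walk(ET.fromstring(CONFIG))
```

Adding a new item is then just a matter of adding an `<app>` element under the appropriate category.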
Application Launcher: App Status
• Application was previously launched by an external process, but is no longer running (return value unknown)
• Application was killed by a signal & is no longer running
• Application is running, but was not executed by this launcher
• Application is running
• Application quit with an error code & is no longer running
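For reference, the five states above could be captured in a simple client-side enumeration; the names below are assumptions, not the launcher's actual identifiers:

```python
# Assumed naming for the launcher's application-status states.
from enum import Enum, auto

class AppStatus(Enum):
    EXITED_EXTERNAL = auto()    # launched by an external process, no longer running (return value unknown)
    KILLED_BY_SIGNAL = auto()   # killed by a signal, no longer running
    RUNNING_EXTERNAL = auto()   # running, but not executed by this launcher
    RUNNING = auto()            # running (launched by this launcher)
    EXITED_WITH_ERROR = auto()  # quit with an error code, no longer running
```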
Application Launcher: External App Response
• Multiple application launchers will be operated simultaneously
– On miceopipc1-3 (via SSH from miceecserv) & miceecserv itself
• Need to ensure that shifters using different launchers do not ‘conflict’
• If operator attempts to execute an application that is already running, launcher has a configurable response (see the sketch below):
– Ignore: launch another instance
– Inhibit: prevent another instance from running
– Kill: close existing instance & run a new one (e.g. could be used for a ‘master’ override control panel)
• Typical configuration:
– Only one instance of each ‘control’ application may run (cannot have multiple users modifying the same parameter!)
– Unlimited numbers of monitoring panels may be opened
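A minimal sketch of the configurable-response logic, assuming a simple in-process registry of launched applications (class and method names are invented for illustration, not the real launcher code):

```python
# Illustrative ignore / inhibit / kill policies; not the real launcher implementation.
import subprocess

class Launcher:
    def __init__(self):
        self.instances = {}   # application name -> Popen handle of the instance we started

    def launch(self, name, command, policy):
        existing = self.instances.get(name)
        running = existing is not None and existing.poll() is None

        if running and policy == "inhibit":
            print(f"{name} is already running - launch inhibited")
            return existing
        if running and policy == "kill":
            existing.terminate()   # close the existing instance...
            existing.wait()        # ...then fall through and start a new one

        proc = subprocess.Popen(command)   # "ignore" simply starts another instance
        self.instances[name] = proc
        return proc

# e.g. Launcher().launch("Target Control Panel", ["target_control_panel"], policy="inhibit")
```

A typical configuration would mark each ‘control’ panel as inhibit (or kill, for a master override panel) and each monitoring panel as ignore.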
Soft IOC Management
• Application launcher primarily concerned with managing client-side control
& monitoring panels running on miceecserv
• Also need to implement run control for corresponding EPICS servers
– ‘Hard IOCs’ running on vxWorks (i.e. servers provided by DL) are always ‘on’
→ require no routine external intervention
– ‘Soft IOCs’ running on control PCs (i.e. servers produced within the
Collaboration) are executed like any other application → require user control
• Why can’t soft IOCs just run at system start-up, like any other service?
– Assumes that servers run perpetually, unattended – not true!
– Sometimes need to modify configuration files, requiring server restart
– Servers sometimes crash due to hardware problems, requiring restart
– May need to turn off or reconfigure hardware – cannot do this while a soft IOC
is running
– Shifters should not have to worry about Linux service management…
Soft IOC Management
[Diagram: each IOC PC runs a CmdExServer at start-up which manages its local soft IOCs – miceiocpc1: FNAL BPM, DATE Status, Config. DB User Entry Data & Network Status Servers, etc.; target1ctl: Beam Loss Monitor & Target Controller Servers; micetk1pc: AFEIIt Server]
• CmdExServer provides similar functionality to normal application launcher, but with an EPICS interface
• Each IOC PC runs an instance of the CmdExServer at start-up
• CmdExServer manages local soft IOCs
• Client-side ‘remote application launcher’ communicates with all CmdExServers & allows user to start/stop IOCs (see the sketch below)
NB:
- Current remote launcher configuration only includes a subset of the servers assigned to miceiocpc1
- Others will be added as they become available
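The remote launcher's interaction with a CmdExServer can be pictured as simple EPICS writes and reads. The sketch below assumes the pyepics bindings and entirely hypothetical PV names (the real CmdExServer record names are not given in this talk):

```python
# Hedged sketch: start/stop a soft IOC through a CmdExServer's (hypothetical) command PVs.
from epics import PV

class RemoteSoftIOC:
    """Client-side handle for one soft IOC managed by a CmdExServer."""
    def __init__(self, prefix):
        # e.g. prefix = "MICEIOCPC1:CMDEX:FNALBPM" (invented naming scheme)
        self.start_cmd = PV(prefix + ":START")
        self.stop_cmd = PV(prefix + ":STOP")
        self.status = PV(prefix + ":STATUS")

    def start(self):
        self.start_cmd.put(1)

    def stop(self):
        self.stop_cmd.put(1)

    def running(self):
        return self.status.get() == 1

# Start the FNAL BPM server on miceiocpc1 if it is not already running.
bpm = RemoteSoftIOC("MICEIOCPC1:CMDEX:FNALBPM")
if not bpm.running():
    bpm.start()
```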
Configuration Database: EPICS Interface
• Custom EPICS PV backup & restore client is functionally complete
– Enables manual backup & restore of set point values
– Automatically backs up set parameters when DATE signals end of run
• Currently stores values in local XML file archive (see the sketch below)
– Automatic backup files transferred to CDB via SSH/SCP to heplnw17
– Temporary solution → will be replaced with direct SOAP XML transactions once RAL networking issues resolved
• Need publicly accessible web server on heplnw17
– Restoration of parameters from CDB will also be implemented once SOAP transfers are possible
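A minimal sketch of the current backup path: read the set-point PVs, write them to a local XML file, and copy the file to heplnw17 over SCP. The PV names, XML layout and destination directory are assumptions; pyepics and a system scp client are assumed to be available:

```python
# Illustrative set-point backup; PV names, XML layout and remote path are assumptions.
import subprocess
import xml.etree.ElementTree as ET
from datetime import datetime
from epics import caget

SETPOINT_PVS = ["Q1:CURRENT:SET", "Q2:CURRENT:SET", "D1:CURRENT:SET"]   # hypothetical names

def backup(run_number):
    root = ET.Element("setpoints", run=str(run_number),
                      timestamp=datetime.utcnow().isoformat())
    for name in SETPOINT_PVS:
        ET.SubElement(root, "pv", name=name, value=str(caget(name)))
    filename = f"run{run_number}_setpoints.xml"
    ET.ElementTree(root).write(filename)
    # Temporary transfer mechanism until direct SOAP transactions are possible.
    subprocess.run(["scp", filename, "heplnw17:/path/to/cdb/incoming/"], check=True)

backup(1234)
```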
Configuration Database: User Entry
[Diagram: User Entry Client and Backup / Restore Client both talk to the CDB Data Server (miceiocpc1)]
• Not all parameters required for a CDB ‘run’ entry are available through normal EPICS channels
– i.e. Relevant IOCs & integration with the DAQ are not yet complete
– Currently only beamline magnet currents can be backed up from ‘live’ servers
• (Quasi) temporary solution (see the sketch below):
– Generic EPICS data server hosts PVs for all values missing from existing IOCs, so they can be read by backup/restore client
– User entry client allows shifter to enter required parameters before initiating a run
– As future work progresses, unnecessary user entry items will be removed
• However, shall always require some degree of manual data entry
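The user-entry path amounts to writing the shifter's values into PVs hosted by the generic CDB data server, so the standard backup/restore client can pick them up. The sketch below uses invented PV names and a trivial console prompt purely for illustration:

```python
# Hypothetical user-entry client: shifter-supplied values are written to PVs on the
# generic CDB data server (PV names invented for the sketch). Requires pyepics.
from epics import caput

USER_ENTRY_PVS = {
    "CDB:USER:DIFFUSER:THICKNESS": "Diffuser thickness",
    "CDB:USER:TOF1:HV": "TOF1 HV set point",
}

def submit_user_entries():
    for pv, label in USER_ENTRY_PVS.items():
        caput(pv, float(input(f"{label}: ")))

submit_user_entries()
```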
Ensuring the Validity of CDB Entries
• Vital that set point values remain fixed during each standard run
– If set point value record in CDB does not represent physical state of system for entire run, data are invalid
• Implement following protocol to ensure invalid runs are correctly identified (monitor behaviour sketched below):
– CDB data server hosts run status PV
– User entry client automatically sets run status to true when user submits current run parameters
• At this stage, users should not modify set point values again until run is complete
– Dedicated monitor IOC checks all set point PVs while DATE is in ‘data taking’ state → sets run status to false if any value changes (to do)
– Alarm Handler monitors run status → immediately warns that run is invalid if any user modifies a set point value (to do)
– run status incorporated in CDB run parameters record
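A hedged sketch of the monitor behaviour: watch every set-point PV and clear the run status PV if any of them changes while DATE is taking data. All PV names here are placeholders, and the real check would live inside the dedicated monitor IOC rather than a standalone script:

```python
# Illustrative run-validity monitor; PV names are hypothetical. Requires pyepics.
import time
from epics import PV

SETPOINT_PVS = ["Q1:CURRENT:SET", "Q2:CURRENT:SET", "D1:CURRENT:SET"]
run_status = PV("CDB:RUN:STATUS")   # 1 = run valid, 0 = run invalid
date_state = PV("DATE:STATUS")      # assumed to read 1 while DATE is in the data-taking state

def on_setpoint_change(pvname=None, value=None, **kwargs):
    # Any set-point change during data taking invalidates the current run.
    if date_state.get() == 1:
        run_status.put(0)
        print(f"Run marked invalid: {pvname} changed to {value} during data taking")

monitors = [PV(name, callback=on_setpoint_change) for name in SETPOINT_PVS]

while True:          # keep the process alive so the change callbacks continue to fire
    time.sleep(1)
```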
Control & Monitoring Systems
C&M Systems Overview
Target: Drive
System
Owner
Paul Smith; Paul Hodgeson; Chris Booth (UOS)
EPICS Developer
Adrian Oates; Graham Cox (DL)
Target: Controller
Paul Smith (UOS); James Leaver (IC)
James Leaver (IC)
Target: Beam Loss
Beamline Magnets
Pion Decay Solenoid
FNAL Beam Position Monitors
TOF
Paul Smith; Paul Hodgeson (UOS); James Leaver (IC)
Martin Hughes (RAL)
Mike Courthold (RAL)
Alan Bross (FNAL)
Maurizio Bonesini (INFN)
Pierrick Hanlet (IIT)
Peter Owens (DL)
Adrian Oates; Graham Cox (DL)
James Leaver (IC)
Pierrick Hanlet (IIT)
CKOV
Lucien Cremaldi; David Sanders (OLEMISS)
Pierrick Hanlet (IIT)
Tracker: Diffuser
Tracker: Spectrometer Solenoids
Wing Lau (OU)
Steve Virostek (LBNL)
Pierrick Hanlet (IIT)
Adrian Oates; Graham Cox (DL)
Tracker: B-Field Probes
Tracker: AFEIIts
Frank Filthaut (RUN)
Alan Bross (FNAL)
Tracker: AFEIIt Infrastructure
Alan Bross (FNAL)
Frank Filthaut (RUN)
James Leaver (IC); Jean-Sebastien
Graulich (UNIGE)
Adrian Oates; Graham Cox (DL)
Calorimeter: KL Calorimeter
Calorimeter: Electron Muon Ranger
Unknown
Unknown
H2 Absorbers: Focus Coils
Virgilio Chimenti (INFN)
A. Blondel; J-S Graulich; V. Verguilov; F. Masciocchi; L.
Nicola; R. Bloch; P. Béné; F. Cadoux (UNIGE)
Wing Lau (OU)
H2 Absorbers: Hydrogen System
Yury Ivanyushenkov; Tom Bradshaw (RAL)
Adrian Oates; Graham Cox (DL)
RF Cavities: Coupling Coils
Derun Li; Steve Virostek (LBNL)
Adrian Oates; Graham Cox (DL)
RF Cavities: RF System
Andy Moss (ASTeC)
DATE Status
Jean-Sebastien Graulich (UNIGE)
Network Status
DSA Neutron Monitor
Anyone with a PC/IOC in the MLCR/Hall
Andy Nichols; Tim Hayler (RAL)
Adrian Oates; Graham Cox (DL)
Dimity Tettyleman (LBNL); Adrian
Oates; Graham Cox (DL)
James Leaver (IC); Jean-Sebastien
Graulich (UNIGE)
James Leaver (IC)
Pierrick Hanlet (IIT)
EPICS Status
Operational. Requires additional safety interlocks and
read out of chiller temperature and flow rate.
Software framework and initial versions of EPICS
server/client applications are complete. Low level
hardware drivers are to be implemented once firmware is
in an operational state.
Functionally complete.
Functionally complete.
Functionally complete.
Complete.
EPICS server/client applications for HV power supply
control are functional, but development of hardware
drivers has not yet commenced.
Monitoring of temperature and humidity is functionally
complete. EPICS server/client applications for HV power
supply control are functional, but development of
hardware drivers has not yet commenced.
Not yet commenced.
Included as part of a solenoid package to be provided by
Daresbury Lab. Most of the design drawings are
complete, and ~£15.5K of capital and 0.4 man years of
effort are required to finish the project. Funding has been
confirmed by the US (2/3) and UK (1/3).
Functionally complete.
Complete, but integration with DATE DAQ to be finalised.
Included as part of a solenoid package to be provided by
Daresbury Lab. See Tracker: Spectrometer Solenoids
entry.
Unallocated.
Unallocated.
Included as part of a solenoid package to be provided by
Daresbury Lab. See Tracker: Spectrometer Solenoids
entry.
DL have acquired necessary safety training and have
started evaluating PLC systems. Early stages of
development.
Included as part of a solenoid package to be provided by
Daresbury Lab. See Tracker: Spectrometer Solenoids
entry.
Initial discussions have taken place.
Complete.
Complete.
Functionally complete.
C&M Systems Developed by Local MICE Community
Target: Controller
• Target Controller Stage 1 upgrade underway
– Hardware essentially complete
– Currently working on Controller firmware (P. Smith)
• Software nearing completion
– Hardware driver framework in place
– Have implemented all EPICS server / client functionality
• Tested using ‘virtual’ Target Controller device
– Remaining task: write low-level hardware driver plug-in once firmware is complete
Stage 1 upgrade - Control functionality includes (client calls sketched below):
- Set delays
- Set / monitor Target depth
- Park / hold, start / stop actuation
- Monitor hardware status
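For illustration, the Stage 1 control functions listed above map naturally onto a handful of client-side channel accesses; the PV names below are invented, not the real Target Controller database:

```python
# Hypothetical client calls for the Stage 1 Target Controller functions (pyepics).
from epics import caput, caget

caput("TGT:CTRL:DELAY:START", 3.2)                     # set delays
caput("TGT:CTRL:DEPTH:SET", 45.0)                      # set Target depth
print("Depth readback:", caget("TGT:CTRL:DEPTH:RBV"))  # monitor Target depth
caput("TGT:CTRL:MODE", "HOLD")                         # park / hold
caput("TGT:CTRL:ACTUATE", 1)                           # start actuation (0 to stop)
print("Hardware status:", caget("TGT:CTRL:STATUS"))    # monitor hardware status
```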
Target: Beam Loss
• Standalone DAQ system upgrades:
– Final algorithm selected for ‘absolute’ beam loss calculation
• Thanks to AD for implementation
– Standalone event viewer can now follow real-time output of DAQ
• Target Beam Loss IOC will read local data archive written by DAQ & serve values as PVs
– Enables integration with Alarm Handler & readout of actuation numbers for CDB run data entries
– IOC will use same analysis & file access code as standalone DAQ, via C wrapper functions (see the sketch below)
– See PH’s talk
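One way to picture the ‘C wrapper’ route is loading the DAQ's analysis / file-access code as a shared library from the IOC side. The library name and function signatures below are invented for the sketch:

```python
# Hypothetical reuse of the standalone DAQ's C code via ctypes; names are assumptions.
import ctypes

blm = ctypes.CDLL("libmice_blm.so")                     # assumed shared library name
blm.blm_open_archive.argtypes = [ctypes.c_char_p]
blm.blm_open_archive.restype = ctypes.c_int
blm.blm_latest_beam_loss.restype = ctypes.c_double

if blm.blm_open_archive(b"/data/target/blm_archive") == 0:
    loss = blm.blm_latest_beam_loss()
    # ...the IOC would then publish this value (and actuation counts, etc.) as EPICS PVs
    print("Latest beam loss:", loss)
```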
DATE Status
[Diagram: DATE Client → EPICS Data Server (single ‘status’ PV)]
• Need mechanism for reporting current DAQ state via EPICS
– Required for user feedback, alarm handling & triggering automatic CDB run set point value backups
• Simple (‘dumb’) data server hosts DATE status PV
• Client application reads DATE status from DIM server, forwards value to EPICS server (see the sketch below)
• Server & display client complete – DATE integration should be complete before end of Collaboration Meeting (JSG…?)
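A minimal sketch of the forwarding client: poll the DAQ state and push it into the single DATE status PV. The PV name is invented, and `get_date_state()` is a placeholder for the query against DATE's DIM information service:

```python
# Illustrative DATE-status forwarder; PV name and the DIM query are placeholders.
import time
from epics import caput

def get_date_state():
    """Placeholder: the real client queries the DATE run state via DIM."""
    return 0

while True:
    caput("DATE:STATUS", get_date_state())
    time.sleep(1)
```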
Network Status
• Need to verify that all machines on Control & DAQ networks are functional throughout MICE operation
• Two types of machine:
– Generic PC (Linux, Windows)
– Hard IOC (vxWorks)
• EPICS Network Status server contains one status PV for each valid MICE IP address
• Read status: PC
– SSH into PC (using key files, for security)
• Verifies network connectivity & PC identity
– If successful, check list of currently running processes for required services
• Read status: Hard IOC
– Check that standard internal status PV is accessible, with valid contents
• e.g. ‘TIME’ PV, served by all MICE ‘hard’ IOCs
(Both checks are sketched below.)
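The two checks are straightforward to sketch; host names, the required-service lists, the SSH key path and the ‘TIME’ PV prefix are all assumptions for illustration:

```python
# Illustrative network-status checks; names and paths are assumptions. Requires pyepics.
import os
import subprocess
from epics import caget

def pc_ok(host, required_services, keyfile=os.path.expanduser("~/.ssh/mice_status_key")):
    """SSH into a PC (key-based) and confirm that each required process is running."""
    for service in required_services:
        result = subprocess.run(["ssh", "-i", keyfile, host, "pgrep", "-x", service],
                                capture_output=True)
        if result.returncode != 0:      # SSH failure, wrong host or process not found
            return False
    return True

def hard_ioc_ok(ioc_prefix):
    """Check that a hard IOC's standard internal status PV is readable."""
    return caget(ioc_prefix + ":TIME", timeout=2.0) is not None

print("miceecserv:", pc_ok("miceecserv", ["alh", "ArchiveEngine"]))
print("miceioc1:", hard_ioc_ok("MICEIOC1"))
```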
Network Status
• Server & client are functionally complete
– Client displays status of all PCs & hard IOCs, scans at user-specified period (with ‘check now’ override)
• Currently not configured for use on the MICE network…
– Necessary to compile list of required services for each MICE PC
– Network specification not yet finalised
• Need to confirm how status server will connect to PCs on both Control & DAQ networks
Other ‘Local’ C&M Systems
• TOF
• CKOV
See PH’s talk
• Diffuser
• DSA Neutron Monitor
• KL Calorimeter
Remain unallocated…
• Electron Muon Ranger
Daresbury C&M Status Update
A. Oates
Recent DL Controls Work
• New ‘IOC controls server’ installed
– Provides boot/configuration parameters for DL hard IOCs (vxWorks):
miceioc1, miceioc2, miceioc4
– Enabled Online Group to take ownership of old ‘IOC controls server’
(miceserv1) & associated Channel Archiver duties
• Provided the Online Group with general EPICS assistance
– EPICS architecture guidance
– Channel Archiver web server installation
• Changed both Target Drive systems to incorporate new split rail power supplies
• Fault diagnosis and correction on Target 2 Drive controls in R78
• Prepared quotation for Cooling Channel Control System
• Completed requested changes to Magnet PSU databases
Schedule & Final Notes
Schedule
[Schedule charts]
Final Notes
• Good progress on all fronts
• Groundwork required to unify MICE control systems &
create a user-friendly interface is well underway
– Alarm Handler, Channel Archiver operational
– Unified application launchers complete (bad old days of logging into
individual PCs & typing commands are over!)
– CDB interface functional
• Lots of work still to do!
– Controls effort is ‘significantly understaffed’ (DAQ & Controls
Review, June ‘09)
– We are always operating at full capacity
– ‘Controls & Monitoring’ is such a broad subject that many
unexpected additional requirements crop up all the time…
– Please be patient!