CERN Collaboration Tools
Thomas Baron
User Communities and Needs
Audioconference
Videoconference
Webcast
AV Recording
Chat
Indico
General Public Information
~2300 staff members in 10 departments
>10000 users
2 sites CH/F
257 physical meeting rooms!
Typically ~70-100 (official) meetings per
day in a collaboration
“Collaboration is working together to
achieve a goal”
Wikipedia
Sharing Information
Debating
Analysing
Deciding
Voting
…
Physical meeting
E-communication
Remote meetings
…
Building (accelerators, experiments…)
Operating
Discovering (new physics etc.)
…
Remote collaboration
Textual e-communication
○ Synchronous: chat etc.
○ Asynchronous: email, blogs, web etc.
○ Medium: IP
Audio
○ Telephone: point-to-point or multipoint (audioconferencing); medium: copper wire
○ VoIP (medium: IP)
Audiovisual
○ Synchronous: videoconference, webcast
○ Asynchronous: video recordings
○ Medium: IP
Audioconference
Phone conferencing
3-way directly from the traditional desktop
phone
Via the PBX (max. 29 participants) through
the PSTN
Via the Alcatel Audioconference system,
the CERN standard (for the organisation); needs booking
Audioconference extended
Web conferencing: sharing data
Chat
WebEx
Videoconference
A natural evolution
Visual as opposed to strictly audio
Better presence feeling, clearer understanding (body
language…)
Usually on public IP networks
Efficient if:
Good quality (resolution, framerate, etc.)
Low latency (< 250 ms; see the probe sketch below); the
impact depends on the level of interactivity
Robust under any network condition (jitter, packet loss,
network latency)
Excellent audio required (echo cancellation, lip
sync…)
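The 250 ms budget is easy to sanity-check. A minimal probe sketch (not from the talk), assuming a TCP connect round trip is a rough proxy for network RTT; the host is a placeholder:

```python
# Rough check of the network latency budget for interactive VC.
# Assumption: a TCP connect round trip approximates network RTT;
# the 250 ms one-way budget comes from the slide above.
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443, samples: int = 5) -> float:
    """Median TCP connect time to `host`, in milliseconds."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass
        times.append((time.perf_counter() - start) * 1000)
    return sorted(times)[len(times) // 2]

if __name__ == "__main__":
    rtt = tcp_rtt_ms("example.org")      # placeholder remote endpoint
    one_way = rtt / 2                    # crude one-way estimate
    verdict = "OK" if one_way < 250 else "too high"
    print(f"RTT {rtt:.0f} ms, ~{one_way:.0f} ms one way -> {verdict} for interactive VC")
```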
Videoconference
Cost saver (travel, hotel, food…)
Time saver (travel…)
Fast and efficient
Green! (reduces CO2 emissions)
Example: one person from Chicago
needing to travel 5 times per year to
Geneva:
10000 kg CO2
93 hrs productive time lost
2500 EUR
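Dividing the slide's yearly totals by the five trips gives the per-trip savings; a back-of-the-envelope sketch:

```python
# Back-of-the-envelope split of the slide's yearly totals into per-trip figures.
TRIPS_PER_YEAR = 5
totals = {
    "CO2 (kg)": 10000,            # figures from the slide above
    "Productive hours lost": 93,
    "Cost (EUR)": 2500,
}

for label, total in totals.items():
    print(f"{label}: {total / TRIPS_PER_YEAR:.0f} per round trip")
# -> roughly 2000 kg CO2, ~19 h and 500 EUR saved per avoided trip
```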
Videoconference: Rooms
For about 10 years
Currently 60 equipped rooms… and
growing!
Industry-standard equipment: H.323/SIP
compatible
Videoconference: Rooms
Various sizes and configurations
From 6 people (small meeting rooms) to 100+ people
(amphitheatres)
Homogeneity of interfaces
Tandberg/Cisco equipment
System reliability
Remote Administration/Support
Centralized H.323 endpoint manager: TMS
(Tandberg Management Suite)
○ Usage statistics
○ Diagnostics and Alarms
○ Software update
Videoconference: H323
H.323-based Videoconference
Industry Standard
Point to point (direct connection from one endpoint to the other)
MCUs (multipoint control units): expensive hardware units to
which H.323 endpoints connect for meetings with more
than 2 sites
○ Small MCUs embedded in VC codecs (<5 sites)
○ MCU services:
ESnet in the USA
RMS in France
DFN in Germany
CERN recently purchased a Tandberg MSE8000 MCU
Additional Features
○ Data sharing (H.239)
Scale:
○ Limited by the number of ports (usually 10 to 20 sites
per meeting)
Videoconference: H323
Routing: point to point
Videoconference: H323
Routing: embedded MCU
Limitations:
- Scale
- Quality
- Compatibility
- Latency
Videoconference: H323
Routing: MCU
Limitations:
- Scale (see the sketch below)
- Latency: transcoding, multiplexing, rate matching
- Topology: single point of failure; cascading is difficult
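The scale limitation can be made concrete: a full point-to-point mesh needs one link per endpoint pair, while a star around an MCU needs one port per site. A minimal sketch of that arithmetic (site counts are illustrative):

```python
# Stream counts for N sites: full point-to-point mesh vs. a star around one MCU.
def mesh_links(n: int) -> int:
    return n * (n - 1) // 2   # one bidirectional link per endpoint pair

def mcu_ports(n: int) -> int:
    return n                  # each endpoint holds a single link (port) to the MCU

for n in (3, 10, 20):         # 10-20 sites is the usual meeting size quoted above
    print(f"{n:2d} sites: mesh {mesh_links(n):3d} links, MCU {mcu_ports(n):2d} ports")
```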
Videoconference: H323
“Telepresence”
Videoconference: H323
Pros:
High quality
High reliability
High compatibility
Cons:
Latency
Topology
Room-based mostly!
Videoconference: EVO
Videoconference: EVO
Desktop videoconference
Java-based: multiplatform (Win, Mac, Linux)
Compatible with the H323 world (gateways)
Created by Caltech in the 90s based on CERN
needs
Now 90% of all LHC VC sessions
~ 40 meetings per day
Scale
Hundreds of people connected (peak 800) overall
Peak of >350 people connected to the same
meeting!
Videoconference: EVO
Integrated features:
Chat
Presence
Recording
http://evo.caltech.edu
Videoconference: EVO
[Screenshot of the EVO client: shared files, desktop sharing, recording, participant/buddy list, chat, whiteboard, video window]
Videoconference: EVO
Videoconference: EVO
Flexible and extensible routing/topology
Inherited from the research computing grids
Transparent server failure recovery (see the reconnection sketch below)
[Diagram: MonALISA-monitored server network connecting EVO clients]
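"Transparent server failure recovery" usually means the client finds another server by itself. A purely hypothetical sketch of that client-side pattern (EVO's actual protocol is not described here); hostnames are placeholders:

```python
# Hypothetical client-side failover: walk a list of redundant servers,
# backing off between full passes, until one accepts the connection.
import socket
import time

REFLECTORS = [("evo-a.example.org", 9000),   # placeholder addresses
              ("evo-b.example.org", 9000),
              ("evo-c.example.org", 9000)]

def connect_with_failover(servers, max_passes=3):
    delay = 1.0
    for _ in range(max_passes):
        for host, port in servers:
            try:
                return socket.create_connection((host, port), timeout=3)
            except OSError:
                continue          # dead server: try the next one
        time.sleep(delay)         # whole list down: back off and retry
        delay *= 2
    raise ConnectionError("no reflector reachable")
```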
Test!
Videoconference: Vidyo
Desktop VC
Ongoing pilot
Potential future alternative for CERN
Broadly the same routing strategy as
EVO, but introduces an interesting tech
change: H.264 SVC (scalable video
coding)
Better resilience to network problems
Latency reduced to the minimum (no
transcoding!)
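The point of SVC is that servers forward a subset of the layered bitstream instead of transcoding it. A hypothetical sketch of the layer-selection idea; the layer table and bitrates are illustrative, not Vidyo's:

```python
# Hypothetical SVC layer selection: pick the highest-quality subset of
# (spatial, temporal) layers whose cumulative bitrate fits the receiver's
# bandwidth -- no transcoding, just dropping layers.
LAYERS = [  # (label, cumulative kbit/s) -- illustrative figures
    ("180p @ 15 fps", 150),
    ("360p @ 15 fps", 400),
    ("360p @ 30 fps", 600),
    ("720p @ 30 fps", 1500),
]

def select_layer(bandwidth_kbps: float) -> str:
    best = LAYERS[0][0]            # always ship at least the base layer
    for label, rate in LAYERS:
        if rate <= bandwidth_kbps:
            best = label
    return best

for bw in (200, 700, 5000):
    print(f"{bw:5d} kbit/s -> forward up to {select_layer(bw)}")
```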
Videoconference: Vidyo
Demo
Videoconference: Vidyo
Supports
H323 devices
Phones
PCs (Mac, Win, Linux)
Mobile devices (Android, iOS)
Test!
Videoconference: Future
“Cloud”-based services
Videoconference: Use Cases
Remote Operation Centers
Videoconference: Use Cases
Distance Learning
Videoconference: Use Cases
Masterclasses
Videoconference: Use Cases
ATLAS Virtual Visits
Videoconference: Use Cases
Collaboration Meetings
Videoconference: Use Cases
Mobility
Stay connected everywhere with mobile
devices (Wi-Fi, 3G)
Airports, planes, trains!
Recruitment interviews
Webcast
Powerful and cost-effective alternative to
VC for large audiences and one-way
addresses.
Webcast
Typical audiences (simultaneous
connections):
~50 people for seminars (CERN servers)
100 for collaboration meetings (CERN servers)
1500 for the AMS live launch (CERN servers)
15000 for the LHC startup in 2008, several
hundred thousand unique IPs (streaming
companies)
Webcast: Infrastructure
4 Flash Media Servers
1 “origin” (master); 3 “edges” (slaves), load-balanced
12 encoder PCs (one per equipped room)
[Diagram: an encoder on the LAN feeds the origin server, which feeds three load-balanced edge servers serving clients over the WAN]
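A minimal sketch of how clients might be spread across the three edges (plain round robin; the actual load-balancing mechanism is not described in the talk, and hostnames are placeholders):

```python
# Round-robin assignment of incoming webcast clients to edge servers.
import itertools

EDGES = ["edge1.example.cern.ch",   # placeholder hostnames
         "edge2.example.cern.ch",
         "edge3.example.cern.ch"]

_cycle = itertools.cycle(EDGES)

def next_edge() -> str:
    """Edge server the next incoming client should stream from."""
    return next(_cycle)

for client in ("alice", "bob", "carol", "dave"):
    print(client, "->", next_edge())
```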
Webcast: Acquisition
Needs the same inputs as VC: reuse!
[Diagram: the presenter PC's VGA output is captured via an Epiphan DVI2USB; the VC device's audio/video output is captured via an Osprey 240e acquisition card]
Webcast: Acquisition
Encoding (live+recording): Flash Media
Live Encoder
Home-made web layer added for
operation (sketched below)
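The web layer itself is not described; a purely hypothetical sketch of the idea, a tiny HTTP endpoint that starts and stops an encoder process (the encoder command line is a placeholder):

```python
# Hypothetical web control layer: start/stop an encoder via HTTP requests.
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

ENCODER_CMD = ["ffmpeg", "-i", "input", "output"]   # placeholder command
encoder = None                                      # running process, if any

class ControlHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        global encoder
        if self.path == "/start" and encoder is None:
            encoder = subprocess.Popen(ENCODER_CMD)
            msg = "started"
        elif self.path == "/stop" and encoder is not None:
            encoder.terminate()
            encoder = None
            msg = "stopped"
        else:
            msg = "running" if encoder else "idle"
        self.send_response(200)
        self.end_headers()
        self.wfile.write(msg.encode())

if __name__ == "__main__":
    HTTPServer(("", 8080), ControlHandler).serve_forever()
```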
Webcast: Monitoring
Flash Administration Console
Live statistics
Webcast: Use Cases
LHC First Beam:
Date: 10th September 2008
Duration: 9.5 hours
Audience:
○ ~200,000-500,000 webcast viewers
○ 2488 TV news reports
Servers: initially CERN then
GroovyGecko (partner)
Location: CERN Control
Center
Communication: Satellite uplink (Eurovision/EBU) +
videoconference (experiments' control rooms)
Webcast: Use Cases
LHC First Physics:
Date: 30th March 2010
Duration: 6.5 hours
Audience:
○ ~700,000 webcast viewers
○ ~800 TV news reports
Servers: GroovyGecko
(partner)
Location: CERN Control
Center + experiments' control
rooms
Communication: Satellite uplink (Eurovision/EBU) + fiber
(experiments' control rooms)
Webcast: Use Cases
AMS Launch:
Date: 16th May 2011
Duration: 1 hour
Audience: 1500/3300
Servers: CERN
Location: CERN A/V Studio and Kennedy Space
Center
Communication: Satellite downlink+uplink
(Eurovision/EBU)
Webcast: Use Cases
ATLAS Live
6 permanent TV channels
○ Internal information
○ Outreach
Servers: CERN
Format: web + Android + iPhone
Web Lectures
Video on demand
Live webcast recordings
Publishing system developed in
collaboration with the U. of Michigan
https://micala.cern.ch
Lectures published on CDS
New player under development
Project with iTunes U
Web Lectures
Processes:
Recording 2 full-frame videos:
○ Camera
○ Slides
Automated change detection on the slides (see the sketch below)
Manual slide check
Creation of an XML file
Transcoding to web formats
Publishing on CDS
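The automated change detection step can be approximated by frame differencing. A minimal sketch, assuming the slides video is already decoded into grayscale frames; the threshold is an illustrative guess, not the production value:

```python
# Minimal slide-change detection by frame differencing: a new slide is
# declared when the mean absolute pixel difference exceeds a threshold.
import numpy as np

THRESHOLD = 8.0   # mean gray-level difference; illustrative value

def slide_changes(frames):
    """Yield indices where the slide content changes.

    `frames` is an iterable of 2-D uint8 arrays (grayscale frames).
    """
    prev = None
    for i, frame in enumerate(frames):
        cur = frame.astype(np.float32)
        if prev is not None and np.abs(cur - prev).mean() > THRESHOLD:
            yield i
        prev = cur

# Toy run: three identical frames, then a different one.
blank = np.zeros((480, 640), dtype=np.uint8)
slide2 = np.full((480, 640), 200, dtype=np.uint8)
print(list(slide_changes([blank, blank, blank, slide2])))   # -> [3]
```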
Chat
New service for CERN
Synchronous text-based interactions
with presence information
Based on Jabber (XMPP-compliant) and
Jappix (web client)
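A minimal sketch of an XMPP client sending one chat message, using the slixmpp Python library (the library choice is this sketch's, not the service's; JIDs and password are placeholders):

```python
# Minimal XMPP (Jabber) client: log in, send one chat message, disconnect.
import slixmpp

class OneShotSender(slixmpp.ClientXMPP):
    def __init__(self, jid, password, recipient, body):
        super().__init__(jid, password)
        self.recipient, self.body = recipient, body
        self.add_event_handler("session_start", self.start)

    async def start(self, event):
        self.send_presence()                      # announce availability
        await self.get_roster()
        self.send_message(mto=self.recipient, mbody=self.body, mtype="chat")
        self.disconnect()

if __name__ == "__main__":
    xmpp = OneShotSender("alice@example.org", "secret",
                         "bob@example.org", "hello from the sketch")
    xmpp.connect()
    xmpp.process(forever=False)
```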
Indico: What is it?
Integrated Digital Conference
Data repository:
Long-term archival of event-related material
(slides, minutes, etc.)
Event organisation web app
Live event support tool
Hub for CERN collaboration services
Indico: History
Started as a European project (2002):
First time used in 2004
In production at CERN:
http://indico.cern.ch
And in > 90 institutions around the
world:
GSI, DESY, Fermilab,…
Free and Open Source
Indico: History
Stats:
140k events
620k talks
800k files
~10,000 visitors per day
Indico: Features
Indico: Conference Management
Supports the whole event lifecycle
1. Fully customizable web portal
2. Programme definition
3. Call for abstracts and reviewing
4. Registration, e-payment, badge creation
5. Agenda creation
6. Submission of slides and papers
7. Paper reviewing
8. Evaluation survey
Ex.: OAI7, ICHEP 2010
Indico: Simple Events
Timetable, material storage and …
Collaboration Services bookings
Ex.: ISEF2011 students at CERN!
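Event data can also be pulled programmatically. A minimal sketch against Indico's HTTP export API, assuming the API is enabled on the server and that category 1 is public (both assumptions):

```python
# Fetch today's events from a public Indico category over the HTTP export API.
import json
import urllib.request

URL = "https://indico.cern.ch/export/categ/1.json?from=today&to=today"

with urllib.request.urlopen(URL) as resp:
    data = json.load(resp)

for event in data.get("results", []):
    start = event.get("startDate", {}).get("time", "?")
    print(start, event.get("title", ""))
```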
Indico: Room Booking
General Public Information
General Public Information
Network of connected information
screens (LCD screen + PC)
One central server
Implemented using the Scala digital signage solution
Central management web interface
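The screens-plus-central-server layout can be sketched as a thin client polling the server for what to display. Purely hypothetical (the Scala product's actual mechanism is not described here); the URL and JSON fields are illustrative:

```python
# Hypothetical info-screen client: poll the central server for the
# current playlist and display each item in turn.
import json
import time
import urllib.request

SERVER = "http://gpi-server.example.cern.ch/playlist.json"  # placeholder URL

def fetch_playlist():
    with urllib.request.urlopen(SERVER) as resp:
        return json.load(resp)      # e.g. [{"title": ..., "seconds": ...}]

while True:
    for item in fetch_playlist():
        print("Displaying:", item["title"])
        time.sleep(item.get("seconds", 10))
```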
GPI: Use Cases
Conference Rooms attached screens
4 Panels outside Rooms B, C, D, E
Display the next 4 events in each meeting
room
GPI: Use Cases
General Meeting announcement
2 Global Directions Panels
Display events of the day in rooms A, B, C,
D, E and F
GPI: Use Cases
CERN Club Interactive Information Point
GPI: Use Cases
General Information Screens