SC2001 Netcommunications


Transcript SC2001 Netcommunications

SCinet:
The Annual Convergence of
Advanced Networking and High
Performance Computing
Steve Corbató, Internet2
MasterWorks track
14 November 2001
SC99 GNAP Demo Network
15-18 November, 1999
Portland, Oregon
Outline
• SCinet
• Wide area connectivity
• Fiber
• Wireless
• Infrastructure
• Operations, Measurement, & Security
• Events
  – Xnet, Bandwidth Challenge, SC Global
• Trends
• Q&A
SCinet is 4 networks
• Production commodity network
• Ubiquitous wireless network
• High-performance/availability exhibit floor network
• Bleeding-edge testbed - Xnet
SCinet is people (and employers)
• Basil Decina
• Bill Iles
• Bill Kramer
• Bill Nickless
• Bill Wing
• Bob Stevens
• Brad Pope
• Brent Sweeny
• Caren Litvanyi
• Chris Wright
• Chuck Fisher
• Dave Koester
• Davey Wheeler
• David Mitchell
• David Richardson
• Debbie Montano
• Dennis Duke
• Doug Luce
• Doug Nordwall
• Eli Dart
• Erik Plesset
• Gayle Allen
• Greg Goddard
• Hal Edwards
• Hoan Mai
• James Patton
• Janet Hull
• Jeff Carrell
• Jeff Mauth
• Jerry Sobieski
• Jim Rogers
• John Dysert
• John Jamison (JJ)
• Jon Dugan
• Kevin Oberman
• Kevin Walsh
• Kim Anderson
• Linda Winkler
• Martin Swany
• Marvin Drake
• Matt Zekauskas
• Paola Grosso
• Patrick Dorn
• Paul Daspit
• Paul Love
• Paul Reisinger
• Rex Duncan
• Rick Bagwell
• Rick Mauer
• Riki Kurihara
• Rob Jaeger
• Robert Riehl
• Roland Gonzalez
• Russ Wolf
• Seth Viddal
• Stanislav Shalunov
• Steve Corbató
• Steve Kapp
• Steve Shultz
• Steve Tenbrink
• Thomas Hutton
• Tim LeMaster
• Tim Toole
• Tom Kile
• Tom Lehman
• Tony Rimovsky
• Tracey Wilson
• Warren Birch
• Will Murray
• Derek Gassen
• Paul Fernes
• Steve Pollock
SC2001 Leadership
• Bill Wing, ORNL – chair
• Jim Rogers, CSC – vice chair
• Dennis Duke, FSU – incoming chair
• Chuck Fisher, ORNL – hardware
• Jeff Mauth, PNNL – fiber
• Martin Swany, UTK – monitoring
• Eli Dart, NERSC – security
• Bill Nickless, ANL – routing
• Tim Toole, SNL – wireless
• David Koester, Mitre – Xnet
• Jon Dugan, NCSA – net mgmt
• Bill Kramer, NERSC – Bandwidth Challenge
• Greg Goddard, UFl – monitoring
• Kevin Oberman, LBL – Denver fiber
• Steve Corbató, Internet2 – WAN
• Debbie Montano, Qwest – Denver connectivity
• Linda Winkler, ANL – SC Global
SCinet Committee process
• Conference calls – biweekly → weekly
• Planning meetings (x3)
• Venue recon trips (fiber, wireless)
• Staging (~3 weeks before SCxy)
• Build (starts Monday before SCxy)
• Booth drops (~36 hours before gala reception)
• Operate network for ~6 days
• Tear down (starts Thursday 4:01p)
• Rest & do day job for four months and then start again…
Staging
Wide area connectivity
• Denver: 15 Gbps aggregate (see the rough tally below)
  – 2xOC-48c: Abilene (Denver)
  – 2xGigE: StarLight (Chicago)
  – 1xOC-48c: Pacific/Northwest Gigapop (Seattle)
  – 2xOC-48c: ESnet (Sunnyvale & Chicago)
• Level(3) provided wide area connectivity
• Qwest provided local dark fiber
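As a sanity check on the quoted aggregate, here is a rough tally of the links listed above, assuming nominal line rates of about 2.5 Gbps per OC-48c and 1 Gbps per GigE (the slide's 15 Gbps figure is rounded):

    # Rough tally of the SC2001 wide-area links listed above (nominal line rates).
    links_gbps = {
        "2 x OC-48c to Abilene (Denver)":            2 * 2.488,
        "2 x GigE to StarLight (Chicago)":           2 * 1.000,
        "1 x OC-48c to Pacific/Northwest Gigapop":   1 * 2.488,
        "2 x OC-48c to ESnet (Sunnyvale & Chicago)": 2 * 2.488,
    }
    for name, rate in links_gbps.items():
        print(f"{name}: {rate:.1f} Gbps")
    print(f"Aggregate: {sum(links_gbps.values()):.1f} Gbps (quoted as ~15 Gbps)")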
WAN Bandwidth trends
• SC98 (Orlando): 200 Mbps
• SC99 (Portland): 13 Gbps
• SC2000 (Dallas): 10 Gbps
• SC2001 (Denver): 15 Gbps
• SC2002 (Baltimore): Nx10-Gbps λ’s??
• Increasing focus on BW utilization
Abilene & SCxy
• Escalating bandwidth
  – SC99 Portland: OC-12c SONET (622 Mbps)
  – SC2000 Dallas: OC-48c SONET (2.5 Gbps)
  – SC2001 Denver: 2xOC-48c SONET (5 Gbps)
• SCxy transit connectivity offered to domestic & international R&E nets
• Backbone MTU raised to 9K bytes (see the throughput sketch below)
• Traffic engineering for SC2001
• End-to-end performance: GigaTCP testing
• SC2002 Baltimore: 10-Gbps λ (planned)
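The 9 KB backbone MTU matters for the GigaTCP end-to-end work because steady-state TCP throughput scales with segment size. A minimal illustration using the Mathis approximation (throughput ≈ MSS / (RTT × √loss)); the 50 ms RTT and 10^-6 loss rate are assumed values for illustration, not measurements from the show floor:

    # Why a 9000-byte MTU helps wide-area TCP: Mathis approximation sketch.
    # RTT and loss values below are illustrative assumptions.
    from math import sqrt

    def mathis_throughput_mbps(mtu_bytes, rtt_s, loss_rate):
        mss = mtu_bytes - 40                       # strip IP + TCP headers (no options)
        return (mss * 8) / (rtt_s * sqrt(loss_rate)) / 1e6

    rtt, loss = 0.050, 1e-6                        # assumed 50 ms RTT, 1e-6 loss rate
    for mtu in (1500, 9000):
        print(f"MTU {mtu}: ~{mathis_throughput_mbps(mtu, rtt, loss):.0f} Mbps")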
Abilene traffic engineering – SC2002
Fiber (Jeff Mauth)
• 60+ miles of fiber deployed in exhibit hall
– 0.3+ FTE-year
– ~1.5 fiber-miles/hour
• 120 fiber drops (90% multimode)
• Pirelli 24 strand MM fiber used since ’98
• Deployment custom engineered to the venue
selected for SCxy
• ST fiber connectors standard
– Will review choice for SC2002
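A quick reading of the effort figures on this slide, assuming a 2,080-hour FTE-year and treating the ~1.5 fiber-miles/hour rate as the physical pull during the build (both assumptions are mine, not stated in the talk):

    # Back-of-the-envelope arithmetic on the SC2001 fiber effort figures.
    miles_deployed     = 60      # "60+ miles of fiber deployed"
    pull_rate_mph      = 1.5     # "~1.5 fiber-miles/hour"
    effort_fte_years   = 0.3     # "0.3+ FTE-year"
    hours_per_fte_year = 2080    # assumed standard full-time year

    print(f"Pulling time: ~{miles_deployed / pull_rate_mph:.0f} hours at the quoted rate")
    print(f"Total labor:  ~{effort_fte_years * hours_per_fte_year:.0f} person-hours (0.3 FTE-year)")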
Fiber timeline – SC2001
• 5 scouting trips
• Tue 11/6 9p – gained access to 2/3 of hall
• Thu 11/8 6p – gained access to rest of hall
• Fri 11/9 a.m. – fiber done
• Sun 11/11 a.m. – equipment patching
• Sun 11/11 p.m. – booth drops start
  – wireless & HP Jornada
• Mon 11/12 noon – drops complete
• Mon 11/12 7p – gala opening (D-DAY)
• DANGER: carpet layers (20-30 cuts this year)
Wireless (Tom Hutton)
• Significant 802.11b effort this year
• 35 Cisco wireless access points (13 in exhibit hall)
– One on DCC roof pointed at Embassy Suites
• Wireless still requires a lot of wires & work
– 5000’ of wiring in exhibition hall
– Several site surveys over the year
• Totally flat LAN (3.5 Gbps switched BW)
• Wireless really helps show set-up
– Booth drop teams, booth connectivity prior to fiber
• Clients seen: 618 peak, 246 average
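Simple arithmetic on the wireless figures above gives a sense of per-access-point load; the even spread of clients across all 35 APs is an assumption for illustration:

    # Back-of-the-envelope load on the SC2001 802.11b deployment.
    access_points = 35
    peak_clients, avg_clients = 618, 246
    nominal_11b_mbps = 11        # nominal 802.11b rate; usable throughput is lower

    print(f"Peak clients per AP:    ~{peak_clients / access_points:.0f}")
    print(f"Average clients per AP: ~{avg_clients / access_points:.0f}")
    print(f"Nominal per-client share at peak: "
          f"~{nominal_11b_mbps * access_points / peak_clients:.1f} Mbps")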
Infrastructure (Chuck Fisher)
• SC98
– Core routing provided by traditional Cisco 7500 series routers
– First "production" use of Gigabit Ethernet (only 1 customer drop requested)
– Most booth service was 10Base-FL and 100Base-FX provided via Fore PowerHubs
– Limited use of network monitoring and statistics
An earlier topology…
Infrastructure trends - II
• SC99
– Core routing provided by Cisco GSR series routers
– Concept of a routing core and a layer of L3 distribution switches adopted
– Extensive use of DWDM hardware to provide WAN bandwidth
– Xnet introduced as a showcase for "bleeding edge" hardware
Infrastructure trends - III
• SC2000
– Core routing provided by Cisco and Juniper
– Increased focus on network monitoring and statistics
– First Xnet demonstration of 10 Gigabit Ethernet
– Bandwidth Challenge introduced to SC
SCinet 2001 Network Topology
Infrastructure trends - IV
• SC2001 Contributing Hardware Vendors
  – Cisco
  – Juniper
  – Marconi
  – Nortel
  – Spirent
  – Force10
  – Foundry
  – ONI
  – LuxN
• Equivalent to a 3-5 building advanced campus network attached to major R&E backbones
Operations
• Servers
– DNS, DHCP, NTP, Performance, beacons
• Database
• Network monitoring
• Help desk
• Trouble ticket system
• Routing support (unicast, multicast, v6)
Measurement and Security
Security monitoring
  – Local Nimda infections: 2
  – Clear text root logins: 68
  – Clear text passwords: 1,483
  – Code Red 1 infections: 1
  – External Nimda sources: 857
  – Passwordless accounts: 59
  – Scans: 137
Xnet
[Diagram: TeraGrid Distributed Backplane linking NCSA/UIUC, ANL, SDSC, and Caltech via StarLight (the international optical peering point, see www.startap.net), Abilene, and the I-WIRE dark-fiber network (UIC, Starlight/NW Univ, Ill Inst of Tech, ANL, Univ of Chicago, NCSA/UIUC), with hubs in Chicago, Indianapolis (Abilene NOC), Urbana, Los Angeles, and San Diego. Links shown: OC-48 (2.5 Gb/s, Abilene), multiple 10 GbE (Qwest), and multiple 10 GbE (I-WIRE dark fiber). Solid lines in place and/or available by October 2001; dashed I-WIRE lines planned for summer 2002. Source: Charlie Catlett, Argonne.]
Trends
… or what we might see in Baltimore?
Optical networking
• Dense Wavelength Division Multiplexing (DWDM)
  – Current systems can support >160 10-Gbps λ’s (1.6 Tbps!)
  – Optical growth can overwhelm Moore’s Law (routers); see the toy comparison below
• Costs scale dramatically with distance
• Three possible scenarios for the future
  – Enhanced IP transport (higher BW and circuit multiplicity)
  – Fine-grained traffic engineering
    • p2p links between campuses, HPC centers, & Gigapops
  – Physical e2e switched circuits (a la ATM SVCs)
• Evolution of optical switching will be critical
  – Don’t write off OEO
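A toy compound-growth comparison behind the "optical growth can overwhelm Moore’s Law" remark; the doubling periods (about 12 months for DWDM capacity, about 18 months for router silicon) are illustrative assumptions, not figures from the talk:

    # Toy comparison: DWDM capacity growth vs. router (Moore's Law) capability.
    # Doubling periods are illustrative assumptions.
    dwdm_gbps = 160 * 10                       # >160 lambdas x 10 Gbps = 1,600 Gbps (1.6 Tbps)
    print(f"Current DWDM system capacity: ~{dwdm_gbps} Gbps")

    def growth_factor(years, doubling_period_years):
        return 2 ** (years / doubling_period_years)

    for years in (2, 4, 6):
        optical = growth_factor(years, 1.0)    # assume optical capacity doubles yearly
        routers = growth_factor(years, 1.5)    # assume routers double every 18 months
        print(f"After {years} years: optical x{optical:.0f} vs. routers x{routers:.1f}")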
Future of Abilene
• Extension of Qwest’s original commitment to Abilene for another 5 years (to 10/01/2006)
  – Originally expired March 2003
• Upgrade of Abilene backbone to optical transport capability – λ’s
  – x4 increase in the core backbone bandwidth
    • OC-48c SONET (2.5 Gbps) to 10-Gbps DWDM
  – Capability for flexible provisioning of 10-Gbps λ’s to support future point-to-point experimentation & other projects
• Emphasis on v6 and network measurement capabilities
SC2002/Baltimore crystal ball
• Strong local networking community
– MAX Gigapop (University of Maryland)
– DARPA Supernet (ISI-East, NRL)
• Dark fiber & network presences in region
• Abilene is aiming for 10-Gbps λ connectivity
• Increased focus on e2e performance & multicast
reliability
• More wireless (add 802.11a); less ATM?
• 10 Gigabit Ethernet should be standardized
• Optical switch in Xnet?
Conclusion
• SCinet is…
  … a diverse group of very committed and talented people and companies working very hard, under extreme time constraints and trying conditions, to make both the expected and the new and seemingly impossible in SCxy networking happen for one week in November, and then return to do it again the next year.