Ognjen Prnjat, Greek Research and Technology Network
http://www.grnet.gr
“Greek Research and Technology Network: Status and Updates”
Dr. Ognjen Prnjat
European and Regional eInfrastructure Management, Greek Research and Technology Network
on behalf of the GRNET Technical Department
eAge 2012, Dubai
GRNET mission
• GRNET is a state-owned, non-profit company operating under the supervision of the Ministry of Education (General Secretariat of Research & Technology)
• Main mission is to provide high-quality electronic infrastructure services to the Greek academic and research institutions:
– National and international connectivity services
– Distributed Computing Infrastructure services (computing, storage, visualisation)
• Supporting activity is the promotion and dissemination of ICT use in the public and private sector towards an e-Government, e-Learning and e-Business environment
• Main sources of funding are the Operational Programme for the Information Society, the Ministry of Economy and Finance, and EC projects
• GRNET has been certified under ISO 9001:2000 for project management
Pan-EU e-Infrastructures vision
• The Research Network infrastructure provides fast interconnection and advanced services among Research and Education institutes of different countries
• The Research Distributed Computing Infrastructure provides a distributed environment for sharing computing power, storage, instruments and databases through the appropriate software (middleware) in order to solve complex application problems
• This integrated networking & DCI environment is called electronic infrastructure (eInfrastructure), allowing new methods of global collaborative research, often referred to as electronic science (eScience)
• GRNET was one of the first NRENs in Europe to expand its services to grid and computing in general; infrastructure-oriented and application-neutral
e-Infrastructure layers (bottom-up): Network Infrastructure → Distributed Computing Infrastructure → e-Science Collaborations
GRNET main networking tasks
• Interconnects universities, research centers and academic organizations (>150), plus primary and secondary schools (15,000)
• 500,000 end-users
• Continuously upgrades the national backbone: dark-fiber backbone (Nx10 Gbps), institution access links (1 or 10 Gbps per institution) and international backbone (currently at 4x10 Gbps)
• Operates the GR Internet Exchange (GR-IX), peering of Greek commercial Internet Service Providers at 10 Gbps each
• Cooperates with Greek and international research and academic institutions for the development of innovative networking services
GRNET network evolution: from GRNET2 to GRNET3
GRNET2: 2.5 Gbps leased lambdas, 20M€, 2000-2005
GRNET3: dark-fiber based, 10 Gbps capable, 30M€, 2005-2008
GRNET network evolution: GRNET3
• >50 PoPs
• 9,000 km of fiber (IRU)
• MANs in Attiki & Thessaloniki
• Dark-fiber loops in 33 cities
• Single-mode fiber pair, 15-year IRUs
• Availability > 99%
• 1GbE interconnection over dark fiber to the closest IP router
• 10GbE links for the “power users”
GRNET network evolution: from GRNET3 to GRNET4
GRNET4: equipment upgrades, 8M€, 2012-2015
• Service-oriented design
• Optical Services Layer: physical-layer connectivity
• Carrier Services Layer: Carrier Ethernet (MPLS) interconnection
• IP Services Layer: IP interconnection among GRNET customers and the rest of the Internet
• 40/100 Gbps wavelengths based on PM-QPSK modulation
Layer diagram: the client pool is served through the Optical Services, Carrier Services and IP Services layers, with external peerings towards GEANT, GR-IX and AMS-IX.
HELLASGRID (HG) infrastructure
http://www.hellasgrid.gr/infrastructure
• HG-01 cluster (pilot phase): @ Demokritos, Athens; 64 CPUs, 10 TB FC SAN, 12 TB tape library, gLite middleware
• HG-02 to HG-07 clusters (HG project): Athens (NDC/EKT, Min EDU, IASA), Thessaloniki (AUTH), Crete (ICS-FORTH), Patras (CTI); ~1,200 cores, ~40 TB total raw SAN storage capacity, ~80 TB tape library
• 6 extra sites offered by Greek universities/research institutes (FORTH-ICS, AUTH, IASA, UoA, UoI, UoI-HEPLAB, UPatras); ~600 cores and 200 TB of storage
• Total funding ~2M€
HELLASGRID applications
• HECTOR: Enabling Microarray Experiments over the Hellenic Grid Infrastructure
• GRISSOM Platform: Grids for In Silico Systems Biology and Medicine
• Evaluating the impact of climate change on European air quality
• Density Functional Theory Calculations on Atmospheric Degradation Reactions of Fluorinated Propenes
• Investigating the nature of explosive percolation transition
• First-principles studies on traditional and emerging materials
High Performance Computing
• Goal is the development of a national HPC infrastructure that will join the PRACE Tier-1 European infrastructure
• Budget: 3.5M€
• Procurement and installation of the HPC infrastructure
• Operation and provision of support services
• Aiming for at least a ~150 Tflops system
• Hosted in GRNET’s existing datacenter
• Support for a wide range of scientific disciplines:
– Biomedicine, Bioinformatics, Computational Engineering, Physics, Meteorology, Climatology, Seismology, Computational Chemistry, etc.
• Based on the results of the HellasHPC feasibility study
– National survey among 29 academic and research institutes (2010)
– Collected requirements from 200 scientific applications developed by 162 research teams from various scientific domains
Cloud for R&E: the process
• Vision: flexible, production-quality cloud services for the Greek R&E community
• Rationale:
– Step beyond Grid in terms of flexibility and availability
– Economies of scale for the community; addressing understaffing, poor service and low maintenance
– Minimizes the investment in equipment and support contracts
– Know-how from NOC “service boxes” (Vima service), as well as from Grid
• Policy background: existing MoU in place for Grid computing, expanding to HPC as well
• Strategy:
– Technical workshops and requirements-capture meetings
– Gradual offering of services, starting with storage, moving to VM on demand, IaaS, and then SaaS
– Paving the way to the public sector
• Funding: in the context of the GRNET4 project (2.2M€ for data centers; 4.5M€ for software and services)
Cloud for R&E: Why?
• STUDENT: It gives me the opportunity to test different kinds of software on a machine that I no longer need after my work is done.
• PROFESSOR: It makes it possible for me to deploy PC labs without having to worry about specific hardware or physical space. It lets me provide machines to my students for a scheduled amount of time. It gives me storage space to upload content and to share data with my students, or to access it through my virtual hardware.
• RESEARCHER: It enables me to run experiments in many different environments and network topologies which I can provision easily, quickly and dynamically. I can have persistent or volatile machines according to my needs. I can also upload (besides my files) my own images and launch virtual machines from them.
okeanos service
• ~okeanos is set to deliver Infrastructure as a Service:
– Compute (Virtual Machines)
– Network (Virtual Networks, private Ethernets (L2) or public IPv4/IPv6)
– Storage (Virtual Disks)
• Alpha2: running since March 2012; 2,000 VMs, 1,000 alpha users
• Beta: December 2012
• Target group: GRNET’s customers
– direct: IT departments of connected institutions
– indirect: university students, researchers in academia
• Users manage resources over
– a simple, elegant UI, or
– a REST API, for full programmatic control (a minimal sketch follows)
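The Compute API follows OpenStack Compute API v1.1 conventions (see the platform-design slide below), so a user's VMs can be listed with any plain HTTP client. A minimal sketch in Python, assuming a hypothetical endpoint URL and an authentication token issued by Astakos; names and paths are illustrative, not the exact okeanos deployment details:

```python
import requests

# Hypothetical values: the real endpoint and token come from the okeanos deployment
API_URL = "https://cyclades.example.grnet.gr/api/v1.1"   # assumed base URL
TOKEN = "token-issued-by-astakos"                        # assumed auth token

def list_servers():
    """List this user's virtual machines via the OpenStack-Compute-v1.1-style API."""
    resp = requests.get(
        f"{API_URL}/servers",
        headers={
            "X-Auth-Token": TOKEN,          # standard OpenStack v1.1 auth header
            "Accept": "application/json",   # ask for a JSON listing
        },
    )
    resp.raise_for_status()
    return resp.json().get("servers", [])

if __name__ == "__main__":
    for server in list_servers():
        print(server.get("id"), server.get("name"))
```

The same endpoint family also covers creating, rebooting and deleting VMs, which is how the web client reaches the service in the platform design shown later.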
okeanos service
Resource overview: Virtual Machines (compute), Virtual Ethernets (network), Virtual Disks (storage), Virtual Firewalls (security)
okeanos service
• Compute: Cyclades
• Files: Pithos+
• Images: Plankton
• Identity: Astakos
• Volumes: Archipelago
• Accounting/Billing: Aquarium
okeanos: Design
• Commercial IaaS vs own IaaS
• Commercial IaaS:
– Amazon EC2 is not an end-user service
– Need to develop custom UI and AAI layers
– Vendor lock-in
– Unsuitable for IT departments (persistent, long-term servers, custom networking requirements)
• Own IaaS: gain know-how, build on own IaaS, reuse for own services
okeanos: Software Stack
• Synnefo: REST API; multiple users, multiple resources
• Ganeti: multiple VMs on a cluster (a minimal sketch follows)
• KVM: a single VM
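To make the layering concrete, here is a minimal sketch of querying the middle (Ganeti) layer from Python. It is an illustration only, assuming it runs on a Ganeti master node where the standard gnt-instance tool is available; in the actual platform Synnefo drives Ganeti through its own interfaces rather than shelling out.

```python
import subprocess

def list_ganeti_instances() -> str:
    """Ask the Ganeti cluster for the virtual machines it currently manages.

    Illustration only: assumes this runs on a Ganeti master node where the
    standard `gnt-instance` command-line tool is installed. In okeanos the
    Synnefo layer talks to Ganeti programmatically instead.
    """
    result = subprocess.run(
        ["gnt-instance", "list"],      # standard Ganeti command for listing instances
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    print(list_ganeti_instances())
```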
okeanos: Platform design
Architecture diagram: web and CLI clients (user@home, admin@home) reach the service over the OpenStack Compute API v1.1; GRNET's Synnefo cloud management software drives Google Ganeti clusters running KVM on Debian in the GRNET datacenter, with direct out-of-band access to the virtual hardware.
cloud storage - Pithos
• Online storage for the entire Greek academic and research community
• 50 or 100 GB per user; files, groups
• Access via web browsers, native apps, iPhone, Android
• Open-source implementation and API (REST)
• v2 of the API, compatible with OpenStack object storage (a minimal sketch follows)
• Also used in ~okeanos:
– Stores custom and user-created images
– Plankton provides an image registry over Pithos
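Because the v2 API is compatible with the OpenStack object storage API, standard HTTP verbs are enough to store and list files. A minimal sketch in Python, assuming a hypothetical Pithos endpoint, account name and Astakos-issued token; all three are placeholders, not the real deployment values:

```python
import requests

# Hypothetical values: real ones come from the Pithos deployment and Astakos
PITHOS_URL = "https://pithos.example.grnet.gr/v1"   # assumed base URL
ACCOUNT = "user@example.org"                        # assumed account name
TOKEN = "token-issued-by-astakos"                   # assumed auth token

def upload_file(container: str, name: str, path: str) -> None:
    """Upload a local file as an object (OpenStack-object-storage-style PUT)."""
    with open(path, "rb") as f:
        resp = requests.put(
            f"{PITHOS_URL}/{ACCOUNT}/{container}/{name}",
            headers={"X-Auth-Token": TOKEN},
            data=f,
        )
    resp.raise_for_status()

def list_objects(container: str) -> list[str]:
    """List object names in a container (plain-text listing, one name per line)."""
    resp = requests.get(
        f"{PITHOS_URL}/{ACCOUNT}/{container}",
        headers={"X-Auth-Token": TOKEN},
    )
    resp.raise_for_status()
    return resp.text.splitlines()

if __name__ == "__main__":
    upload_file("my-data", "notes.txt", "notes.txt")
    print(list_objects("my-data"))
```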
~okeanos: stats
• 1,000 alpha users
• 2,000 VMs (Windows, Ubuntu, Debian, in order of usage)
• Storage: 20% disk usage; 20 GB per VM on average
• Approximately 1 minute startup time per VM
• Scalable to thousands (Pithos to 10k)
• Per-VM or per-GB billing to be implemented
Cloud Infrastructure: Data Centers
• Data Center 1 (NDC)
– Status: production operation
– Location: National Documentation Center, Athens, Greece
– Purpose: central PoP of GRNET; connection to GEANT, GR-IX, cloud and grid services
– Number of racks: 16
– Total power: 110 kW with N+1 redundancy
• Data Center 2 (MINEDU)
– Status: production operation
– Location: Ministry of Education, Athens, Greece
– Purpose: cloud/HPC services
– Number of racks: 28
– Total power: 450 kW with N+1 redundancy
• Data Center 3 (LOUROS)
– Status: design phase
– Location: hydroelectric plant of Louros
– Purpose: GRNET disaster recovery for cloud/HPC services; green data center, PUE < 1.3 (see the note below)
– Number of racks: 14
– Total power: 250 kW with N+1 redundancy
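For context on the PUE < 1.3 target: PUE (Power Usage Effectiveness) is total facility power divided by the power reaching the IT equipment. A small illustrative calculation, assuming purely for the sake of the example that the quoted 250 kW were the IT load (the slide does not specify which figure it is):

```python
def total_facility_power(it_load_kw: float, pue: float) -> float:
    """Total facility power = IT load * PUE (by the definition of PUE)."""
    return it_load_kw * pue

# Illustrative assumption: treat the quoted 250 kW as the IT load.
print(total_facility_power(250, 1.3))  # -> 325.0, i.e. under ~325 kW drawn overall for PUE < 1.3
```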
Cloud – MINEDU snapshots
Thank you!