GPGPU use cases from the
MoBrain community
João Rodrigues
Postdoctoral Researcher
Utrecht University, NL
[email protected]
www.egi.eu
EGI-Engage is co-funded by the Horizon 2020 Framework Programme
of the European Union under grant number 654142
MoBrain main activities
• Task 1: User support and training
• Task 2: Cryo-EM in the cloud: bringing clouds to the data
• Task 3: Increasing the throughput efficiency of WeNMR portals via DIRAC4EGI
• Task 4: Cloud VMs for structural biology
• Task 5: GPU portals for biomolecular simulations
• Task 6: Integrating the micro (WeNMR/INSTRUCT) and macroscopic (NeuGRID4you) VRCs
Software: our solutions

PowerFit
Fitting atomic structures into cryo-EM density maps using a full, exhaustive 6D cross-correlation search based on FFT techniques.

DisVis
Visualization and quantification of the accessible interaction space of distance-restrained protein-protein docking, based on FFT techniques (see the sketch after this list).

GROMACS
Versatile package to perform molecular dynamics simulations on systems with hundreds to millions of particles.

AMBER
Package to perform molecular dynamics simulations.
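
To make the FFT idea behind DisVis concrete, here is a minimal NumPy sketch that counts, for every ligand translation on a grid, how many distance restraints are satisfied. It is illustrative only: the function name, parameters, and grid handling are assumptions for this example, not the DisVis implementation or API.

# Minimal sketch of FFT-based restraint counting (not the DisVis code/API).
import numpy as np

def count_satisfied_restraints(receptor_xyz, ligand_xyz, dmin, dmax,
                               shape=(64, 64, 64), voxel=2.0):
    """For every ligand translation on the grid, count how many restraints
    (receptor_xyz[i] <-> ligand_xyz[i], allowed distance dmin..dmax) hold."""
    grid = np.mgrid[0:shape[0], 0:shape[1], 0:shape[2]].astype(float) * voxel
    counts = np.zeros(shape)
    for (rx, ry, rz), lig in zip(receptor_xyz, ligand_xyz):
        # Binary shell of grid points at an allowed distance from the receptor atom.
        d = np.sqrt((grid[0] - rx) ** 2 + (grid[1] - ry) ** 2 + (grid[2] - rz) ** 2)
        shell = ((d >= dmin) & (d <= dmax)).astype(float)
        # Delta volume marking the ligand atom in its reference position.
        delta = np.zeros(shape)
        idx = tuple(int(round(c / voxel)) % n for c, n in zip(lig, shape))
        delta[idx] = 1.0
        # One FFT cross-correlation gives, for ALL translations at once,
        # whether moving the ligand by that vector satisfies this restraint.
        corr = np.fft.ifftn(np.fft.fftn(shell) * np.conj(np.fft.fftn(delta))).real
        counts += corr > 0.5
    return counts

The real tool repeats such a scan over many ligand orientations and, as the later slides show, can route the transforms through FFTW3 or clFFT.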
Use Case: fitting atomic structures in cryo-EM density maps

PowerFit
Fitting atomic structures into cryo-EM density maps using a full, exhaustive 6D cross-correlation search based on FFT techniques.
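
As a rough illustration of the scoring step inside such a 6D search, the sketch below computes, for one fixed rotation of the model, the cross-correlation with the experimental map over all translations in a single FFT round trip. Names and signatures are assumptions for the example, not the actual PowerFit code.

# Sketch of FFT-accelerated translational scoring (not the PowerFit code/API).
import numpy as np

def best_translation(exp_map, model_map):
    """exp_map and model_map are 3D arrays on the same grid; model_map is the
    simulated density of the structure in ONE orientation. Returns the best
    cross-correlation score and the voxel shift that achieves it."""
    cc = np.fft.ifftn(np.fft.fftn(exp_map) * np.conj(np.fft.fftn(model_map))).real
    shift = np.unravel_index(np.argmax(cc), cc.shape)
    return cc[shift], shift

# The exhaustive 6D search then loops over an orientation set:
#   for each rotation R, simulate the model density in orientation R, call
#   best_translation(...), and keep the top-scoring (R, shift) pairs.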
Software: powerfit & disvis

Core dependencies: NumPy, SciPy, Cython

https://github.com/haddocking/
Software: powerfit & disvis

Core dependencies: NumPy, SciPy, Cython
CPU acceleration: FFTW3, pyFFTW

https://github.com/haddocking/
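
A minimal sketch of what the CPU-accelerated layer buys, assuming pyFFTW is installed: its numpy_fft interface is a drop-in replacement for NumPy's FFT that dispatches to multi-threaded FFTW3 (the thread count and array size below are arbitrary examples).

import numpy as np
import pyfftw

pyfftw.interfaces.cache.enable()   # reuse FFTW plans between calls

volume = np.random.rand(128, 128, 128)

ft_numpy = np.fft.fftn(volume)                                   # reference
ft_fftw = pyfftw.interfaces.numpy_fft.fftn(volume, threads=4)    # FFTW3-backed

assert np.allclose(ft_numpy, ft_fftw)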
Software: powerfit & disvis

Core dependencies: NumPy, SciPy, Cython
CPU acceleration: FFTW3, pyFFTW
GPGPU acceleration: OpenCL, pyOpenCL, clFFT, gpyFFT

https://github.com/haddocking/
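
On the GPGPU path the transforms go through clFFT via gpyFFT; the minimal pyOpenCL sketch below only shows the surrounding pattern of moving volumes to the device and running an element-wise kernel there (the kernel, sizes, and names are illustrative, not PowerFit/DisVis code).

import numpy as np
import pyopencl as cl
import pyopencl.array as cla

ctx = cl.create_some_context()      # pick an available OpenCL device (GPU or CPU)
queue = cl.CommandQueue(ctx)

kernel_src = """
__kernel void multiply(__global const float *a,
                       __global const float *b,
                       __global float *out)
{
    int i = get_global_id(0);
    out[i] = a[i] * b[i];           /* e.g. voxel-wise product of two volumes */
}
"""
program = cl.Program(ctx, kernel_src).build()

a = cla.to_device(queue, np.random.rand(64 ** 3).astype(np.float32))
b = cla.to_device(queue, np.random.rand(64 ** 3).astype(np.float32))
out = cla.empty_like(a)

program.multiply(queue, a.shape, None, a.data, b.data, out.data)
result = out.get()                  # copy the result back to host memory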
Use Case: MD simulation of a large protein system

Ferritin is a protein of 450 kDa, consisting of 24 subunits.

An MD simulation in explicit solvent involves:
• more than 4,000 amino acids
• more than 36,000 water molecules
• 176,000 atoms in total

Test simulations were run using AMBER 14 with OpenMPI.
Performance on 2 K20m GPUs: 8.66 ns/day.
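
To put that throughput in perspective, a quick back-of-the-envelope calculation (pure arithmetic; the 100 ns target is just an illustrative trajectory length):

ns_per_day = 8.66                 # measured on 2x K20m with AMBER 14 + OpenMPI
target_ns = 100.0                 # hypothetical trajectory length

print(f"{target_ns} ns at {ns_per_day} ns/day takes ~{target_ns / ns_per_day:.1f} days")
# -> ~11.5 days of wall-clock time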
Software: GROMACS & AMBER

[Benchmark chart: performance in ns/day (0-1000) for an i7 3930K alone, an i7 3930K + GTX680, and an E5-2690 + GTX Titan.]

Dependencies: CUDA 4.x, CC & CMake, MKL, FFTW3
Software: GROMACS & AMBER

Dependencies: CUDA 4.x, CC & CMake, MKL, FFTW3

Each simulation produces GBs of data per day.
Queueing & Middleware: resources & requirements

Example hardware
Cluster based on 3 worker nodes, each with:
• 2x Xeon E5-2620 v2
• 2x K20m
• 64 GB RAM

Total: 36 CPU cores, 6 GPUs, 192 GB RAM
Queueing & Middleware: resources & requirements

Middleware requirements: OpenMPI, Torque & Maui

• One job per GPU (AMBER); see the job-script sketch below
• CPUs must be powerful enough to match the GPU: the CPU still does part of the work (e.g. bonded interactions)
• GPU resources must be discoverable within the e-infrastructure (e.g. via a JDL requirement)
• Preferably including the GPU type (GTX vs K-series, AMD vs NVIDIA)
• AMD GPUs are not supported by the MD code (yet)
• Double precision is only supported by Tesla cards
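
A hedged sketch of the "one job per GPU" pattern under Torque/Maui: it generates and submits a script that requests a single GPU plus a share of the node's cores. The nodes=...:gpus=1 resource string, walltime, core count, and input file names are assumptions for the example; the exact syntax depends on the site configuration.

import subprocess

job_script = """#!/bin/bash
#PBS -N amber-gpu
#PBS -l nodes=1:ppn=6:gpus=1
#PBS -l walltime=24:00:00

cd $PBS_O_WORKDIR
# Single-GPU AMBER run bound to the one GPU allocated to this job
pmemd.cuda -O -i md.in -p system.prmtop -c system.inpcrd -o md.out
"""

# Submit the script to Torque via qsub (reads the script from stdin)
subprocess.run(["qsub"], input=job_script, text=True, check=True)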
Conclusions & Questions

Technologies covered: NumPy, SciPy, Cython, FFTW3, pyFFTW, OpenCL, pyOpenCL, clFFT, gpyFFT, CUDA, CC & CMake, MKL, OpenMPI, Torque & Maui
Thank you for your attention