PSC - University of Pittsburgh


Web interface for large scale neural circuit
reconstruction processes for connectomics
Art Wetzel - Pittsburgh Supercomputing Center
National Resource for Biomedical Supercomputing
[email protected] 412-268-3912
www.psc.edu and www.nrbsc.org
Source data from …
R. Clay Reid, Jeff Lichtman, Wei-Chung Allen Lee
Harvard Medical School, Allen Institute for Brain Science
Center for Brain Science, Harvard University
Davi Bock
HHMI Janelia Farm
David Hall and Scott Emmons
Albert Einstein College of Medicine
Aug 30, 2012 Comp Sci Connectomics Data Project Overview
What is Connectomics?
“an emerging field defined by high-throughput
generation of data about neural connectivity, and
subsequent mining of that data for knowledge about the
brain. A connectome is a summary of the structure of a
neural network, an annotated list of all synaptic
connections between the neurons inside a brain or brain
region.”
Three scales of connectomic data:
- DTI "tractography" (Human Connectome Project): MRI at 2 mm resolution, ~10 MB per 1.3x10^6 mm^3 whole-brain volume
- "Brainbow" stained neuropil: optical imaging at 300 nm resolution, ~10 GB/mm^3
- Serial section electron microscopy reconstruction: 3-4 nm resolution, ~1 PB/mm^3
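The jump in data density across these modalities follows directly from voxel size. A back-of-envelope sketch (assuming 1 byte per voxel and, for EM, ~30 nm section thickness; both are illustrative assumptions, not figures from the slide):

```python
def bytes_per_mm3(xy_nm, z_nm, bytes_per_voxel=1):
    """Storage needed for 1 mm^3 imaged at the given lateral/axial voxel size."""
    voxels = (1e6 / xy_nm) ** 2 * (1e6 / z_nm)  # 1 mm = 1e6 nm
    return voxels * bytes_per_voxel

optical = bytes_per_mm3(300, 300)  # "Brainbow" optical at 300 nm: tens of GB/mm^3
em = bytes_per_mm3(4, 30)          # serial section EM, 4 nm pixels, 30 nm sections
print(f"optical: {optical/1e9:.0f} GB/mm^3, EM: {em/1e15:.1f} PB/mm^3")
```

Both results land on the same orders of magnitude quoted above (~10 GB/mm^3 optical, ~1 PB/mm^3 EM).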
Reconstructing brain circuits requires
high resolution electron microscopy
over “long” distances == BIG DATA
Scale of the structures involved (figure: a dendrite with a dendritic spine; source: www.coolschool.ca/lor/BI12/unit12/U12L04.htm):
- Vesicles: ~30 nm diameter
- Synaptic junction: >500 nm wide, with a cleft gap of ~20 nm
For comparison, recent ICs have 32 nm features, 22 nm chips are being delivered, and gate oxides are 1.2 nm thick.
Current data from a 400 micron
cube is greater than 100 TB (0.1 PB).
A full mouse brain would be an exabyte == 1000 PB
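The exabyte extrapolation is simple arithmetic from the measured cube; a sketch (the ~500 mm^3 mouse brain volume is an assumed round figure, not from the slide):

```python
TB, PB, EB = 1e12, 1e15, 1e18

cube_bytes = 100 * TB                 # measured: 400 micron cube -> ~100 TB
cube_mm3 = 0.4 ** 3                   # (0.4 mm)^3 = 0.064 mm^3
density = cube_bytes / cube_mm3       # ~1.6 PB/mm^3

mouse_brain_mm3 = 500                 # assumption: rough whole-brain volume
total = density * mouse_brain_mm3     # ~0.8 EB, i.e. "an exabyte"
print(f"{density/PB:.1f} PB/mm^3, mouse brain ~{total/EB:.1f} EB")
```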
Rigid alignment does not permit
visualization of 3D structures
Data courtesy of Richard Fetter (UCSF)
Non-rigid deformable registration
produces useful out of plane views
Data courtesy of Richard Fetter (UCSF)
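The distinction between the two slides can be shown with a toy example (entirely illustrative, not PSC code): rigid alignment applies one offset to a whole section, while a deformable field corrects each point independently, which is what makes out-of-plane views coherent.

```python
def rigid_align(edge, shift):
    """Shift every sample of a section edge by the same offset."""
    return [x + shift for x in edge]

def deformable_align(edge, field):
    """Apply a per-sample displacement field (one offset per position)."""
    return [x + d for x, d in zip(edge, field)]

# A section edge that drifted nonlinearly during cutting/imaging:
observed = [10, 11, 13, 16, 20]
target   = [10, 10, 10, 10, 10]

# Best single rigid shift (mean error) still leaves residual distortion:
shift = round(sum(t - o for t, o in zip(target, observed)) / len(observed))
print(rigid_align(observed, shift))        # residual error remains

# A deformable field removes the distortion entirely:
field = [t - o for t, o in zip(target, observed)]
print(deformable_align(observed, field))   # -> [10, 10, 10, 10, 10]
```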
C&S P10: Advancing high-throughput thin-section scanning EM to study
relationships between neuronal circuit structure and function.
Jeff Lichtman’s team at Harvard is developing improved methods for sample handling and very high speed scanning electron
microscopy to enable studies of large regions of brain tissue from individual specimens. We have worked closely with
Lichtman’s team as they have captured a leading edge dataset with a tissue volume of 400x400x300 microns. The resulting
100 TByte image set is being registered as a test case of our new Signal Whitening Fourier Transform alignment method.
The left image above shows an aligned and partially segmented view of a low resolution prescan of the entire 1mm wide
10,000 section specimen. This was used to select a region of interest for high resolution imaging at 4nm/pixel. The right
image is a greatly reduced, ~1/200th scale, overview through the 100 TB high resolution dataset showing the smoothness
and consistency of the capillary network as viewed and segmented using our PSC Volume Browser. We are continuing the
full resolution alignment that is needed prior to the detailed circuit tracing of connections between the ~30,000 neurons
within the ROI. Due to the very large storage requirements this dataset will also be the first large scale test of our Virtual
Volume FileSystem mechanism to provide aligned views rendered on demand from original data without requiring duplicate
data storage. Our methods for large scale registration and data handling will be increasingly important as Lichtman’s team
installs a new parallel beam SEM that will produce 1 Gbyte/sec within the next year.
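PSC's Signal Whitening Fourier Transform pipeline is 2-D, subpixel, and far more elaborate, but the core whitening idea can be sketched in a 1-D toy (the signals and the toy DFT below are mine, not SWiFT code): normalizing the cross-power spectrum to unit magnitude discards intensity information and keeps only phase, which encodes displacement.

```python
import cmath

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def whitened_shift(a, b):
    """Estimate the circular shift of b relative to a by phase correlation:
    whiten the cross-power spectrum, then locate the correlation peak."""
    A, B = dft(a), dft(b)
    cross = [x.conjugate() * y for x, y in zip(A, B)]
    white = [c / abs(c) if abs(c) > 1e-12 else 0.0 for c in cross]
    corr = [v.real for v in idft(white)]
    return corr.index(max(corr))

a = [0, 0, 1, 3, 1, 0, 0, 0]   # a feature in one "section"
b = [0, 0, 0, 0, 1, 3, 1, 0]   # the same feature shifted right by 2
print(whitened_shift(a, b))    # -> 2
```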
The CS project is to build a web-based UI to
submit, monitor, steer, and evaluate compute
tasks for EM-based reconstructions.

- We already have command-based programs to do the processing on PSC compute clusters and storage facilities.
- Biologists who capture the raw data at distant sites need a friendly and portable interface to transfer datasets, enter notes, automatically initiate compute jobs, track progress, etc.
- We will provide PSC computing accounts and office space to work with PSC staff and other students working on different aspects of our connectomics projects.
- You will gain experience with large scale data handling and computer operations at a major supercomputing site.
- Valuable background includes web development skills, basic computer graphics, a multidisciplinary approach to problem solving, and an interest in computational biology.
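As a concrete sketch of the server-side piece such a UI needs (a minimal stdlib-only toy under hypothetical names; the real system would submit to PSC's batch scheduler rather than launch processes directly): wrap an existing command-line program in a job record the web front end can submit and poll.

```python
import subprocess
import sys
import uuid

jobs = {}  # job_id -> {"cmd": ..., "proc": ..., "notes": ...}

def submit(cmd, notes=""):
    """Launch an existing command-line tool; return an id the UI can track."""
    job_id = str(uuid.uuid4())
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    jobs[job_id] = {"cmd": cmd, "proc": proc, "notes": notes}
    return job_id

def status(job_id):
    """What a progress page would display for one job."""
    rc = jobs[job_id]["proc"].poll()
    return "running" if rc is None else f"finished (exit {rc})"

# Stand-in for a real alignment command:
jid = submit([sys.executable, "-c", "print('aligning sections...')"],
             notes="test run on prescan data")
jobs[jid]["proc"].wait()
print(status(jid))   # -> finished (exit 0)
```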