Wojtek J Goscinski, Paul McIntosh, Ulrich Felzmann, Anton Maksimenko, Christopher J Hall, Timur Gureyev, Darren Thompson, Andrew Janke, Graham Galloway, Neil E B Killeen, Parnesh Raniga, Owen Kaluza, Amanda Ng, Govinda Poudel, David G Barnes, Toan Nguyen, Paul Bonnington, Gary F Egan.
Abstract
The Multi-modal Australian ScienceS Imaging and Visualization Environment (MASSIVE) is a national imaging and visualization facility established by Monash University, the Australian Synchrotron, the Commonwealth Scientific and Industrial Research Organisation (CSIRO), and the Victorian Partnership for Advanced Computing (VPAC), with funding from the National Computational Infrastructure and the Victorian Government. The MASSIVE facility provides hardware, software, and expertise to drive research in the biomedical sciences, particularly advanced brain imaging research using synchrotron x-ray and infrared imaging, functional and structural magnetic resonance imaging (MRI), x-ray computed tomography (CT), electron microscopy, and optical microscopy. The development of MASSIVE has been based on best practice in system integration methodologies, frameworks, and architectures. The facility has: (i) integrated multiple different neuroimaging analysis software components, (ii) enabled cross-platform and cross-modality integration of neuroinformatics tools, and (iii) brought together neuroimaging databases and analysis workflows. MASSIVE is now operational as a nationally distributed and integrated facility for neuroinformatics and brain imaging research.
Keywords: CT reconstruction; Huntington's disease; Quantitative susceptibility mapping; cloud computing; digital atlasing; high performance computing; instrument integration; neuroinformatics infrastructure
Year: 2014 PMID: 24734019 PMCID: PMC3973921 DOI: 10.3389/fninf.2014.00030
Source DB: PubMed Journal: Front Neuroinform ISSN: 1662-5196 Impact factor: 4.081
Figure 1. The Australian high performance computing (HPC) environment including peak (national) facilities, specialized national facilities, and local HPC facilities.
Technical specifications of the MASSIVE high performance computing system.
M1: 42 nodes (504 CPU-cores total) in one configuration:
- 42 nodes with 12 cores per node running at 2.66 GHz, 48 GB RAM per node (2016 GB RAM total), 2 × NVIDIA M2070 GPUs with 6 GB GDDR5 per node (84 GPUs total)
- 153 TB of fast-access parallel file system
- 4 × QDR InfiniBand interconnect

M2: 118 nodes (1720 CPU-cores total) in four configurations:
- 32 nodes with 12 cores per node running at 2.66 GHz, 48 GB RAM per node (1536 GB RAM total), 2 × NVIDIA M2070 GPUs with 6 GB GDDR5 per node (64 GPUs total)
- 10 nodes with 12 cores per node (visualization/high-memory configuration), 192 GB RAM per node (1920 GB RAM total), 2 × NVIDIA M2070Q GPUs with 6 GB GDDR5 per node (20 GPUs total)
- 56 nodes with 16 cores per node running at 2.66 GHz, 64 GB RAM per node (3584 GB RAM total); 9 of these nodes carry 2 × NVIDIA K20 GPUs (18 GPUs total) and 10 carry 2 × Intel Phi coprocessors (20 coprocessors total)
- 20 nodes with 16 cores per node running at 2.66 GHz, 128 GB RAM per node (2560 GB RAM total), 2 × NVIDIA K20 GPUs per node (40 GPUs total)
- 345 TB of fast-access parallel file system
- 4 × QDR InfiniBand interconnect

Combined, M1 and M2 have 2224 CPU-cores.
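The per-configuration figures above are internally consistent with the quoted totals; a short arithmetic sketch (configuration tuples transcribed from the specifications, all names illustrative) reproduces them:

```python
# Node configurations transcribed from the specifications above:
# (nodes, cores_per_node, ram_gb_per_node)
M1 = [(42, 12, 48)]
M2 = [
    (32, 12, 48),    # M2070 GPU nodes
    (10, 12, 192),   # visualization / high-memory nodes (M2070Q)
    (56, 16, 64),    # 9 of these carry 2 x K20, 10 carry 2 x Intel Phi
    (20, 16, 128),   # K20 GPU nodes
]

def totals(configs):
    """Sum nodes, CPU cores, and RAM (GB) across a list of configurations."""
    nodes = sum(n for n, _, _ in configs)
    cores = sum(n * c for n, c, _ in configs)
    ram_gb = sum(n * r for n, _, r in configs)
    return nodes, cores, ram_gb

print(totals(M1))                      # (42, 504, 2016)
print(totals(M2))                      # (118, 1720, 9600)
print(totals(M1)[1] + totals(M2)[1])   # 2224 CPU-cores combined
```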
Figure 2. A schematic of the integration of access to imaging instrumentation from the MASSIVE Desktop and the cloud via the Characterization Virtual Laboratory.
The computational systems and file system access associated with the imaging instrumentation integrated with MASSIVE and the Characterization Virtual Laboratory.
| Instrument | Data management integration | MASSIVE capability used | Application |
| --- | --- | --- | --- |
| Imaging and Medical Beamline | File system integration | GPU processing, parallel FS, and interactive visualization | CT reconstruction and visualization |
| Macromolecular Crystallography Beamline | File system mount | Compute | Structural determination |
| Infrared Beamline | | Compute | Signal correction |
| X-ray Fluorescence Microprobe Beamline | | Parallel FS and interactive visualization | Analysis |
| Small Angle and Wide Angle X-ray Scattering | | Compute | Modeling |
| CT and MRI Imaging Instruments | DaRIS | GPU processing, parallel FS, and interactive visualization | Data capture, analysis, and visualization |
| Electron Microscopes | Tardis | Parallel FS, cloud computing, and interactive visualization | Data capture, analysis, and visualization |
| Biomedical X-ray sources | File system mount | GPU processing, parallel FS, and interactive visualization | CT reconstruction and visualization |
| Atom Probes | Tardis | Cloud computing and interactive visualization | Analysis |
| Electron Microscopes | Tardis | GPU processing, parallel FS, and interactive visualization | Structural determination and visualization |
| Micro-CT X-ray sources | File system mount | | CT reconstruction and visualization |
| Soft X-ray Beamline | | | CT reconstruction |
Figure 3. Schematic of the neuroscience image data flow from Monash Biomedical Imaging and the computational processing performed on M2.
Figure 4. Schematic of the architecture of the IMBL CT Reconstruction service provided on M1.
Figure 5. The total reconstruction time for CT reconstruction of an 8912.
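Figures 4 and 5 concern the IMBL CT reconstruction service. As background on what such a service computes, the standard algorithm for parallel-beam CT is filtered back-projection: ramp-filter each projection in the Fourier domain, then smear each filtered projection back across the image. The sketch below is a minimal NumPy illustration on a toy phantom; it is not the MASSIVE/IMBL implementation (which runs on GPUs at synchrotron scale), and all names and parameters are illustrative.

```python
import numpy as np

def radon(image, angles):
    """Parallel-beam forward projection (nearest-neighbour sampling)."""
    n = image.shape[0]
    c = (n - 1) / 2.0
    coords = np.arange(n) - c
    t, s = np.meshgrid(coords, coords, indexing="ij")  # detector bin t, ray parameter s
    sino = np.zeros((len(angles), n))
    for i, th in enumerate(angles):
        x = np.round(c + t * np.cos(th) - s * np.sin(th)).astype(int)
        y = np.round(c + t * np.sin(th) + s * np.cos(th)).astype(int)
        inside = (x >= 0) & (x < n) & (y >= 0) & (y < n)
        vals = np.where(inside, image[np.clip(y, 0, n - 1), np.clip(x, 0, n - 1)], 0.0)
        sino[i] = vals.sum(axis=1)                     # integrate along each ray
    return sino

def fbp(sino, angles):
    """Filtered back-projection: ramp-filter each projection, smear it back."""
    n = sino.shape[1]
    c = (n - 1) / 2.0
    ramp = np.abs(np.fft.fftfreq(n))                   # ideal ramp filter |f|
    filtered = np.real(np.fft.ifft(np.fft.fft(sino, axis=1) * ramp, axis=1))
    xs = np.arange(n) - c
    x, y = np.meshgrid(xs, xs, indexing="xy")
    recon = np.zeros((n, n))
    for i, th in enumerate(angles):
        t = x * np.cos(th) + y * np.sin(th)            # detector coordinate of each pixel
        idx = np.clip(np.round(t + c).astype(int), 0, n - 1)
        recon += filtered[i][idx]
    return recon * np.pi / len(angles)

# Toy phantom: a bright disk, reconstructed from 90 projections over 180 degrees.
n = 64
yy, xx = np.mgrid[:n, :n]
phantom = (((xx - n // 2) ** 2 + (yy - n // 2) ** 2) < 10 ** 2).astype(float)
angles = np.linspace(0, np.pi, 90, endpoint=False)
recon = fbp(radon(phantom, angles), angles)
# The reconstructed disk interior should be clearly brighter than the background.
```

Reconstruction time scales with the number of projections times the image area, which is why a production service parallelizes the backprojection loop across GPUs, as the M1 service does.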
Figure 6. The MASSIVE Desktop environment showing FSLView and a range of neuroinformatics tools available through the menu.