
Digital Science Center I


Abstract

This poster introduces the DSC projects below, covering: 1) Digital Science Center Facilities; 2) RaPyDLI Deep Learning Environment; 3) SPIDAL Scalable Data Analytics Library and applications, including Bioinformatics and Polar Remote Sensing Data Analysis; 4) MIDAS Big Data Software and Harp for HPC-ABDS; 5) Big Data Ogres Classification and Big Data Analytics Performance, including communication, VM, and Java overhead; 6) CloudIOT Internet of Things Environment; 7) Cloudmesh Cloud and Bare-metal Automation and NIST use cases; 8) XSEDE TAS Monitoring of citations and system metrics; 9) Visualization with WebPlotviz.
Geoffrey C. Fox, David Crandall, Judy Qiu, Gregor von Laszewski, Fugang Wang, Badi' Abdul-Wahid
Saliya Ekanayake, Supun Kamburugamuva, Jerome Mitchell, Bingjing Zhang, Pulasthi Wickramasinghe, Hyungro Lee, Andrew Younge
School of Informatics and Computing, Indiana University
Ice Layer Detection Algorithm
The polar science community has built radars capable of surveying the polar ice sheets and, as a result, has collected terabytes of data; its repository grows each year as signal processing techniques improve and the cost of hard drives decreases, enabling a new generation of high-resolution ice thickness and accumulation maps. Manually extracting layers from such an enormous corpus of ice thickness and accumulation data is time-consuming and requires sparse hand-selection, so developing image processing techniques that automatically aid in the discovery of knowledge is of high importance.
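As a rough illustration of the kind of per-trace image processing such automation builds on, the sketch below picks a single bright layer boundary in a synthetic echogram. It is a minimal baseline only, not the poster's detection algorithm; the array layout, function name, and smoothing window are assumptions made for the example.

```python
# Minimal baseline for picking one layer boundary in a radar echogram,
# assuming a 2D numpy array (rows = depth bins, columns = along-track
# traces). Illustrative only; not the poster's detection algorithm.
import numpy as np

def pick_layer(echogram: np.ndarray, smooth: int = 5) -> np.ndarray:
    """Return, per trace, the depth bin of the strongest downward
    intensity increase (a candidate bright layer boundary)."""
    grad = np.diff(echogram, axis=0)    # vertical intensity gradient
    picks = np.argmax(grad, axis=0)     # strongest transition per trace
    # Median-filter the picks along-track to suppress per-trace noise.
    pad = smooth // 2
    padded = np.pad(picks, pad, mode="edge")
    return np.array([np.median(padded[i:i + smooth])
                     for i in range(len(picks))]).astype(int)

# Synthetic example: a flat bright layer starting at depth bin 40 plus noise.
rng = np.random.default_rng(0)
echo = rng.normal(0, 0.1, (100, 50))
echo[40:, :] += 1.0                     # bright region below the layer
print(pick_layer(echo))                 # ~bin 39, just above the layer
```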
Figure: DA-MDS speedup for 200K points with different optimization techniques.
Figure: Java K-Means performance (1 million points, 1k centers) on 16 nodes for LRT-FJ and LRT-BSP with varying affinity patterns over varying threads and processes.
High Performance Molecular Dynamics in Cloud Infrastructure
Large potential for running MD simulations in virtualized infrastructure using the KVM hypervisor
Advanced hardware: InfiniBand and GPUs in VMs
LAMMPS MD shows 1.9% overhead, i.e., near-native performance
KVM can outperform native execution with huge pages
Building scalable high-performance virtual clusters (a configuration sketch follows)
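For concreteness, the sketch below shows how such a guest might be defined through the libvirt Python bindings, with hugepage-backed memory and a PCI device (e.g., an InfiniBand HCA or GPU) passed through to the VM. The domain name, sizes, and PCI address are placeholders, and the XML is deliberately minimal (no disk, network, or firmware), so this is a configuration sketch under stated assumptions rather than the poster's actual setup.

```python
# Sketch: define a KVM guest with hugepage-backed memory and PCI
# passthrough via the libvirt Python bindings. Name, sizes, and the
# PCI address are placeholders; a bootable domain needs more elements.
import libvirt

DOMAIN_XML = """
<domain type='kvm'>
  <name>md-compute0</name>
  <memory unit='GiB'>16</memory>
  <memoryBacking>
    <hugepages/>  <!-- back guest RAM with host huge pages -->
  </memoryBacking>
  <vcpu placement='static'>16</vcpu>
  <os><type arch='x86_64'>hvm</type></os>
  <devices>
    <!-- Pass a host PCI device (placeholder address) into the guest -->
    <hostdev mode='subsystem' type='pci' managed='yes'>
      <source>
        <address domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
      </source>
    </hostdev>
  </devices>
</domain>
"""

conn = libvirt.open("qemu:///system")  # connect to the local hypervisor
dom = conn.defineXML(DOMAIN_XML)       # register the persistent domain
dom.create()                           # boot the guest
```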
Polar Remote Sensing Algorithms

Figure: Original LDA (orange) compared to LDA exploiting sparseness (blue).
Note that the data analytics make use of InfiniBand (i.e., they are limited by communication!)
Java code running under Harp (Hadoop plus an HPC plugin)
Corpus: 3,775,554 Wikipedia documents; vocabulary: 1 million words; topics: 10k
BR II is the Big Red II supercomputer with Cray Gemini interconnect
Juliet is a Haswell cluster with Intel (switch) and Mellanox (node) InfiniBand (not optimized)
Parallel Sparse LDA using Harp
Figure: Harp LDA on Juliet (36-core nodes); best MPI, inter- and intra-node.
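To make the sparseness comparison concrete, here is a toy, single-node collapsed Gibbs sampler for LDA. It is only a sketch of the underlying algorithm: the poster's Harp LDA is distributed and exploits sparsity in the topic counts, which this dense version does not, and all function and variable names are illustrative.

```python
# Toy collapsed Gibbs sampler for LDA (single node, dense counts).
# Harp's parallel sparse LDA distributes and sparsifies this computation.
import numpy as np

def lda_gibbs(docs, n_topics, n_vocab, n_iter=50, alpha=0.1, beta=0.01, seed=0):
    """Sample topic assignments for docs given as lists of word ids."""
    rng = np.random.default_rng(seed)
    ndk = np.zeros((len(docs), n_topics))  # per-document topic counts
    nkw = np.zeros((n_topics, n_vocab))    # per-topic word counts
    nk = np.zeros(n_topics)                # per-topic totals
    z = [rng.integers(n_topics, size=len(doc)) for doc in docs]
    for d, doc in enumerate(docs):         # initialize counts from random z
        for w, k in zip(doc, z[d]):
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    for _ in range(n_iter):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]                # remove the token's assignment
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # Dense conditional distribution; sparse LDA variants
                # decompose it so only topics with nonzero counts are
                # visited per token, which is where the speedup comes from.
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + n_vocab * beta)
                k = rng.choice(n_topics, p=p / p.sum())
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    return ndk, nkw

# Toy usage: 3 documents over a 6-word vocabulary, 2 topics.
docs = [[0, 1, 0, 2], [3, 4, 3, 5], [0, 3, 1, 4]]
doc_topic, topic_word = lda_gibbs(docs, n_topics=2, n_vocab=6)
```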
Digital Science Center Research Areas
Digital Science Center Facilities
RaPyDLI Deep Learning Environment
SPIDAL Scalable Data Analytics Library and
applications including Bioinformatics and Polar
Remote Sensing Data Analysis
MIDAS Big Data Software; Harp for HPC-ABDS
Big Data Ogres Classification and Big Data
Analytics Performance including
communication, VM and Java overhead
CloudIOT Internet of Things Environment
Cloudmesh Cloud and Bare metal Automation
and NIST use cases
XSEDE TAS Monitoring citations and system
metrics
Visualization WebPlotviz
Digital Science Center Facilities

Name    | System Type             | # Nodes | # CPUs      | # Cores | RAM (GB) | Storage (TB)
India   | IBM iDataPlex           | 128     | 256         | 1024    | 3072     | 335
Bravo   | HP Proliant             | 16      | 32          | 128     | 3072     | 128
Delta   | SuperMicro GPU Cluster  | 16      | 32 + 32 GPU | 192     | 1333     | 144
Echo    | SuperMicro Cluster      | 16      | 32          | 192     | 6144     | 192
Madrid  | Dell HPC Cluster        | 8       | 32          | 128     | 384      | 28.8 HDD; 6.8 SSD
Tempest | HP Proliant HPC Cluster | 32      | 128         | 768     | 1536     | 25
Juliet  | SuperMicro HPC Cluster  | 128     | 256         | 3456    | 16384    | 1024 HDD; 50 SSD
Romeo   | SuperMicro GPU Cluster  | 4       | 8 + 16 GPU  | 96      | 512      | 32