Center for Financial Innovative Technology
PI: John Wu, Marcos Lopez de Prado
Computational Research Division
DIRAC1
Dell PowerEdge C6320 cluster
128 dual-socket, 12-core Intel Haswell processor nodes
32,768 GB aggregate memory
Mellanox FDR Infiniband interconnect
113 TF (theoretical peak)
|
Molecular Foundry Theory Group
PI: David Prendergast
Molecular Foundry Division
ETNA
Finetec Supermicro cluster
175 dual-socket, 12-core Intel E5-2670v3 2.3 GHz Haswell processor nodes
9 dual K80 GPU nodes
11,200 GB aggregate memory
Mellanox FDR Infiniband interconnect
155 TF (theoretical peak)
|
Joint Center for Artificial Photosynthesis
PI: Peidong Yang, Martin Head-Gordon, Phillip Geissler
JCAP
Finetec Supermicro Cluster
46 quad-socket, 16-core AMD 2.3 GHz Interlagos processor nodes
8.4 TB aggregate memory
27 TF (theoretical peak)
|
Molecular Foundry Theory Group
PI: David Prendergast
Molecular Foundry Division
VULCAN
Dell PowerEdge R610 Cluster
242 dual-socket, quad-core Intel 2.4 GHz Nehalem processor nodes
5,808 GB aggregate memory
48 TB BlueArc NFS storage
60 TB DDN S2A6620 Lustre storage
Voltaire QDR Infiniband
18.5 TF (theoretical peak)
|
Molecular Foundry Theory Group
PI: David Prendergast
Molecular Foundry Division
NANO1
IBM iDataPlex Cluster
80 dual-socket, quad-core Intel 2.66 GHz Nehalem processor nodes
1920 GB aggregate memory
Voltaire QDR Infiniband
6.8 TF (theoretical peak)
|
ALS-U Accelerator Physics Group
PI: David Robin
Advanced Light Source Division
ALSACC
Dell C6100, C6220, C6220 II, SYS-6028TP-HTFR SuperServer, SYS-6029TP-HTR SuperServer cluster
28 dual-socket, hex-core Intel 2.67 GHz Westmere processor nodes
16 dual-socket, 8-core Intel 2.60 GHz Sandy Bridge processor nodes
12 dual-socket, 10-core Intel 2.5 GHz Ivy Bridge processor nodes
8 dual-socket, 12-core Intel 2.3 GHz Haswell processor nodes
4 dual-socket, 32-core Intel Xeon Gold 5218 2.30 GHz processor nodes
4.60 TB aggregate memory
Mellanox Infiniband (QDR, FDR, EDR)
17.47 TF (theoretical peak)
|
David M. Romps
Earth Sciences Division
CUMULUS
Dell C6100 Lawrencium LR2 Condo
28 dual-socket, hex-core Intel 2.66 GHz Westmere processor nodes
672 GB aggregate memory
24 TB BlueArc disk storage
Mellanox QDR Infiniband
3.6 TF (theoretical peak)
|
|
5 dual-socket, 16-core Intel nodes
1 dual-socket, 44-core Intel node
1.6 TB aggregate memory
16 NVIDIA TITAN (43,008 GPU cores)
4 NVIDIA K80 (19,968 GPU cores)
4 NVIDIA TITAN X (15,416 GPU cores)
78,392 total GPU cores
260 TB shared disk storage
Mellanox FDR Infiniband
2.94 GPU PFLOPS (theoretical peak)
|
Ernest L. Majer
Greg Newman
Earth Sciences Division
VOLTAIRE
Dell PowerEdge C6100 Cluster
44 dual-socket, hex-core Intel 2.66 GHz Westmere processor nodes
1056 GB aggregate memory
24 TB BlueArc disk storage
Mellanox QDR Infiniband
5.6 TF (theoretical peak)
|
Eric Sonnenthal
Earth Sciences Division
EXPLORER
Finetec SuperMicro 2U Twin2 Cluster
8 dual-socket, quad-core 3.3 GHz Intel E5-2643 Sandy Bridge processor nodes
192 GB aggregate memory
Mellanox FDR-14 Infiniband
1.6 TF (theoretical peak)
|
Kai Vetter
Brian Quiter
Nuclear Sciences Division
ARES
Dell PowerEdge R410 cluster
21 dual-socket, quad-core Intel 2.4 GHz Westmere processor nodes
336 GB aggregate memory
2.4 TF (theoretical peak)
|
Andrew Canning
Computational Research Division
BALDUR1
IBM iDataPlex Cluster
40 dual-socket, quad-core Intel 2.66 GHz Nehalem processor nodes
960 GB aggregate memory
Voltaire QDR Infiniband
3.4 TF (theoretical peak)
|
Martin Head-Gordon
Chemical Sciences Division
MHG
21 AMD Opteron 246 2.0 GHz single-core processor nodes
8 AMD Opteron 270 2.0 GHz dual-core processor nodes
272 GB aggregate memory
1.5 TB shared disk storage
296 GFLOPS (theoretical peak)
|
Paul D. Adams
Joint Bioenergy Institute
JBEI1
IBM iDataPlex Cluster
120 dual-socket, quad-core Intel 2.66 GHz Nehalem processor nodes
2880 GB aggregate memory
Voltaire QDR Infiniband
10.2 TF (theoretical peak)
|
Materials Sciences Division
CATAMOUNT
Dell PowerEdge C6220 Cluster
114 dual-socket, 8-core Intel 2.6 GHz Sandy Bridge processor nodes
7296 GB aggregate memory
24 TB BlueArc disk storage
Mellanox FDR Infiniband interconnect
38 TF (theoretical peak)
|
Nobumichi Tamura
Advanced Light Source Division
XMAS Xray Microdiffraction User Facility
Dell PowerEdge C6320 cluster
20 dual-socket, 14-core Intel Broadwell processor nodes
1.25 TB aggregate memory
Mellanox FDR Infiniband interconnect
21.50 TF (theoretical peak)
|
Steven G. Louie
Marvin L. Cohen
Materials Sciences Division
HBAR1
IBM iDataPlex dx360m4 Cluster
80 dual-socket, quad-core Intel 2.66 GHz Nehalem processor nodes
1920 GB aggregate memory
Voltaire QDR Infiniband
6.8 TF (theoretical peak)
|
|
UC Berkeley PI-owned Clusters
|
Bruno Olshausen
UCB Helen Wills Neuroscience Institute & School of Optometry, and Redwood Center for Theoretical Neuroscience
Dell PowerEdge R210 Cluster
cortex.berkeley.edu
10 Intel E3-1220v2 3.10 GHz processor nodes
2 NVIDIA M2050 GPU nodes
160 GB aggregate memory
Gigabit Ethernet interconnect
|
Donna Hendrix
UC Berkeley QB3 Computational Genomics Research Lab
vector.berkeley.edu
5 dual-socket, hex-core 2.66 GHz Intel Westmere processor nodes (60 cores)
240 GB aggregate memory
24 TB BlueArc Mercury 55 storage
Gigabit Ethernet
638 GFLOPS (theoretical peak)
|