OSG is a network of scientific computing resources devoted to furthering scientific discovery. A large portion of this mission consists of processing data from CERN's LHC.
TeraGrid is a network of eleven sites combining their HPC resources to form a high performance grid for open scientific research.
Argonne National Laboratory has a pair of IBM Blue Gene/P supercomputers (Intrepid and Surveyor) as well as a GPU-based
visualization machine (Eureka). Intrepid has 40,960 nodes and ranks #8 on the Top 500 list
with 458.61 teraflops performance on the Linpack benchmark and a theoretical peak of 557.06
teraflops. Surveyor has 1,024 nodes with a theoretical peak performance of 13.9 teraflops.
Intrepid's Top 500 page
Compute Resource Information
How to Use
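The theoretical-peak figures for Intrepid and Surveyor follow directly from the Blue Gene/P node design: four PowerPC 450 cores per node at 850 MHz, each able to retire four floating-point operations per cycle. A minimal sketch of that arithmetic (the per-node figures are standard Blue Gene/P specifications, not stated in the text above):

```python
# Theoretical peak = nodes x cores/node x clock (Hz) x flops/cycle.
# Per-node figures are the standard Blue Gene/P specs:
# 4 PowerPC 450 cores at 850 MHz, 4 flops/cycle per core.
CORES_PER_NODE = 4
CLOCK_HZ = 850e6
FLOPS_PER_CYCLE = 4

def peak_teraflops(nodes: int) -> float:
    """Aggregate theoretical peak in teraflops for a Blue Gene/P system."""
    return nodes * CORES_PER_NODE * CLOCK_HZ * FLOPS_PER_CYCLE / 1e12

print(peak_teraflops(40960))  # Intrepid: ~557.06 teraflops
print(peak_teraflops(1024))   # Surveyor: ~13.9 teraflops
```

Both results agree with the peak figures quoted above for the two machines.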
Brookhaven National Laboratory has an IBM Blue Gene/L system named New York Blue/L, which ranks #58 on the Top 500 list. In
addition to the 18-rack Blue Gene/L system, there is a 2-rack Blue Gene/P system (New York Blue/P).
New York Blue/L's Top500 Page
New York Blue/L User Guide
New York Blue/P User Guide
New York Blue/L Software
LBNL has a Cray XT4 supercomputer named Franklin, which is ranked #15 on the Top 500 list.
LBNL also operates several other clusters, as well as a Cray XT5 system named Hopper
which is currently under construction.
Franklin's Top500 page
Supercomputing Resources at LBNL
Lawrence Livermore National Laboratory possesses 3 IBM supercomputers that rank in the Top 500: ASC Purple at #66, Dawn at #11,
and Blue Gene/L at #7. LLNL also has 5 Appro supercomputers in the Top 500: Juno at #27,
Hera at #44, Graph at #57, Atlas at #150, and Minos at #272. LLNL also
possesses several other clusters that do not rank in the Top 500.
Blue Gene/L's Top 500 Page
Dawn's Top 500 Page
ASC Purple's Top 500 Page
Juno's Top 500 Page
Hera's Top 500 Page
Graph's Top 500 Page
Atlas' Top 500 Page
Minos' Top 500 Page
LLNL Supercomputing Resources Page
LLNL Supercomputing Allocations Page
How to Run Jobs on LLNL Resources
NETL has three clusters, none of which rank in the Top 500. One of the clusters uses 256 Intel
Xeon processors with a gigabit Ethernet interconnect; its LINPACK performance is rated at 961
gigaflops, against a theoretical peak of 1,567 gigaflops. The other two clusters use AMD Opteron
processors and have lower performance.
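The gap between measured LINPACK performance and theoretical peak is conventionally summarized as an efficiency ratio (Rmax over Rpeak). A quick sketch using figures quoted in this document:

```python
def linpack_efficiency(rmax: float, rpeak: float) -> float:
    """Fraction of theoretical peak achieved on the LINPACK benchmark."""
    return rmax / rpeak

# NETL's Xeon cluster: 961 of 1,567 gigaflops.
print(f"{linpack_efficiency(961, 1567):.1%}")
# ANL's Intrepid: 458.61 of 557.06 teraflops.
print(f"{linpack_efficiency(458.61, 557.06):.1%}")
```

The Xeon cluster achieves roughly 61% of peak, while Intrepid's tightly coupled torus interconnect helps it reach roughly 82%; commodity gigabit Ethernet clusters typically sit at the lower end of this range.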
New Brunswick Laboratory does not publicize information on their high performance computing resources.
ORNL houses multiple powerful supercomputers, five of which are on the Top 500 list. The
computers on the Top 500 list are Jaguar XT5 (#1), Kraken (#3), Jaguar XT4 (#16), Athena
(#30), and an IBM Blue Gene/P (#379). Jaguar XT5 and Kraken are both Cray XT5 systems, while
Jaguar XT4 and Athena are Cray XT4 systems. Kraken and Athena are operated in conjunction
with NICS and the University of Tennessee. In addition to the five Top 500 computers, ORNL
has many smaller high performance computing systems.
Jaguar XT5's Top 500 Page
Kraken's Top 500 Page
Jaguar XT4's Top 500 Page
Athena's Top 500 Page
Blue Gene/P's Top 500 Page
NCCS Computers at ORNL
NICS Computers at ORNL
RESL does not publish any information about their supercomputing resources.
TACC has 3 HPC installations, 2 of which are in the Top 500. Ranger is TACC's highest-ranking
computer at #9, followed by Lonestar at #105. The third system, Stampede, is a 1,736-node
Linux cluster. Each node contains two quad-core Intel Clovertown processors, which
deliver a peak performance of 16 teraflops. TACC also has several visualization machines.
Ranger's Top 500 Page
Lonestar's Top 500 Page
TACC HPC Resources
Software Available on TACC Computers
OSC has two HPC clusters: Glenn and OSC BALE. Glenn is ranked #107 on the Top 500 list,
while OSC BALE does not rank. OSC BALE consists of two subclusters: an eighteen-node
visualization cluster and a workstation cluster. The visualization nodes each have two
dual-core AMD Opteron CPUs at 2.6GHz, and two NVIDIA Quadro FX 5600 graphics cards. The
workstation nodes each have a single AMD Athlon X2 4200+ dual-core processor. Both the
visualization and workstation machines use InfiniBand as their interconnect.
Glenn's Top 500 Page
IU has two clusters: Big Red and Quarry. Big Red ranks #452 in the Top 500, while Quarry
does not rank. Quarry consists of 140 IBM HS21 blade servers with two quad-core Intel Xeon
5335 processors per node. The system uses gigabit Ethernet for its interconnect.
Big Red's Top 500 Page
Big Red's Hardware
Big Red's Software
How to use Big Red
How to use Quarry
Allocations via TeraGrid
LONI brings together several Louisiana universities' HPC resources into one network. The network
includes multiple small clusters as well as Queen Bee, which is ranked #163 on the Top 500.
Queen Bee's Top 500 Page
Accounts with LONI
NCSA is the future home of the Blue Waters supercomputer. Currently, however, it is home to 4
HPC clusters, one of which ranks in the Top 500. The ranking cluster is named Abe and
delivers 62.68 sustained teraflops, which puts it at #73. The other clusters are Lincoln, a
heterogeneous cluster with 192 Dell PowerEdge 1950 compute nodes (two quad-core Intel
Harpertown processors per node) and 96 NVIDIA Tesla S1070 accelerators, delivering 47 teraflops;
Cobalt, an SGI Altix system with 1,024 Intel Itanium 2 processors delivering 6.1 teraflops
sustained; and Mercury, an IBM system with 1,774 Intel Itanium 2 processors delivering 7.22
teraflops.
Abe's Top 500 Page
PSC has four HPC systems, including two SGI Altix machines and an HP C3000 machine. The SGI
systems are named Pople and Salk, and contain 768 and 144 cores, respectively. The HP
machine has 64 cores and is named Warhol. The final system is a twenty-node cluster named
Codon with two 1.4 GHz AMD Opteron processors per node. None of the systems rank in the Top 500.
SDSC has several clusters that are integrated with TeraGrid. However, none of the clusters are
fast enough to rank in the Top 500. The clusters include Triton, a three component resource
comprising compute, data analysis, and storage systems; Dash, a 5.2 teraflop, 68-node system
using two Intel Nehalem quad-core processors per node; and Bebop, a Sun X64 system using 8
quad-core processors dedicated to data analysis and mining.