Argonne National Laboratory and HPE Unveil Testbed Supercomputer

Argonne National Laboratory’s new system, Polaris, built by HPE, will help researchers optimize AI, engineering and scientific projects for the forthcoming Aurora exascale supercomputer.


Installation of the Polaris supercomputer system at the Argonne Leadership Computing Facility in August 2021. Polaris provides researchers with a new tool to prepare for science in the exascale era, when computers will perform a billion billion calculations per second. Image courtesy of Argonne National Laboratory.


The U.S. Department of Energy’s Argonne National Laboratory and Hewlett Packard Enterprise have unveiled a new testbed supercomputer that will deliver up to four times faster performance than Argonne’s current supercomputers, helping researchers prepare workloads for future exascale systems, the organizations report.

The new system, which Argonne has named Polaris, will be built by HPE, and hosted and managed by the Argonne Leadership Computing Facility (ALCF), a U.S. DOE Office of Science User Facility. It will enable scientists and developers to test and optimize software codes and applications to tackle a range of artificial intelligence (AI), engineering and scientific projects planned for the forthcoming exascale supercomputer Aurora, a collaboration between Argonne, Intel and HPE.

Polaris is designed with high-performance computing (HPC) and AI solutions to advance investigations into society’s complex issues, from understanding the biology of viruses to revealing the secrets of the universe. It will also augment Argonne’s ongoing efforts and achievements in areas such as clean energy, climate resilience and manufacturing.

In addition, Polaris will help researchers integrate HPC and AI with other experimental facilities, including Argonne’s Advanced Photon Source and the Center for Nanoscale Materials, both DOE Office of Science User Facilities.

“Polaris is well equipped to help move the ALCF into the exascale era of computational science by accelerating the application of AI capabilities to the growing data and simulation demands of our users,” says Michael E. Papka, director at the ALCF. “Beyond getting us ready for Aurora, Polaris will further provide a platform to experiment with the integration of supercomputers and large-scale experiment facilities, like the Advanced Photon Source, making HPC available to more scientific communities. Polaris will also provide a broader opportunity to help prototype and test the integration of HPC with real-time experiments and sensor networks.”

Polaris: Argonne’s North Star

Polaris will deliver approximately 44 petaflops of peak double-precision performance and nearly 1.4 exaflops of theoretical AI performance, based on mixed-precision compute capabilities.
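Those headline figures are consistent with NVIDIA’s published per-GPU peaks for the A100 (19.5 teraflops of FP64 Tensor Core throughput; 624 teraflops of FP16/BF16 Tensor Core throughput with structured sparsity) multiplied across the 2,240 GPUs itemized in the hardware list below. The snippet that follows is only a back-of-the-envelope sketch of that arithmetic, not an official derivation:

```python
# Back-of-the-envelope check of Polaris's quoted peaks.
# Per-GPU figures are NVIDIA's published A100 peaks; deriving the system
# totals this way is our assumption, not an official breakdown.

NUM_GPUS = 2240               # A100 count from the hardware list below
FP64_TENSOR_TFLOPS = 19.5     # A100 peak FP64 via Tensor Cores (teraflops)
AI_TFLOPS_SPARSE = 624.0      # A100 peak FP16/BF16 Tensor Core with sparsity

peak_fp64_pflops = NUM_GPUS * FP64_TENSOR_TFLOPS / 1_000    # teraflops -> petaflops
peak_ai_eflops = NUM_GPUS * AI_TFLOPS_SPARSE / 1_000_000    # teraflops -> exaflops

print(f"Peak FP64: ~{peak_fp64_pflops:.1f} petaflops")  # ~43.7, matching "approximately 44"
print(f"Peak AI:   ~{peak_ai_eflops:.2f} exaflops")     # ~1.40, matching "nearly 1.4"
```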

It will be built from 280 HPE Apollo Gen10 Plus systems, an HPC and AI architecture designed for the exascale era and customized to include the following end-to-end solutions (a quick check of the component arithmetic follows the list):

  • compute to improve modeling, simulation and data-intensive workflows, using 560 2nd and 3rd Gen AMD EPYC processors;
  • supercharged AI capabilities to support data- and image-intensive workloads while optimizing future exascale-level GPU-enabled deployments, using 2,240 NVIDIA A100 Tensor Core GPUs;
  • higher speed and congestion control for large data-intensive and AI workloads, using HPE Slingshot, a high-performance Ethernet fabric designed for HPC and AI solutions; and
  • fine-grained, centralized monitoring and management for optimal performance, using HPE Performance Cluster Manager, a system management software solution.
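As promised above, a quick consistency check: the component counts divide evenly across the 280 Apollo systems, implying two EPYC processors and eight A100 GPUs per chassis. That per-chassis layout is inferred from the totals rather than stated in the release; the sketch below simply verifies the division:

```python
# Sanity check: do the component totals divide evenly across the systems?
# The resulting per-chassis layout is inferred, not stated by HPE or Argonne.

SYSTEMS = 280    # HPE Apollo Gen10 Plus systems
CPUS = 560       # AMD EPYC processors
GPUS = 2240      # NVIDIA A100 Tensor Core GPUs

assert CPUS % SYSTEMS == 0 and GPUS % SYSTEMS == 0
print(f"{CPUS // SYSTEMS} CPUs and {GPUS // SYSTEMS} GPUs per Apollo system")
# -> 2 CPUs and 8 GPUs per Apollo system
```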

“As we approach the exascale era, which will power a new age of insight and innovation, high-performance computing (HPC) will play a critical role in harnessing data to take on the world’s most pressing challenges. Increasingly, the computational power and scale required to process artificial intelligence and machine learning data sets can only be delivered through HPC systems, and HPE uniquely provides a powerful, software-driven platform capable of tackling complex scientific data and simulations,” says Justin Hotard, senior vice president and general manager, HPC and Mission Critical Solutions at HPE.

“The U.S. Department of Energy’s (DOE) Office of Science continues to make tremendous impacts in accelerating scientific and engineering breakthroughs using HPC,” Hotard adds. “Our latest collaboration with the DOE’s Argonne National Laboratory to build and deliver the Polaris testbed supercomputer will further its mission by preparing users for the magnitude of technological advancement that exascale systems will deliver.”

Polaris Prepares Scientists

Initially, Polaris will be dedicated to research teams participating in initiatives such as the DOE’s Exascale Computing Project and the ALCF’s Aurora Early Science Program.

User communities within the DOE’s Exascale Computing Project will also use Polaris to optimize engineering tasks for Argonne’s Aurora, including the scaling of combined CPU- and GPU-enabled systems and the complex integration of workflows that combine modeling, simulation, AI and other data-intensive components.

Delivery and installation of Polaris are scheduled to begin in August. The system will go into use in early 2022 and will open to the broader HPC community in spring 2022 to prepare workloads for the next generation of the DOE’s high-performance computing resources.

Sources: Press materials received from the company and additional information gleaned from the company’s website.

About the Author

DE Editors

DE’s editors contribute news and new product announcements to Digital Engineering. Press releases may be sent to them via DE-Editors@digitaleng.news.
