SHERIDAN, WYOMING – October 30, 2024 – ASUS, a global technology leader, recently showcased its latest AI supercomputing solutions at the 2024 OCP Global Summit held in San Jose, California, from October 15-17. The company highlighted its commitment to advancing AI and data center technologies with a comprehensive lineup of AI servers powered by the NVIDIA Blackwell platform.
ASUS's Legacy of Collaboration in the Server Industry
ASUS has a long history of collaboration with cloud service providers, dating back to 2008 when it began designing server motherboards for leading cloud data centers. This extensive experience in the server industry has enabled ASUS to develop a deep understanding of the evolving needs of data centers and AI applications.
Accelerating AI Innovation with NVIDIA Blackwell
At the OCP Global Summit, ASUS showcased its solutions built with the NVIDIA Blackwell platform, designed to meet the demands of generative AI and usher in a new era of data center performance.
Introducing ESC AI POD: A Groundbreaking Rack Solution
One of the highlights of ASUS's showcase was the ESC AI POD, a revolutionary rack solution featuring the NVIDIA GB200 NVL72 system. This system combines 36 NVIDIA Grace CPUs and 72 NVIDIA Blackwell GPUs within a rack-scale NVIDIA NVLink domain to function as a single massive GPU.
"ASUS ESC AI POD, a heavyweight contender and star product, is a groundbreaking rack solution featuring the NVIDIA GB200 NVL72 system, which combines 36 NVIDIA Grace CPUs and 72 NVIDIA Blackwell GPUs within a rack-scale NVIDIA NVLink domain to function as a single massive GPU," ASUS stated.
The ESC AI POD offers both liquid-to-air and liquid-to-liquid cooling options for optimal efficiency, whether for an individual rack or an entire data center. It is engineered to accelerate large language model (LLM) inference, delivering real-time performance for resource-intensive applications.
ASUS also provides a suite of software solutions, system verification methodologies, and remote deployment strategies to support the ESC AI POD, accelerating AI development and scalability for data centers of all sizes.
ESC NM2-E1: A Powerful 2U Server for Generative AI
ASUS also showcased the ESC NM2-E1, a 2U server built on the NVIDIA GB200 NVL2 platform and purpose-built for generative AI and high-performance computing (HPC). This server delivers high-bandwidth communication and is optimized for the full NVIDIA software stack, providing an exceptional platform for AI developers and researchers.
Optimizing AI Performance with NVIDIA MGX and H200 GPUs
In addition to the ESC AI POD and ESC NM2-E1, ASUS presented the 8000A-E13P, a 4U server supporting up to eight NVIDIA H200 Tensor Core GPUs. This server is fully compliant with the NVIDIA MGX architecture and is designed for rapid deployment in large-scale enterprise AI infrastructures.
The 8000A-E13P improves east-west traffic and overall system performance through its NVIDIA 2-8-5 topology (CPU-GPU-DPU/NIC) and four high-bandwidth ConnectX-7 or BlueField-3 SuperNICs.
ASUS's participation in the 2024 OCP Global Summit demonstrated its dedication to driving innovation in AI and data center technologies. The company's cutting-edge solutions, powered by the NVIDIA Blackwell platform, are poised to transform the landscape of AI supercomputing.