The NF5488LA5, featuring high-efficiency liquid cooling, ranked No. 1 in 11 of the 16 tests in the closed data center division of the 2021 MLPerf™ Inference v1.0 benchmark.
Today at ISC High Performance 2021 Digital, the event for high performance computing, machine learning, and data analytics, Inspur Information, a leading IT infrastructure solution provider, announced its new liquid-cooled AI server, the NF5488LA5. Designed with a liquid cold plate and supporting up to eight NVIDIA® A100 Tensor Core GPUs fully interconnected via NVSwitch, the new offering is aimed at customers who need a high-performance, energy-efficient AI server.
Designed to meet the energy-saving requirements of High-Performance Computing (HPC) and Artificial Intelligence (AI), the NF5488LA5 is an update of Inspur's flagship AI server, the NF5488A5, adding liquid-cooling technology and support for the latest NVIDIA A100 Tensor Core GPU.
The NF5488LA5 is equipped with two AMD EPYC 7003 series processors and eight NVIDIA A100 Tensor Core GPUs in a 4U chassis, fully interconnected by NVSwitch. GPU-to-GPU communication bandwidth reaches 600 GB/s, enabling lower latency. The system topology adopts an ultra-low-latency design to maximize communication performance between the processors and the AI accelerators. With cooling efficiency greatly improved by industry-leading warm-water cooling technology, the new server meets the extreme computing needs of science, simulation, and AI.
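To put the quoted 600 GB/s NVSwitch figure in perspective, a back-of-the-envelope sketch of ideal peer-to-peer transfer time is shown below. The payload size and the helper function are illustrative assumptions, not vendor benchmark data, and real transfers carry protocol overhead.

```python
# Illustrative arithmetic only: ideal transfer time at the quoted
# 600 GB/s GPU-to-GPU bandwidth, ignoring protocol overhead.

def transfer_time_ms(payload_gb: float, bandwidth_gb_s: float = 600.0) -> float:
    """Ideal (overhead-free) transfer time in milliseconds."""
    return payload_gb / bandwidth_gb_s * 1000.0

# Hypothetical example: moving a 40 GB payload (one A100's memory)
# to a peer GPU at line rate:
print(round(transfer_time_ms(40.0), 1))  # ~66.7 ms
```

In practice, achieved bandwidth depends on message size, topology, and the collective-communication library in use, so measured times will be higher than this ideal figure.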
The liquid cold plate on the NF5488LA5 covers the CPUs, GPUs, and NVSwitches. Liquid cooling handles 80% of the system's total heat dissipation, effectively reducing Power Usage Effectiveness (PUE) to 1.1. The GPU cold plate is meticulously designed with four water loops connected in parallel, so the liquid flows consecutively over the surfaces of the GPUs and NVSwitches for high-efficiency cooling of the components that generate the most heat. This high-efficiency liquid cooling is among the major reasons the NF5488LA5 ranked No. 1 in 11 of the 16 tests in the closed data center division of the 2021 MLPerf™ Inference v1.0 benchmark. It was also the only submitted GPU server to run the NVIDIA A100 GPU at 500 W TDP via liquid cooling.
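For readers unfamiliar with the metric, PUE is the ratio of total facility power to IT equipment power, so a PUE of 1.1 means cooling and other overhead add only 10% on top of the IT load. A minimal sketch of the arithmetic, using hypothetical power figures (only the 1.1 target comes from the announcement):

```python
# PUE = total facility power / IT equipment power.
# The 100 kW / 10 kW figures below are hypothetical, chosen only to
# illustrate what a PUE of 1.1 implies.

def pue(it_power_kw: float, overhead_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over IT load."""
    return (it_power_kw + overhead_kw) / it_power_kw

# A facility drawing 100 kW of IT load with 10 kW of cooling overhead:
print(round(pue(100.0, 10.0), 2))  # 1.1
```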
For deployment, the Inspur NF5488LA5 can be connected to a mobile Coolant Distribution Unit (CDU). After connecting it to the RACKCDU-F008 mobile liquid-cooling CDU with quick-release connectors, customers can place the units directly in a standard air-cooled cabinet, with no need to set up primary-side cooling units or rearrange the server room's entire cooling system. Liquid-cooled servers can be scaled up simply by stacking such units inside the cabinet. This innovation solves the long-standing deployment and scalability problems faced by liquid-cooled servers.
For more such updates and perspectives around Digital Innovation, IoT, Data Infrastructure, AI & Cybersecurity, go to AI-Techpark.com.