
HPE Announces 100% Fanless Direct Liquid Cooling Systems Architecture For Large AI Workloads

The architecture helps reduce utility costs, carbon production and data centre fan noise.

HPE has announced a 100% fanless direct liquid cooling systems architecture for large-scale AI deployments. (Source: Hewlett Packard Enterprise)

Hewlett Packard Enterprise Co. has announced a 100% fanless direct liquid cooling systems architecture to enhance the energy and cost efficiency of large-scale artificial intelligence deployments.

The architecture reduces cooling power required per server blade by around 37% when compared to hybrid direct liquid cooling, HPE said. This reduces utility costs, carbon production, and data centre fan noise. In addition, because systems using this architecture can support greater server cabinet density, they consume half the floor space.
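To put the quoted figure in perspective, here is a rough, hypothetical illustration of what a 37% reduction in per-blade cooling power could mean at deployment scale. The wattages and blade count below are placeholder assumptions for the sake of the arithmetic, not figures published by HPE; only the 37% reduction comes from the announcement.

```python
# Hypothetical illustration of the ~37% per-blade cooling power reduction.
# All wattages and the blade count are placeholder assumptions, not HPE data.

hybrid_dlc_cooling_power_per_blade_w = 400.0   # assumed cooling power per blade with hybrid DLC
reduction = 0.37                               # ~37% reduction quoted for the fanless architecture

fanless_dlc_cooling_power_per_blade_w = hybrid_dlc_cooling_power_per_blade_w * (1 - reduction)

blades = 1024  # assumed size of a large AI deployment
savings_kw = (hybrid_dlc_cooling_power_per_blade_w
              - fanless_dlc_cooling_power_per_blade_w) * blades / 1000

print(f"Cooling power per blade: {fanless_dlc_cooling_power_per_blade_w:.0f} W "
      f"(vs {hybrid_dlc_cooling_power_per_blade_w:.0f} W with hybrid DLC)")
print(f"Estimated cooling power saved across {blades} blades: {savings_kw:.0f} kW")
```

Under these assumed numbers, the saving works out to roughly 150 kW of cooling power across the deployment; actual savings would depend on the real per-blade figures.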

While next-generation accelerators have become more efficient, total power consumption continues to climb as AI adoption grows, outstripping traditional cooling techniques. Organisations running large AI workloads will need to do so more efficiently.

One of the most effective ways to cool AI systems is through direct liquid cooling. This cooling technology has enabled HPE to deliver seven of the top ten supercomputers on the Green500 list, which ranks the world’s most energy-efficient supercomputers.

The new architecture has an 8-element cooling design that includes liquid cooling for the graphics processing unit, central processing unit, full server blade, local storage, network fabric, rack/cabinet, pod/cluster, and coolant distribution unit. The network fabric design allows integration at large scale, and its open system offers flexibility of choice in accelerators. The high-density system design is supported by monitoring software and on-site services.
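As a way of visualising the announcement, the eight liquid-cooled elements named above can be written out as a simple checklist. This is purely an illustrative sketch; the class and field names are invented for this example and do not correspond to any HPE software or published API.

```python
from dataclasses import dataclass, fields

# Illustrative checklist of the eight liquid-cooled elements named in the
# announcement. Names are hypothetical; HPE does not publish such an interface.

@dataclass
class FanlessDLCDesign:
    gpu: bool = True
    cpu: bool = True
    full_server_blade: bool = True
    local_storage: bool = True
    network_fabric: bool = True
    rack_cabinet: bool = True
    pod_cluster: bool = True
    coolant_distribution_unit: bool = True

    def is_fully_fanless(self) -> bool:
        # The design is "100% fanless" only if every element is liquid cooled.
        return all(getattr(self, f.name) for f in fields(self))

design = FanlessDLCDesign()
print(design.is_fully_fanless())  # True
```

The point of the sketch is simply that the "100% fanless" claim rests on every one of the eight elements being liquid cooled, rather than only the hottest components as in hybrid designs.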

“As organisations embrace the possibilities created by generative AI, they also must advance sustainability goals, combat escalating power requirements, and lower operational costs,” said Antonio Neri, president and CEO of HPE. “The architecture we unveiled today uses only liquid cooling, delivering greater energy and cost-efficiency advantages than the alternative solutions on the market.”