The acceleration of AI and high-performance computing is changing the landscape of the data center cooling industry. Next-generation technologies, including NVIDIA GPUs, require liquid cooling. This technology is here to stay and is expanding every year; data center managers need to think about how, not if, they will bring liquid cooling capabilities into their data centers.
Liquid cooling uses chilled water instead of air to capture and transport heat away from chips. It can offer better performance while saving energy and helping data centers operate more sustainably.
Liquid provides a much greater heat transfer capacity than air, which helps improve (lower) power usage effectiveness (PUE), reducing energy costs and contributing to environmental sustainability. Liquid can also be brought closer to the source of heat (the chip) than air, further increasing efficiency.
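The two claims above can be made concrete with a quick back-of-the-envelope calculation. The sketch below computes PUE from its standard definition (total facility power divided by IT power) and compares the volumetric heat capacity of water and air using textbook property values; the power figures are illustrative assumptions, not from this article.

```python
# Illustrative sketch: PUE and water-vs-air heat transport capacity.
# Power figures are made-up examples; fluid properties are textbook values.

def pue(total_facility_power_kw: float, it_power_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT power (lower is better)."""
    return total_facility_power_kw / it_power_kw

# Volumetric heat capacity = density (kg/m^3) * specific heat (J/(kg*K))
water = 997.0 * 4186.0   # roughly 4.2e6 J/(m^3*K)
air = 1.2 * 1005.0       # roughly 1.2e3 J/(m^3*K)

print(f"Example PUE: {pue(1500.0, 1200.0):.2f}")
print(f"Water carries ~{water / air:.0f}x more heat per unit volume than air")
```

The roughly three-orders-of-magnitude gap in heat carried per unit volume is why a modest flow of chilled water can replace enormous volumes of moving air.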
Liquid cooling can also help data centers increase capacity within their existing footprint and offer a favorable return on investment for data center facilities. Liquid cooling systems provide an effective solution for achieving the required temperature parameters and reducing the energy consumption of cooling systems while increasing chip density.
To build a properly equipped liquid cooled infrastructure, many different technologies need to work together in concert. Here are three important ones:
Coolant Distribution Units
Coolant distribution units (CDUs) are the heart and brain of the liquid cooled data center, pumping chilled liquid through racks at the optimal rate and temperature to maximize cooling. CDUs use advanced control algorithms to use energy as efficiently as possible while keeping IT equipment at appropriate operating temperatures. A CDU can be a standalone piece of equipment that controls liquid flow through other cooling infrastructure, or it can be integrated into other equipment. Even when a discrete CDU is not present in a liquid cooling system, the same CDU technology is still what controls liquid flow within it.
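The control behavior described above can be sketched as a simple proportional loop: raise pump speed when the secondary-loop supply temperature drifts above its setpoint, lower it when the loop runs cool. All names, gains, temperatures and limits below are illustrative assumptions, not nVent specifications; real CDUs use considerably more sophisticated control.

```python
# Hedged sketch of CDU-style proportional pump control.
# Setpoint, gain and speed limits are invented for illustration only.

SETPOINT_C = 30.0    # assumed target supply temperature to the racks
KP = 4.0             # assumed gain: pump % change per degree of error

def pump_speed_pct(current_speed: float, supply_temp_c: float) -> float:
    """Raise pump speed when coolant runs hot, lower it when it runs cool."""
    error = supply_temp_c - SETPOINT_C
    new_speed = current_speed + KP * error
    return max(20.0, min(100.0, new_speed))  # clamp to a safe operating range

speed = 50.0
for temp in (32.0, 31.0, 30.2, 29.9):       # simulated temperature readings
    speed = pump_speed_pct(speed, temp)
    print(f"supply {temp:.1f} C -> pump {speed:.1f}%")
```

Even this toy loop shows the energy logic: the pump runs only as fast as the thermal load demands, rather than at a fixed worst-case speed.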
There are many considerations for selecting the right CDU, from the kinds of pumps it uses, to how it maintains water quality, to how it integrates with the system as a whole, but making the right choice is critical. nVent’s RackChiller CDU800 system is focused on providing the highest reliability, availability and serviceability for supporting direct-to-chip liquid cooling. The CDU800 is fed from a primary facility water system, while its integrated pumps drive flow in the secondary technology cooling system (TCS) loop. The heat exchanger inside the CDU transfers excess heat from the secondary coolant to the primary. The entire system is integrated into an enclosure serviceable from the front and rear doors, allowing for easy serviceability and customizability. Designers can also link CDUs to a rear door cooler, creating a separation between the IT cooling loop and the chiller, dry cooler or cooling towers.
Rear Door Heat Exchangers
Rear door heat exchangers (RDHx) use liquid cooling technology in combination with existing air cooling infrastructure to create an efficiently cooled environment at the server rack level rather than cooling an entire room. This allows data centers still running chips that can be cooled with air to use liquid cooling to improve capacity and energy efficiency.
nVent’s RDHX Pro is efficient, scalable and easy to service. Because IT heat is regulated at the rack level instead of the room level, data center managers can install RDHX units only where they have active IT, rather than cooling an entire room wherever IT is present. The fans on the door, the touch screen control and the power supply units can all be swapped out without turning off the unit, which is extremely important in an industry where 24/7 uptime is the standard.
Liquid Manifolds
While they may not contain advanced algorithms or programming, manifolds are absolutely critical to the success of cooling systems. Manifolds deliver coolant from CDUs or other equipment to the cooling equipment within the rack. Because manifolds transport liquid in such close proximity to IT, it is critical that every connection is dripless and compatible with the equipment it serves. If a manifold fails, the entire system is compromised.
nVent’s RackChiller manifolds are configurable and designed for liquid cooled applications that require increased uptime and improved power usage effectiveness. Designed with integration and installation in mind, the manifolds use plug-and-play style connections and universal mounting features, and coexist with other data center accessory equipment.