The increasing power consumption and heat generation of processors and other datacenter equipment have brought liquid cooling into the spotlight, and recent investments in the field underline the growing interest in the technology.
One notable development is the acquisition of CoolIT Systems, a long-standing player in the liquid cooling market, by global investment company KKR. The deal, reportedly valued at $270 million, is intended to let CoolIT expand its operations and serve a wider range of global customers in the datacenter market, spanning the enterprise, high-performance computing (HPC), and cloud service provider segments.
KKR’s investment in CoolIT signals that it expects a profitable return, but its statements about the acquisition also acknowledge the sustainability challenges facing the datacenter industry. Kyle Matter, Managing Director at KKR, pointed to growing data and computing needs and their potential environmental impact, and said he believes liquid cooling has a crucial role to play in reducing the emissions footprint of the digital economy.
Projections suggest that liquid cooling will see significant growth, potentially capturing up to 26% of the datacenter thermal management market by 2026, driven by the deployment of more high-performance infrastructure. CoolIT has already demonstrated its growth potential, ranking 218th on the Financial Times’ list of fastest-growing US companies this year.
Alan Priestley, a former technical marketing manager at Intel and now a VP analyst at Gartner, said many companies will need to invest in liquid cooling to manage high-performance servers. As processors become more powerful, liquid cooling offers an effective way to dissipate their heat and keep datacenter servers performing at their best.
According to Priestley, CPUs currently draw around 250W to 300W of power, while GPUs range from 300W to 500W. Servers equipped with up to eight GPUs for demanding workloads such as AI training can draw as much as 7-10kW per node.
Priestley further explained that datacenters are striving to increase rack densities by adding more memory per node and higher-performance networking, and accommodating that extra performance means drawing more power.
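To put those figures in context, here is a minimal back-of-the-envelope sketch using the wattages quoted above; the 2.5kW allowance for memory, storage, networking, fans, and power-conversion losses, and the four-node rack, are assumptions chosen purely for illustration.

```python
# Rough node and rack power estimate from the per-part wattages quoted above.
# The 2.5 kW allowance for memory, NVMe, NICs, fans, and PSU losses is an
# assumption for illustration, not a figure from the article.

CPU_WATTS = 300      # top of the 250-300W range quoted for CPUs
GPU_WATTS = 500      # top of the 300-500W range quoted for GPUs
OTHER_WATTS = 2500   # assumed: memory, storage, networking, fans, losses

def node_power_watts(cpus: int, gpus: int) -> int:
    """Rough power draw of one server node in watts."""
    return cpus * CPU_WATTS + gpus * GPU_WATTS + OTHER_WATTS

# A dual-CPU, eight-GPU AI training node lands in the 7-10kW band cited above.
node = node_power_watts(cpus=2, gpus=8)   # ~7.1 kW

# Even four such nodes push well past the roughly 10-20 kW that air-cooled
# racks are often designed for (itself a rough assumption), which is where
# liquid cooling comes in.
rack = 4 * node

print(f"node: {node / 1000:.1f} kW, rack of four: {rack / 1000:.1f} kW")
```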
Andrew Buss, a senior research director at IDC, concurred with this assessment. He emphasized that as chip or package power densities continue to rise, liquid cooling becomes a more efficient and preferred option.
Buss highlighted that support for direct liquid cooling loops is now being integrated into many modern datacenter facilities and colocation providers. He pointed out that companies like Atos/Bull have embraced direct contact liquid cooling loops for their power-dense high-performance computing (HPC) servers. This allows them to fit six AMD Epyc sockets with maximum memory, NVMe storage, and 100Gbps networking into a compact 1U chassis, all cooled by a custom cooling manifold.
The growing demand for higher performance and power-hungry applications is driving the need for more efficient cooling in datacenters. Liquid cooling lets operators manage the rising power requirements of advanced processors and GPUs while maintaining performance and heading off heat-related issues.
According to Moises Levy, an expert in datacenter power and cooling research at Omdia, the global adoption of liquid cooling is expected to continue increasing.
Levy suggests that while liquid cooling has reached or is nearing a tipping point for specific applications with compute-intensive workloads, its widespread adoption in the broader datacenter market is still on the horizon. He highlights that direct-to-chip and immersion cooling technologies are likely to be the primary disruptors, projected to have the highest compound annual growth rate (CAGR) in the coming years.
Direct liquid cooling, the approach CoolIT backs, involves circulating a coolant, typically water, through cold plates attached directly to components such as processors. This type of system is relatively easy to retrofit into existing rack infrastructure.
Immersion cooling, on the other hand, submerges the entire server node in a non-conductive dielectric fluid. It typically requires specialized racks, some of which position the nodes vertically rather than horizontally, and tends to be favored for new-build server rooms.
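Whichever approach is chosen, its capacity comes down to the same heat balance: the coolant has to carry away the heat load as it warms up, Q = ṁ·c_p·ΔT. The sketch below, which assumes textbook fluid properties and an illustrative 10K coolant temperature rise, shows why a water-fed cold plate needs comparatively little flow while a lower-heat-capacity dielectric fluid needs more.

```python
# Coolant flow needed to carry away a heat load, from the heat balance
# Q = m_dot * c_p * dT. Fluid properties are textbook approximations and
# the 10 K temperature rise is an illustrative assumption.

def flow_litres_per_min(heat_watts: float, cp_j_per_kg_k: float,
                        density_kg_per_l: float, delta_t_k: float = 10.0) -> float:
    """Volumetric coolant flow (L/min) required to absorb heat_watts."""
    mass_flow_kg_s = heat_watts / (cp_j_per_kg_k * delta_t_k)
    return mass_flow_kg_s / density_kg_per_l * 60.0

# Direct-to-chip: water through a cold plate on a 500 W GPU.
water = flow_litres_per_min(500, cp_j_per_kg_k=4186, density_kg_per_l=1.0)

# Immersion: a dielectric fluid with roughly half water's heat capacity
# (assumed values) needs noticeably more flow for the same load.
dielectric = flow_litres_per_min(500, cp_j_per_kg_k=2000, density_kg_per_l=0.85)

print(f"water: {water:.2f} L/min, dielectric: {dielectric:.2f} L/min")
```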
As liquid cooling technologies continue to advance, their increasing adoption is expected to bring significant benefits to datacenters in terms of improved efficiency and enhanced cooling capabilities.
European cloud operator OVHcloud has developed a hybrid system that combines the two approaches: water blocks attached to the CPU and GPU, with the remaining components cooled by immersion in a dielectric fluid.
While OVHcloud currently reserves this system for the cloud infrastructure handling intensive workloads such as AI, gaming, and HPC applications, it has indicated potential future expansion.
In a similar vein, GlobalConnect, a leading datacenter colocation provider, plans to offer immersion-based cooling as an option to all of its customers. Teaming up with immersion cooling specialist GRC, GlobalConnect announced the system deployment in February and aims to gradually introduce the technology across all 16 of its datacenters in Denmark, Norway, Sweden, Germany, and Finland, based on customer demand.
The question arises: can liquid cooling help achieve sustainability objectives? OVHcloud says its combined system is significantly more efficient than traditional air cooling. In tests, the company claims, the cooling system achieved a partial power usage effectiveness (pPUE) of 1.004, a figure that measures only the energy overhead of the cooling system itself.
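Partial PUE for a cooling subsystem is simply the IT energy plus the cooling energy, divided by the IT energy, so a value of 1.004 implies the cooling loop adds only about 0.4 percent on top of the IT load. A short illustration with made-up numbers makes the comparison concrete; the 1.4 figure used for a conventional chilled-air plant is an assumption, not something from OVHcloud.

```python
# Partial PUE of a cooling subsystem: (IT energy + cooling energy) / IT energy.
# The sample figures below are illustrative, not measured values.

def partial_pue(it_kwh: float, cooling_kwh: float) -> float:
    """Partial PUE attributable to the cooling subsystem."""
    return (it_kwh + cooling_kwh) / it_kwh

# A pPUE of 1.004 allows ~4 kWh of cooling energy per 1,000 kWh of IT load.
print(partial_pue(1000, 4))    # 1.004

# An assumed conventional chilled-air plant at pPUE 1.4 burns ~400 kWh
# of cooling energy for the same IT load.
print(partial_pue(1000, 400))  # 1.4
```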
However, Buss urged caution around adoption, emphasizing the need for careful consideration of waste heat management in particular. “Liquid cooling done right”, he said, can certainly improve efficiency and environmental sustainability by reducing reliance on compressor-based cooling and using heat-exchanger technology to keep cooling loop temperatures where they need to be.
Nevertheless, Buss emphasized the importance of proper implementation, as simply discharging the heat into the environment, such as a lake or river, can have detrimental effects. Therefore, the design of the ultimate heat path should be carefully planned to maximize reuse opportunities whenever feasible.
The European Union (EU) has recently said it wants to see more cities using waste heat from datacenters to warm residential homes. The challenge is that the heat produced is often not at a high enough temperature to be used directly, so additional energy must be spent to lift it. Despite these obstacles, some datacenter operators, such as QTS in the Groningen region of the Netherlands, have explored such initiatives.
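The extra energy typically comes from a heat pump that lifts the loop's return temperature to something a district heating network can accept. The sketch below gives a feel for that cost; the 35°C source, 70°C supply, and 45-percent-of-Carnot efficiency are all assumptions for illustration rather than figures from any of the operators mentioned.

```python
# Estimating the electricity needed to lift low-grade waste heat to district
# heating temperature with a heat pump. Temperatures and the 45%-of-Carnot
# efficiency are illustrative assumptions.

def heat_pump_cop(source_c: float, supply_c: float,
                  carnot_fraction: float = 0.45) -> float:
    """Approximate coefficient of performance for lifting heat from
    source_c to supply_c (degrees Celsius)."""
    t_hot_k = supply_c + 273.15
    t_cold_k = source_c + 273.15
    return carnot_fraction * t_hot_k / (t_hot_k - t_cold_k)

# Lifting 35C return water from a liquid-cooling loop to a 70C heating supply:
cop = heat_pump_cop(source_c=35, supply_c=70)
print(f"COP ~ {cop:.1f}; each kWh of delivered heat costs ~{1 / cop:.2f} kWh of electricity")
```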
Last year, the United States Department of Energy invested in projects aimed at cutting the energy used for cooling in datacenters, albeit with a relatively modest $42 million in funding. We also highlighted the swift adoption of liquid cooling by Chinese datacenters in response to new government regulations.
Among the liquid cooling vendors that secured investments was Iceotope, a UK-based company that received £30 million ($35.7 million at the time) in a funding round led by a Singapore-based private equity provider, with a focus on penetrating the Asian market.
Intel also forged a partnership with Green Revolution Cooling to explore liquid immersion technology. However, the chip giant recently decided to halt its plans for a $700 million research and development lab dedicated to cooling technology in Oregon, as part of its cost-cutting measures.
Datacenters play a critical role in the digital age, serving as the backbone of our increasingly connected world. These centralized facilities house an extensive network of servers, storage systems, and networking equipment that enable the storage, processing, and distribution of vast amounts of data. As technology advances and data demands continue to surge, datacenters are evolving to meet the challenges of efficiency, scalability, and performance.
Hyperscale datacenters have emerged as the powerhouses of the digital infrastructure landscape. These massive facilities are designed to handle the most demanding workloads, supporting cloud services, AI, machine learning, and big data analytics. With their extensive computing power and storage capabilities, hyperscale datacenters are fueling innovation and driving digital transformation across industries.
As data-driven applications proliferate, the need for low-latency and real-time processing has become paramount. This has led to the rise of edge computing, a decentralized computing model that brings data processing closer to the source of data generation. Edge datacenters are strategically located in proximity to users and devices, enabling faster response times and reducing the burden on network infrastructure. This trend is particularly crucial for applications requiring real-time data analysis, such as autonomous vehicles, IoT devices, and augmented reality.
With the increasing energy consumption of datacenters, the industry is actively pursuing greener and more sustainable solutions. Datacenters are exploring innovative approaches to reduce their carbon footprint, optimize power usage, and increase energy efficiency. These initiatives include adopting renewable energy sources, implementing advanced cooling techniques, and optimizing server utilization through virtualization and consolidation. Green datacenters not only contribute to environmental conservation but also help organizations meet their sustainability goals.
Data security and privacy have become paramount concerns in the digital era. Datacenters house vast amounts of sensitive information, making them attractive targets for cyber threats. As a result, datacenters are continuously enhancing their security measures, implementing robust firewalls, encryption protocols, and intrusion detection systems. Compliance with data protection regulations such as GDPR and CCPA is also a top priority for datacenters, ensuring the privacy and confidentiality of user data.
The ever-increasing power density of modern servers has led to significant heat dissipation challenges. To overcome this, datacenters are turning to liquid cooling as an efficient solution. Liquid cooling systems, such as direct-to-chip and immersion cooling, offer superior thermal management, enabling higher performance and energy efficiency. By efficiently dissipating heat, liquid cooling minimizes the risk of thermal throttling and extends the lifespan of critical hardware components.
Datacenters are at the forefront of the digital revolution, enabling seamless connectivity, storage, and processing of data. As technology advances, datacenters are continuously evolving to meet the escalating demands for efficiency, scalability, and sustainability. From hyperscale datacenters to edge computing, green initiatives, security enhancements, and liquid cooling solutions, the datacenter industry is shaping the future of our digital landscape. By embracing these advancements, organizations can unlock the full potential of their data and drive innovation in the digital age.