Liquid cooling is poised to enter more enterprise data centers, and this article looks at five workload-driven reasons why. Traditionally reserved for mainframes and academic supercomputers, liquid cooling may soon appear in mainstream enterprise facilities: as new, more demanding workloads push up the power density of server racks, operators are eager to find more effective alternatives to air cooling.

We interviewed a range of data center operators and vendors about mainstream adoption of liquid cooling. Some respondents declined to reveal which applications run in their facilities, saying they regard those workloads, and how they are cooled, as a competitive advantage for their organization. A group of hyperscale cloud operators, including Microsoft, Google parent Alphabet, Facebook, and Baidu, has formed to create open specifications for liquid-cooled server racks, though it has not yet said which specific designs it will use. At least one type of hyperscale workload already clearly calls for liquid cooling: machine learning systems accelerated by GPUs, or, in Google's case, its latest Tensor Processing Units, which the company has publicly said now use designs that liquid-cool the chips directly.
Despite the lingering caution among enterprise data center operators, some usage trends are beginning to emerge. If your data center supports any of the following workloads, it may be a candidate for liquid cooling.

1. AI and accelerators

The annual CPU performance growth described by Moore's Law has slowed dramatically in recent years, which is part of why accelerators, primarily GPUs but also FPGAs and specialized ASICs, are increasingly finding their way into enterprise data centers. GPU-driven machine learning may be the most common use case for hardware acceleration outside of high-performance computing (HPC). In a recent survey by market research firm 451 Research, however, about a third of IT service providers said their companies plan to use accelerated systems for online data mining, analytics, engineering simulation, video and other real-time media, fraud detection, load balancing, and similar latency-sensitive services.

Hardware accelerators have a much higher thermal design power (TDP) than CPUs, typically 200W or more per device; add high-performance server CPUs, and a single system can draw more than 1kW, all of which must be removed as heat. Intel, too, is pushing past the traditional 150W envelope of its server processors. "More and more enterprise customers want more powerful chips, and we are beginning to see the wattage of those chips gradually increase," said Andy Lawrence, executive director of the Uptime Institute.

Rack densities in enterprise data centers keep climbing. Most facilities now have at least some racks running above 10kW in normal operation, and 20% of operators report racks at 30kW or higher. Yet these workloads are not considered high-performance computing.
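The arithmetic behind these densities can be sketched in a few lines. This is an illustration only: the component wattages and server count below are hypothetical figures chosen to match the magnitudes the article cites (200W-plus accelerators, roughly 150W server CPUs, racks above 10kW), not measurements from any real system.

```python
# Hypothetical component thermal design power (TDP) values, in watts.
GPU_TDP_W = 300    # high-end datacenter accelerator (article: 200W or more)
CPU_TDP_W = 150    # high-performance server CPU (article: ~150W envelope)
OTHER_W = 150      # memory, storage, fans, power-conversion losses

def system_power_w(gpus: int, cpus: int) -> int:
    """Rough per-server heat load: essentially every watt drawn becomes heat."""
    return gpus * GPU_TDP_W + cpus * CPU_TDP_W + OTHER_W

server_w = system_power_w(gpus=4, cpus=2)   # 4*300 + 2*150 + 150 = 1650 W
rack_kw = 10 * server_w / 1000              # ten such servers per rack
print(f"per server: {server_w} W, per rack: {rack_kw:.1f} kW")
```

A single four-GPU server already exceeds the 1kW mark the article mentions, and ten of them put a rack well past the 10kW threshold where air cooling starts to struggle.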
"They just mean that their workloads have higher-density racks," Lawrence said. "If you put a GPU together with an Intel processor, they could potentially get three times the power density they had before." Liquid cooling is an obvious fit for these accelerators, especially immersion cooling, which cools the GPU and the CPU alike.

2. Cooling high-density storage

As storage density continues to increase, cooling storage effectively may become harder. Most installed storage capacity consists of non-sealed hard disk drives, which cannot be immersed in liquid. Newer technologies change that: solid-state drives can be cooled by full immersion, and the helium-filled drives built to support the high-density, high-speed read/write heads of the latest generation of storage hardware are sealed units, making them suitable for liquid cooling as well. As a 451 Research report notes, the combination of SSDs and helium-filled hard drives means storage no longer has to be segregated into air-cooled racks away from liquid-cooled equipment. There is a reliability bonus, too: submerging drives in coolant can reduce the effects of heat and humidity on their components.

3. Network edge computing

The need to reduce latency for current and future applications is driving a new generation of data centers at the edge of the network. These can be high-density, remote facilities deployed at wireless towers, in factories, or in retail stores, and they may increasingly host dense computing hardware such as GPU-packed machine learning clusters.
While not all edge data centers will be liquid cooled, many will need to support heavy workloads in confined spaces where traditional cooling cannot be used, or in new environments where the prerequisites for traditional cooling are simply absent. Because liquid cooling reduces energy consumption, it also makes it easier to deploy edge sites in places without large-capacity power feeds. Lawrence forecasts that up to 20% of edge data centers could use liquid cooling, and envisions remote micro-modular high-density sites supporting 40kW per rack.

4. High-frequency trading and blockchain

Many modern financial services workloads are compute-intensive and demand high-performance CPUs and GPUs. These include high-frequency trading systems and blockchain applications such as smart contracts and cryptocurrencies. Green Revolution Cooling (GRC), for example, counts a high-frequency trading firm among the enterprise customers testing its immersion cooling; the company saw its largest-ever sales surge after launching immersion products for cryptocurrency mining as the price of bitcoin soared in late 2017. GRC CEO Peter Poulin told CNN Business that another customer, in Trinidad and Tobago, runs a cryptocurrency service at 100kW per rack with a warm-water cooling loop connected to an evaporative tower. Because warm-water cooling is more energy efficient than chilled water, the system can operate in tropical conditions without mechanical chillers.

5. Traditional cooling is expensive

When air-based systems cannot handle high-density cooling needs, liquid cooling begins to make economic sense. Geoscience company CGG, for example, uses GRC's immersion cooling in its Houston data center, where it mainly processes and analyzes seismic data.
CGG runs powerful GPUs in commodity servers drawing up to 23kW per rack. That density is relatively high, but racks like these are usually still air-cooled. "We put the heavy compute servers into immersion tanks for cooling," said Ted Barragy, senior systems manager at CGG. "But really it was less about what the workload demanded than about immersion cooling simply being more cost-effective."

During an upgrade, immersion cooling replaced the traditional cooling equipment in CGG's old data center, and Barragy says the team recovered several megawatts of power capacity as a result. "Even after adding servers and immersion tanks for a few years, we still have half a megawatt of power going unused," he said. "This is an old legacy data center, and about half of its power consumption was going to inefficient air cooling." The immersion-cooled facility now runs at a PUE of about 1.05, better than the company's newer air-cooled data center in Houston, which runs at a PUE of 1.35.

"A lot of people think liquid cooling is only for really high-density deployments of 60kW to 100kW per rack, but for our mainstream enterprise customers there are other significant advantages," Poulin said. Chris Brown, chief technology officer at the Uptime Institute, said interest in liquid cooling is rising across the industry, driven by the urgent need for higher energy efficiency and lower operating costs in enterprise data centers. "The focus on liquid cooling is no longer around ultra-high density, but rather something that general enterprise data center operators can use to cool any IT asset," he said. "It's now finding its way into more common density solutions and more common data centers."
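The PUE figures quoted above translate directly into energy savings. PUE (power usage effectiveness) is total facility power divided by IT equipment power, so a lower PUE means less overhead spent on cooling and power delivery. The sketch below uses the 1.05 and 1.35 values CGG reports; the 1MW IT load is a hypothetical figure chosen only to make the comparison concrete.

```python
def facility_power_kw(it_load_kw: float, pue: float) -> float:
    """PUE = total facility power / IT power, so total = IT load * PUE."""
    return it_load_kw * pue

IT_LOAD_KW = 1000.0  # hypothetical 1 MW of IT equipment

immersion = facility_power_kw(IT_LOAD_KW, 1.05)   # CGG's immersion-cooled site
air_cooled = facility_power_kw(IT_LOAD_KW, 1.35)  # CGG's newer air-cooled site

print(f"immersion: {immersion:.0f} kW total")
print(f"air-cooled: {air_cooled:.0f} kW total")
print(f"overhead avoided: {air_cooled - immersion:.0f} kW")
```

At this scale the immersion-cooled facility spends 50kW on overhead versus 350kW for the air-cooled one, a 300kW difference that runs continuously, which is the kind of saving Barragy describes recovering during the upgrade.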