- Current Landscape of AI and Data Centers
- Immersion Cooling and AI
- Technological Requirements for AI Data Centers
- Planning and Design Considerations
- Cost Effectiveness and ROI
- Future Trends and Innovation
- AI and Data Centers Are Evolving. Are Your Cooling Solutions?
Artificial intelligence (AI) and data centers are converging as one of the technology industry’s key focal points. As resource-intensive AI technologies grow, data centers must adapt and find effective ways to manage the increased demand. Of course, this creates significant challenges, particularly for cooling needs.
Data centers are an increasingly important element of IT infrastructure. With commerce migrating to online channels at ever-rising rates, data centers play a central role in economic health and growth. As such, data center operators face pressing needs for efficient and cost-effective cooling solutions to avoid costly downtime, service interruptions, and potential damage to critical equipment.
These needs will only grow as artificial intelligence technologies continue to develop and integrate into the broader global economic framework. For context, consider these facts about AI and data centers:
- AI’s processing demands draw three times the electricity required by conventional computing.
- Data center operators face triple the typical energy costs when supporting AI deployments.
- The density requirements of AI deployments can be 15 times greater than standard cooling capacities are designed to support.
Given the challenges created by these factors, data center operators need effective and practical solutions. This is precisely what immersion cooling offers.
AI and Data Centers: Why Immersion Cooling Matters
Immersion technologies cool computing equipment far more efficiently than conventional methods like forced-air and water-based cooling. They also demand far less physical space and consume far fewer resources. As a result, data centers can cool more equipment at a lower cost, so they can meet the AI-related needs of today as well as tomorrow.
At the same time, integrating AI and data centers requires careful design, planning, and future-proofing. Data center operators must also consider cost effectiveness and the potential for a positive return on investment (ROI) as they plan, build, and choose technological systems for the data centers of the future.
Let’s look at the central considerations involved in data center operations in the age of artificial intelligence.
Current Landscape of AI and Data Centers
Artificial intelligence has been on the tech industry’s radar for decades. Until recently, it was considered a developing technology that would mature at some point in the future. That changed quickly and in dramatic fashion in late 2022, with the arrival of generative AI tools like ChatGPT and Stable Diffusion.
The incredible capabilities of emerging generative AI systems sparked a flurry of interest in artificial intelligence and machine learning (ML). Businesses across industries quickly developed plans to integrate AI into their operations. Investors flocked to both established and emerging players in the AI space, making them flush with investment capital. As a result of these influences, AI has exploded into the cultural, technological, and economic mainstream.
Emerging Use Cases for AI Technologies
AI is no longer relegated to the realm of developmental speculation. It’s now a fully functional aspect of contemporary computing, with a novel and fast-evolving set of exciting use cases. These broadly include:
Generative AI
Generative AI turns user-submitted prompts into original output. It creates text, images, music and sounds, video, and other forms of media. Commercial applications extend to many areas, such as:
- Visual design
- Content creation
- Augmented reality (AR)
- Virtual reality (VR)
- Digital gaming
While generative AI still has its limitations, the technology can create surprisingly high-quality output. This is especially apparent in a class of technologies known as generative adversarial networks (GANs), which pair a generator network with a discriminator network that critiques and refines the generated content.
Edge Computing
Edge computing is an IT architectural model in which a client or user’s data is processed at peripheral points, known as “edges,” of the wider computing network. It’s emerging as an increasingly attractive option for data centers, especially when paired with artificial intelligence technologies.
AI-powered processing algorithms make smart, efficient decisions about their use of edge computing resources. Among other areas, the convergence of AI and edge computing stands to impact the Internet of Things (IoT), mobile computing, and autonomous vehicle technologies.
Automation and Robotics
Businesses have moved quickly to integrate AI into their service channels. For instance, AI-powered chatbots offer an effective, cost-controlled way to answer customers’ questions. They can guide shoppers to appropriate products and services and help resolve issues through basic troubleshooting.
AI-powered automation and robotics tools are also reshaping the global manufacturing industry. Robots with integrated AI capabilities can quickly adapt and respond to new environments and working conditions. This makes them far more versatile and capable than legacy industrial robotics technologies.
Personalized Medicine
AI and ML models have the unique ability to analyze and draw insights from enormous quantities of data. Both can carry out these functions quickly and accurately, which opens a world of new possibilities in personalized medical treatments.
Advanced computing systems can help doctors and healthcare providers diagnose conditions and diseases, select medications and treatments, monitor patient progress, and model health outcomes. Because AI-powered healthcare tools can also analyze patient data at massive scale, they stand to have a major impact in the public health arena.
Cybersecurity and Information Security
Cybercrime has a stunning economic impact, with one estimate showing that it cost the global economy $8 trillion in 2023 alone. That massive number is poised to continue rising. So, cybersecurity providers need powerful and effective new solutions to combat soaring crime rates.
Algorithms powered by AI technologies can detect suspicious activities and signals of an impending cyberattack with unmatched precision and speed. They can also mount effective responses to active threats and adapt to shifts and changes in cybercriminal activities while an attack is underway. When deployed alongside capable human cybersecurity professionals, AI creates a daunting buffer that can prevent damaging attacks outright.
At the same time, cybercriminals are expected to use AI technologies to make their scams and attacks more sophisticated. In response, AI-powered cybersecurity tools will be uniquely positioned to help human personnel identify and address these evolving threats.
AI and Data Centers: Key Impacts
From an operational perspective, one of the main impacts of AI and data centers relates to computational requirements. AI deployments are extremely power-intensive. According to a recent study, the current wave of generative AI technologies uses 10 to 15 times as much energy as standard central processing units (CPUs).
Data processing requirements vary depending on the nature of the application, but they well exceed those associated with standard computing. In fact, one report found that AI’s data center density requirements were six to 12 times greater than established averages.
To meet the energy requirements and data processing demands of AI technologies, data centers will require both advanced and powerful computer hardware and energy-efficient management systems.
Additional data center impacts of AI relate to:
- Decentralization: AI deployments demand low-latency data processing in real time. This has vaulted edge computing into the spotlight and pushed data processing onto nearby server networks and IoT-connected devices. These decentralized processing models are likely to become the norm, forcing IT infrastructure providers to reimagine their architecture.
- Cybersecurity: IT infrastructure operators and data centers will both require AI-driven cybersecurity plans. AI holds the potential to power novel security capabilities, but bad actors can also exploit the tech for their own means. Industry observers believe AI will mark a major new cybersecurity battlefield in the years to come.
- Network Automation and Optimization: AI’s forthcoming integration into network monitoring and management will automate many tasks and help optimize IT resources. But it will also contribute to data centers’ soaring energy needs and processing requirements.
As data centers plan for an AI-enriched computing future, immersion cooling has emerged as a powerful solution to several of the challenges operators currently face.
Immersion Cooling and AI
With respect to AI and data centers, immersion cooling offers three important and direct benefits. First, it helps power dramatic increases in rack density. So, data centers can pack far more processing power into the available space.
Second, immersion cooling offers performance advantages to the powerful, high-efficiency hardware and computing components required for AI applications. Third, immersion systems use far less energy than legacy forced-air and water-cooling technologies, which are also impractical for AI because of their resource needs and spatial demands.
AI and Data Centers: How Immersion Cooling Supports Increased Rack Density
Liquid-based immersion systems deliver cooling directly to server racks, allowing data center operators to make much better use of available space. By comparison, forced-air cooling systems and other legacy cooling technologies are space-intensive. This negatively impacts rack density because it limits the amount of room available for hosting servers.
Green Revolution Cooling (GRC) made headlines in 2021 when we deployed our proprietary immersion cooling technologies to achieve extreme rack density. At the time, high-performance data centers posted densities of approximately 30 kilowatts per rack. GRC demonstrated modules capable of supporting densities of 200 to 368 kilowatts per rack, up to 12 times better than other high-performance technologies at the time.
Immersion Cooling Supports Higher Performance
Immersion cooling also offers multiple hardware performance benefits. Because it dissipates heat more efficiently than traditional air cooling, immersion cooling enables hardware to operate at lower temperatures. It also eliminates hot spots. Both of these features facilitate faster, more responsive, and more reliable computing functions.
The uniform nature of the cooling delivered by immersion systems has similar effects. Temperatures remain consistent throughout the hardware. This boosts overall performance while reducing the risk of a performance impairment known as thermal throttling.
Thermal throttling occurs when a CPU runs too hot. The unit’s internal clocking mechanism slows down to reduce the risk of overheating, which in turn forces the CPU to run at lower speeds. Immersion cooling reduces the likelihood of thermal throttling, so components operate at peak speeds.
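As a simple illustration, the sketch below shows one way an operator might spot likely throttling on a Linux host. It assumes the third-party psutil library is available, and the 85°C threshold is an arbitrary example value rather than a vendor specification.

```python
# Minimal sketch: infer possible thermal throttling on a Linux host.
# Assumes the third-party psutil package is installed; the 85 degC
# threshold is an illustrative value, not a vendor specification.
import psutil

def check_thermal_state(temp_limit_c: float = 85.0) -> None:
    temps = psutil.sensors_temperatures()   # per-sensor readings (Linux only)
    freq = psutil.cpu_freq()                # current/min/max CPU frequency in MHz

    hottest = max(
        (reading.current for sensors in temps.values() for reading in sensors),
        default=None,
    )
    if hottest is None or freq is None:
        print("Temperature or frequency data not available on this host.")
        return

    below_max = bool(freq.max) and freq.current < 0.9 * freq.max
    if hottest >= temp_limit_c and below_max:
        print(f"Possible thermal throttling: {hottest:.0f} degC, "
              f"CPU at {freq.current:.0f} of {freq.max:.0f} MHz")
    else:
        print(f"Hottest sensor: {hottest:.0f} degC; CPU at {freq.current:.0f} MHz")

if __name__ == "__main__":
    check_thermal_state()
```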
Energy Savings and Immersion Cooling: Making AI Data Centers More Sustainable
AI technologies consume enormous quantities of energy. As they scale up and become more deeply integrated into the computing mainstream, data center energy requirements will rise in tandem. This creates both cost challenges and sustainability impacts.
Of the available data center cooling systems, immersion technologies hold the greatest energy-saving potential. In fact, immersion cooling reduces energy consumption in multiple ways:
- Immersion-cooled data centers don’t require the large, energy-intensive air conditioning and fan systems used in air cooling.
- The liquid coolants used in immersion systems have much higher heat capacities than air. This enables them to absorb and remove heat far more efficiently and with less energy.
- Immersion cooling minimizes energy waste, which shows up as a lower overall power usage effectiveness (PUE) ratio.
- Data centers that use immersion cooling have less need for complex thermal management systems, thanks to their stable and uniform operating temperatures.
Finally, immersion cooling facilitates higher ambient temperature operations without putting the safety or performance of cooled components at risk. This allows data centers to reduce their overall cooling-related energy expenditures, saving both money and resources.
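To put the heat-capacity and PUE points above in concrete terms, here is a rough, illustrative calculation. The fluid properties, rack heat load, and facility power figures are assumed round numbers, not specifications for any particular coolant or product.

```python
# Illustrative comparison of air vs. a single-phase dielectric coolant,
# plus a simple PUE calculation. All property and power values below are
# assumed, round numbers used for the sake of the example.

def volumetric_heat_capacity(density_kg_m3: float, cp_j_kg_k: float) -> float:
    """Heat absorbed per cubic metre of coolant per kelvin of temperature rise (J/m^3.K)."""
    return density_kg_m3 * cp_j_kg_k

AIR = volumetric_heat_capacity(density_kg_m3=1.2, cp_j_kg_k=1005)      # ~1.2 kJ/m^3.K
FLUID = volumetric_heat_capacity(density_kg_m3=830, cp_j_kg_k=2100)    # ~1,740 kJ/m^3.K

print(f"Per unit volume, the fluid absorbs ~{FLUID / AIR:,.0f}x more heat than air.")

# Flow needed to remove a 50 kW rack's heat with a 10 K coolant temperature rise:
heat_load_w, delta_t_k = 50_000, 10
flow_air_m3_s = heat_load_w / (AIR * delta_t_k)
flow_fluid_m3_s = heat_load_w / (FLUID * delta_t_k)
print(f"Required flow: {flow_air_m3_s:.2f} m^3/s of air "
      f"vs {flow_fluid_m3_s * 1000:.2f} L/s of fluid")

# Power usage effectiveness: total facility power divided by IT power.
it_power_kw, overhead_kw = 1_000, 30   # assumed IT load and cooling/overhead load
pue = (it_power_kw + overhead_kw) / it_power_kw
print(f"PUE = {pue:.2f}")
```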
Technological Requirements for AI Data Centers
Immersion cooling is one of two main types of liquid cooling relevant to AI and data centers. Direct-to-chip (DTC) cooling is the other. Data center operators should understand the technical differences between the two.
Types of Immersion Cooling
Immersion cooling uses two main models: single-phase (or one-phase) and dual-phase (or two-phase) cooling. Single-phase cooling is more common. It uses specially engineered coolant fluids, which remain in their liquid state throughout the cooling process. This fluid absorbs the heat generated by computing components and circulates it to an external system. The heat is then dissipated, eliminated, or sequestered for reuse.
Several distinct advantages have led to single-phase immersion cooling’s dominance. Single-phase systems have simpler designs, which enhances their reliability. They are also energy-efficient, easier to implement, and compatible with a wide range of IT hardware.
Direct-to-Chip Cooling
Also known as liquid cooling or water cooling, DTC cooling delivers specialized coolant fluids directly to a hardware unit’s heat-generating chip components. The fluid absorbs heat and transfers it away from the computing components to a cooling block or heat sink.
DTC cooling has two considerable drawbacks. First, it carries a risk of leaking coolant directly into sensitive electronic components, which can damage or destroy hardware. Second, it has only limited heat dissipation coverage because of its sole focus on cooling a narrow and specific set of components. So, it has significant limitations in high-density data centers, which require the uniform cooling of multiple components to operate at peak efficiency.
Beyond choosing a cooling technology, AI and data centers have additional technical requirements. Major examples include data processing and storage needs, plus the logistical challenges associated with meeting user demands for AI technologies.
AI and Demand for Data Storage and Processing
One of the most disruptive aspects of AI and data centers relates to demand volatility. The global data center industry has already had significant problems creating accurate capacity projections and planning models over the past decade.
Industry analysts expect AI to intensify this volatility to a significant degree. After all, emerging applications have already shown an ability to draw in large user volumes in very short periods of time.
Predictive analytics, which are powered by AI technologies, can help data center operators plan to meet storage and processing requirements. While precise capacity requirements remain unclear as AI technologies continue to emerge, data center designers should assume they will be significantly higher than present levels. Data centers currently considered to have extreme density profiles could become the norm in relatively short order.
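As a minimal, hypothetical example of the kind of trend-based projection such planning might start from, the sketch below fits an exponential growth curve to historical IT load and extrapolates a few years ahead. All load figures are made up for illustration.

```python
# Minimal sketch of a capacity projection: fit an exponential trend to
# historical IT load and extrapolate a few years ahead. The load figures
# below are made-up, illustrative numbers.
import numpy as np

years = np.array([2020, 2021, 2022, 2023, 2024])
it_load_mw = np.array([4.0, 4.6, 5.4, 6.5, 8.1])   # assumed historical facility IT load

# Fit log(load) = slope*year + intercept, i.e. constant compound growth.
slope, intercept = np.polyfit(years, np.log(it_load_mw), 1)
growth_rate = np.exp(slope) - 1
print(f"Implied compound growth: {growth_rate:.1%} per year")

for year in range(2025, 2029):
    projected = np.exp(slope * year + intercept)
    print(f"{year}: ~{projected:.1f} MW projected IT load")
```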
Logistical Requirements for AI and Data Center Integrations
For corporate enterprises, legacy approaches to data center planning typically involve building and maintaining large-scale data centers exclusively for their own needs. This model remains common in many industries, especially those subject to elevated compliance and data protection requirements. Examples of such industries include insurance, financial services, and healthcare.
Yet major businesses in these and many other industries are increasingly adopting software-as-a-service (SaaS) models. SaaS has become a dominant aspect of cloud computing, especially since the COVID-19 pandemic.
During the pandemic, a general shift occurred in which large businesses began migrating from private data centers to cloud, multi-cloud, and colocation models. Colocation in particular holds strong appeal for businesses that make extensive use of AI and data centers. Specifically, colocation offers:
- Comprehensive network connectivity
- Low-latency and ready access to advanced, high-performance computing networks
- Reduced data transfer times
- Excellent scalability
As AI becomes more deeply entrenched in everyday computing, it will also force data centers to adopt more advanced and specialized forms of hardware. These computing systems are larger and more powerful than their conventional counterparts, and they also generate far more heat. As such, immersion cooling is rapidly emerging as an essential part of future-proofed, AI-compatible data centers.
Planning and Design Considerations
Immersion cooling offers unique advantages in relation to AI and data centers. For one thing, it liberates designers from the need to accommodate the space-intensive infrastructure demanded by forced-air cooling and other legacy solutions. As a result, it allows for significantly more flexibility in deploying AI solutions.
Data centers have traditionally favored the use of forced-air cooling systems, so those will serve as the main point of comparison. Planning and designing a data center that uses forced-air cooling solutions limits the range of available facility options. That is, you need buildings that have the space and ventilation infrastructure to accommodate massive fan networks and air conditioners. In the absence of such an option, the data center facility would require costly and time-consuming retrofitting.
In contrast, integrating immersion cooling into data centers at the design level opens up many alternative possibilities. A facility only requires a small handful of physical features to accommodate an immersion-cooled data center: a water loop system, along with access to electricity and computer networking infrastructure. In this manner, immersion cooling supports what’s known in the data center industry as deployment location flexibility.
Immersion Cooling, AI and Data Centers: Physical Layout and Space Allocation Considerations
Integrating immersion cooling systems into data centers at the design stage requires careful analysis of multiple factors related to immersion tanks and their supporting infrastructure. These primarily include:
Dimensional Considerations
Designers must generate accurate estimates of the hardware sizes and quantities that will be housed in the data center. This, in turn, allows them to project the dimensions of the immersion tanks that will be required to cool them.
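As a rough, hypothetical illustration, the sketch below translates a planned server count into an immersion tank count. Every figure, from the per-server heat load to the per-tank cooling capacity and slot count, is an assumed placeholder to be replaced with the vendor’s actual specifications.

```python
# Rough sizing sketch: translate planned hardware counts into an immersion
# tank count. Per-server heat load and per-tank capacity are assumed,
# illustrative figures; check them against the vendor's specifications.
import math

servers_planned = 480          # assumed number of AI servers to be housed
heat_per_server_kw = 2.5       # assumed average heat load per server
tank_capacity_kw = 200         # assumed cooling capacity per immersion tank
servers_per_tank_fit = 48      # assumed physical slots per tank

total_heat_kw = servers_planned * heat_per_server_kw
tanks_for_heat = math.ceil(total_heat_kw / tank_capacity_kw)
tanks_for_space = math.ceil(servers_planned / servers_per_tank_fit)

tanks_needed = max(tanks_for_heat, tanks_for_space)
print(f"Total heat load: {total_heat_kw:.0f} kW")
print(f"Tanks needed: {tanks_needed} "
      f"(thermal minimum: {tanks_for_heat}, physical minimum: {tanks_for_space})")
```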
Tank Placement
In planning the placement of immersion tanks, designers must consider factors such as:
- Floor space requirements and availability
- Maintenance access
- Proximity to power supplies
Cooling Fluid Circulation Paths
Liquid coolant circulation paths also demand careful planning to ensure heat is transferred efficiently from the computing components to the cooling fluid. The paths require optimization to deliver uniform levels of cooling to all components. Additional considerations include flow direction and strategies for managing potential hot spots.
Safety Clearances
Regulatory requirements stipulate that immersion tank placements account for safety clearances. Tanks must be positioned to allow emergency crews to access the facility and to minimize or eliminate the risk of accidental contact by site personnel.
Electricity Infrastructure
Planning must account for the placement of the power sources that will serve both the immersion tanks and the computing units. These power sources must occupy locations that are safe and accessible.
Sensors and Monitoring Systems
Immersion cooling systems require the precise, round-the-clock monitoring of tank conditions, coolant temperatures, and overall performance. So, designers need to consider the placement of sensors and monitoring equipment during the initial stages of facility planning.
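The minimal sketch below illustrates the idea with a simple polling loop. The read_coolant_temp_c function and the 45°C alert threshold are hypothetical stand-ins; a real deployment would use the tank vendor’s monitoring interface or a building management protocol.

```python
# Minimal monitoring sketch: poll coolant temperature readings and raise an
# alert when a threshold is exceeded. read_coolant_temp_c() is a hypothetical
# stand-in for whatever interface the tank vendor actually exposes.
import random
import time

COOLANT_ALERT_C = 45.0      # assumed alert threshold for coolant temperature
POLL_INTERVAL_S = 10

def read_coolant_temp_c(tank_id: str) -> float:
    """Hypothetical sensor read; replace with the real monitoring interface."""
    return random.uniform(35.0, 48.0)

def poll_tanks(tank_ids: list[str], cycles: int = 3) -> None:
    for _ in range(cycles):
        for tank in tank_ids:
            temp = read_coolant_temp_c(tank)
            status = "ALERT" if temp >= COOLANT_ALERT_C else "ok"
            print(f"{tank}: coolant {temp:.1f} degC [{status}]")
        time.sleep(POLL_INTERVAL_S)

if __name__ == "__main__":
    poll_tanks(["tank-01", "tank-02"])
```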
Expansion and Scalability
Experts widely project AI and data centers to grow at dramatic and interlinked rates in the years ahead. If space permits, facility designers should also build scalability and the possibility of future expansion into their site planning.
Cost Effectiveness and ROI
Cost effectiveness and ROI drive many decisions that impact AI and data centers. First and foremost, businesses must consider the fact that AI deployments are very expensive. They need to maximize their cost effectiveness across every other aspect of the related operations.
Businesses using AI technologies to drive their revenues must address these considerations with added attention and urgency. In these cases, individual components in the AI deployment must be running at peak efficiency on a 24/7/365 basis. After all, they’ll play an ongoing and critically important role in generating income for the enterprise.
To those ends, it’s important for businesses to consider the many ways in which immersion cooling technologies support cost savings with respect to AI and data centers. These include:
Energy Efficiency and Reduced Electricity Costs
Data centers that incorporate immersion cooling as their primary heat management strategy use far less energy than their legacy forced-air counterparts. In addition to dramatically reduced electricity requirements, immersion cooling also uses energy more efficiently.
Consider the following statistics:
- Cooling accounts for approximately 40% of the typical data center’s total energy consumption.
- Air cooling only captures about 30% of the heat emitted by servers.
- Immersion cooling functionally captures 100% of that heat.
What’s more, immersion cooling enables data center operators to remove the internal server fan networks used in forced-air systems. This alone accounts for an energy load reduction of 10–15%.
A 2023 study published in Energy Informatics reported that immersion cooling holds the potential to reduce overall data center electricity consumption by up to 50%. That translates into enormous operational cost savings over the lifespan of a typical AI deployment.
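The back-of-envelope sketch below shows how those figures translate into annual savings. The facility size and electricity price are assumed, illustrative values, and the 50% reduction is the upper bound cited in the study above.

```python
# Back-of-envelope savings estimate using the figures cited above. Facility
# size and electricity price are assumed, illustrative values.
HOURS_PER_YEAR = 8_760

total_load_kw = 2_000            # assumed total facility draw with forced-air cooling
reduction_fraction = 0.50        # "up to 50%" figure from the Energy Informatics study
electricity_price_kwh = 0.10     # assumed average price, USD per kWh

annual_kwh_baseline = total_load_kw * HOURS_PER_YEAR
annual_kwh_saved = annual_kwh_baseline * reduction_fraction
annual_savings_usd = annual_kwh_saved * electricity_price_kwh

print(f"Baseline consumption: {annual_kwh_baseline:,.0f} kWh/year")
print(f"Potential savings:    {annual_kwh_saved:,.0f} kWh/year "
      f"(~${annual_savings_usd:,.0f}/year at $0.10/kWh)")
```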
Superior Heat Dissipation Capacity
Compared to forced-air cooling systems, immersion technologies have much higher heat dissipation capacities. The superior efficiency can reduce or even eliminate the need for adjacent cooling solutions, which further reduces costs.
Space Optimization
Data center operators stand to generate additional savings through space optimization. Immersion cooling supports more compact server designs and layout placements. Data centers can then make smarter and more efficient use of limited floor space.
As the data center’s physical footprint diminishes, so do its operating costs.
Higher Density
Density marks one of the most pressing considerations impacting AI and data centers. The current generation of AI technologies already demands far more density than conventional computing. These density needs will only intensify as artificial intelligence advances and becomes more complex.
Immersion cooling lets data centers pack more computing power into less space. The per-unit cost of computational power falls as a result, giving immersion cooling a far superior cost profile relative to available alternatives.
Increased Lifespan of Computing Hardware
Immersion-based heat management systems allow servers and other forms of computer hardware to operate at lower temperatures for longer periods of time. This reduces overall stress on the components, extending their lifespans and helping operators delay or avoid costly replacements.
What About Capital Investment?
The powerful cost-saving benefits offered by immersion cooling come with caveats. One of the most significant examples relates to up-front capital investment requirements. While they’re much cheaper to operate, immersion cooling systems are relatively expensive to build and implement compared to forced-air strategies. However, those higher up-front investments deliver powerful returns that extend beyond the aforementioned long-term savings on operating expenses and equipment costs.
As AI and data centers proliferate, data center industry experts predict that forced-air cooling systems will become increasingly rare before growing obsolete. So, businesses and data center operators should closely consider whether it makes sense to commit significant capital to cooling technologies that are likely to lose utility in the near future.
An investment in immersion cooling is also an investment in future-proofing. In many cases, it makes more sense for data centers and businesses to invest in immersion technologies now. Doing so may well help them avoid expensive retrofitting projects or facility upgrades in the future.
Future Trends and Innovation
Planning for the AI and data centers of the future demands a careful evaluation of emerging and evolving trends and technologies. To that end, businesses with AI-adjacent operations and data center operators need to consider impending advancements in artificial intelligence. They should also weigh upcoming immersion cooling innovations specifically designed to meet the evolving needs of AI workloads.
AI Technology: Upcoming Advancements and Their Data Center Impacts
Generative AI tools like ChatGPT and Stable Diffusion are only the tip of the iceberg when it comes to where artificial intelligence is heading. In January 2024, TechTarget published a review of how AI technologies are likely to develop during the remainder of the 2020s. The article cited strong, direct impacts on five key industries:
- Education: Experts predict educational content will become heavily personalized and integrated into custom learning plans for individual students.
- Finance: AI is already having a major impact on investing. Specifically, it powers automated trading algorithms and helps traders and investors select investments, manage risks, and execute strategies. As it expands, AI will likely also redefine financial planning, insurance, fraud detection, and consumer credit monitoring.
- Healthcare: Doctors and nurses are expected to integrate AI into diagnostics, treatment planning, and patient monitoring at increasingly robust rates. Predictive AI technologies may also be used to anticipate potential health problems in individual patients. They’ll likely play an increasingly high-profile role in protecting patient data and privacy as well.
- Law: AI technologies could displace lawyers from the labor force in huge numbers as attorneys make increasing use of artificial intelligence to conduct research, draft contracts, and plan litigations and legal arguments.
- Transportation: AI-powered smart grids have long been poised to transform urban transportation systems. Those technologies are finally on the cusp of entering the mainstream. Artificial intelligence is also expected to power advancements in autonomous vehicle technologies, which to this point have remained mired in the middle stages of their development.
These are but a few of many illustrative examples of how AI could become increasingly integrated into everyday life at accelerating rates. This, in turn, creates multiple additional considerations for data center designers and operators.
Density Demands Will Continue Rising
In November 2023, Silicon Valley Power projected that data center loads would nearly double from current levels by 2035. Achieving much higher compute densities appears to be the only way data centers will be able to keep up with future user demands.
Immersion systems optimize space and offer scalability without the need for additional temperature control infrastructure. As such, they pose the clearest path to higher data center densities of any cooling technology currently on the market.
Retrofitting Existing Data Centers
Major tech industry players like Meta and Microsoft are already reimagining their data center operations with AI in mind. As the challenges associated with AI and data centers continue to intensify, many tech insiders believe a wave of facility retrofitting projects will sweep across the industry.
Retrofitting data centers to accommodate immersion cooling is a complex and expensive process. However, the associated capital investments could become unavoidable. Legacy cooling technologies will likely diminish to the point of impracticality as AI and data center workloads soar in the coming years.
Immersion Cooling Will Go Mainstream
Given the vast scope and fast pace of change that AI will force, immersion cooling is likely to emerge as a standard feature of next-generation data centers. Of course, the technology is in its relative infancy and is currently considered somewhat exotic. But impending innovations are poised to guide its continued growth as AI workloads continue to intensify in data centers globally.
Immersion Cooling Innovations for AI Workloads
Tech-oriented businesses continue to actively seek AI angles in an ongoing bid to capitalize on the technology’s promise. Such efforts mark a significant driver of the continued growth and evolution of AI workloads.
Immersion cooling technologies are also advancing to meet the data center industry’s changing needs. Some particularly notable emerging developments and innovations include:
Expansion Into Multi-Tenant Colocation Centers
Many data center industry experts believe that the relationship between edge computing, AI, and data centers will spark significant growth in immersion cooling’s adoption rates. While edge computing isn’t an immersion cooling technology in itself, it represents a potentially major driver of immersion cooling’s growth.
Edge computing is mainly used in multi-tenant colocation centers, which are typically situated in or on the periphery of the cities where tenants are based. It facilitates the processing of client data closer to its origin point. This reduces latency, improves bandwidth efficiency, and boosts the performance of applications that require fast data processing speeds. Edge computing also offers privacy, security, reliability, and resilience benefits.
Conventional forced-air cooling systems aren’t feasible for edge deployments. Edge computing facilities are typically located in urban areas, where space is at a premium. It’s difficult to accommodate the large and resource-intensive infrastructure that air-based cooling strategies require.
What’s more, dust and other particles tend to be a bigger problem in data centers located in urbanized areas. Air cooling circulates these contaminants around facilities, creating performance risks and the potential for equipment damage. Large-scale air filters offer a solution, but they’re very resource-intensive to clean and maintain.
For these reasons, industry insiders widely consider immersion cooling to be a much better option for edge deployments. With AI and data centers poised to drive huge growth in edge-enabled colocation facilities, immersion technologies could see huge growth as a result.
Forced Convection Heat Sinks
Industry innovators recently paired forced convection heat sinks with single-phase immersion cooling. In doing so, they achieved significant performance improvements. This also foretells another potential direction that liquid cooling’s ongoing advancement might take.
Legacy forced convection heat sinks use air, which is circulated over the heat sink’s surface to accelerate the rate of heat transfer. But engineers noted that the specialized fluids used in immersion cooling can absorb up to 1,000 times as much heat as air. So, they adapted the standard forced convection heat sink model to integrate liquid coolants.
The promising results broke established performance barriers. It’s a clear sign of how system design, engineering innovation, and emerging coolant technologies can come together to take immersion cooling to new heights of utility and feasibility.
AI and Data Centers Are Evolving. Are Your Cooling Solutions?
AI deployments are poised to put immense pressure on data center infrastructure. Major improvements in rack density represent one of the only feasible solutions, but conventional cooling methods simply can’t deliver them.
Data centers need to evolve, and innovation is a pressing necessity. Immersion technology offers an ideal solution to the cooling needs of the data centers of tomorrow. It directly supports extreme data center density by optimizing space. At the same time, it supports faster, more efficient cooling and the advanced hardware performance that artificial intelligence demands.
GRC has gained widespread industry recognition as a leading force in immersion cooling technology. Our immersion systems can help dramatically improve your data center’s performance and data processing capabilities. To discuss your cooling needs in detail, contact us today.