Data centres: embracing the impact of AI

As AI fuels the need for ever more data, we require data centres to store and process all that information securely. Businesses that contribute to the efficiency of data centres will have the upper hand.

The rise of artificial intelligence (AI) applications is fuelling the need for computational power and storage capacity. It is paramount for businesses, individuals, and governments that all this data is processed and stored securely. That means growing demand for sophisticated data centres, which we believe presents attractive investment opportunities – both in the data centres themselves and in the infrastructure on which they rely.

Data centres have become crucial in meeting these escalating requirements by providing the infrastructure necessary to handle massive AI workloads. In the past decade, developers have steadily increased the capacity of new colocation data centres – which house the servers and networking equipment of many companies in a shared facility – as well as of hyperscale ones – purpose-built facilities designed to accommodate the massive scale and high-performance requirements of large technology companies and cloud service providers.

The need for ever more capacity is set to increase further with the rise of Generative AI (GenAI), leading to a new cycle of capital expenditure (capex) from the hyperscale companies and demanding substantial investments in data centre infrastructure for several years to come. In order to continue storing information in a safe and secure way, the data centre infrastructure will need to evolve to cope with the extra power required by AI and ensure data is not lost due to overheating, power outages or fires.

Training vs inference

There are two key AI workloads, each with distinct data centre requirements:

  • AI training: Building a model to recognise patterns and make predictions based on input data. Training is less latency-sensitive and can be carried out effectively in a relatively isolated setting – for example, a data centre situated in a rural area, benefitting from lower land costs.
  • AI inference: Generating predictions or outputs from the knowledge acquired from the input data (e.g. ChatGPT). It demands exceptional performance and minimal latency to enable real-time interaction between end users and applications. To meet these stringent requirements, inference workloads are best served by data centres located in urban environments, close to end users.

Although AI training models are currently more in the news, we believe the opportunity for colocation operators lies in AI inferencing, whose addressable market could prove to be some 10 to 15 times larger than that for AI training.1 Inference data centres require half the power density of AI training facilities and will likely need to be replicated in 20-30 locations across the world.

Power demand: a bottleneck for data centres

Data centre construction faces two critical challenges: land availability and power constraints. As a result, demand is far outstripping available supply – a trend which we expect to continue for the foreseeable future.

Power availability is particularly problematic since utilities are typically equipped to handle linear increases in demand, whereas the step-function increases required by AI data centres pose a unique challenge (see Fig. 1). In 2023, the global data centre market reached 60 GW of power consumption. This is expected to roughly double to 122 GW by 2027, representing a compound annual growth rate (CAGR) of around 20 per cent.2

Fig. 1 - Hungry for power

Data centre power supply and demand, actual and forecast, GW

Source: Oppenheimer & Co Inc, data covering period 01.01.2021-31.12.2023. Forecasts for subsequent years based on historical data.
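
As a rough cross-check of the growth figures above, the implied compound annual growth rate can be derived directly from the start and end values. The short sketch below (in Python, purely for illustration) uses the 60 GW and 122 GW figures quoted in the text and assumes a four-year horizon from 2023 to 2027.

# Rough cross-check of the cited growth rate (illustrative only).
# The 60 GW (2023) and 122 GW (2027) figures come from the text;
# the four-year horizon is assumed from those dates.
start_gw = 60
end_gw = 122
years = 2027 - 2023
cagr = (end_gw / start_gw) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~19.4%, consistent with "around 20 per cent"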

The adoption of AI applications is one of the key reasons for the surge in data centres’ power demand. The energy density3 of AI servers is significantly higher than that of traditional central processing unit (CPU) servers: AI workloads rely on graphics processing unit (GPU) servers, which require five times more power than traditional CPU servers and in turn generate five times more heat.4

It is estimated that 80 per cent of data centre power will be consumed by AI over the next 15 years, making access to power a key differentiator.5

Data centres currently have an average power density of around 10 kW per rack. Because AI applications require higher power density due to their use of GPUs, hyperscale data centres anticipate that the average will rise to 40-50 kW per rack in the coming years.6
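
To illustrate how GPU-based servers push rack densities from today’s roughly 10 kW towards the 40-50 kW range, the sketch below simply multiplies an assumed per-server power draw by an assumed number of servers per rack. Only the five-times power ratio between GPU and CPU servers comes from the text; the other figures are hypothetical.

# Illustrative rack power density estimate. The per-server wattage and
# servers-per-rack figures are assumptions for the sake of example;
# the article only states that GPU servers draw ~5x the power of CPU servers.
cpu_server_kw = 0.8                  # assumed draw of a traditional CPU server
gpu_server_kw = cpu_server_kw * 5    # ~5x the power, per the text
servers_per_rack = 10                # assumed

cpu_rack_kw = cpu_server_kw * servers_per_rack
gpu_rack_kw = gpu_server_kw * servers_per_rack
print(f"CPU rack: ~{cpu_rack_kw:.0f} kW, GPU rack: ~{gpu_rack_kw:.0f} kW")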

As AI adoption increases, data centre operators will need to upgrade their electrical infrastructure (i.e. power management from the local electrical grid to the chip), but this can take several years to implement. One way to speed up the process is to build in areas where the required infrastructure already exists, even if it means retrofitting or replacing existing structures.

The future of data centre design

The higher power usage will also require more heating, ventilation, and air conditioning (HVAC) equipment. Data centres will thus need to invest in their thermal systems (e.g. cooling systems, airflow management).

That’s because AI applications generate substantial heat during operation, requiring data centres to maintain optimal operating temperatures to prevent hardware failures and ensure reliable performance.

Fig. 2 - Liquid cooling

Data centre thermal management total addressable market, USD bn

Source: Dell’Oro, Liquid cooling forecast, 2023-2028E.

Today, most data centres use air cooling. But the rising density of servers is surpassing the capabilities of air-based systems, which lose their effectiveness beyond around 15-25 kW per rack. As power density increases beyond that limit, operators will need to start looking at liquid cooling technologies as the more viable option.
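
One way to express the thresholds above is as a simple rule of thumb that maps rack power density to a cooling approach. In the sketch below, the cut-off values are taken from the 15-25 kW air-cooling range cited in the text; the category labels and exact boundaries are illustrative simplifications rather than an industry-standard taxonomy.

# Rule-of-thumb mapping from rack power density to a cooling approach,
# based on the 15-25 kW air-cooling limit cited in the text.
# The labels and exact cut-offs are illustrative simplifications.
def suggested_cooling(rack_kw: float) -> str:
    if rack_kw <= 15:
        return "air cooling"
    if rack_kw <= 25:
        return "air cooling with enhanced airflow management"
    return "liquid cooling (e.g. direct-to-chip or immersion)"

for density in (10, 20, 45):
    print(f"{density} kW/rack -> {suggested_cooling(density)}")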

Until recently, data centre design has kept up with power density requirements through localised cooling techniques. Now there is a greater focus on larger scale liquid cooling solutions to accommodate the greater rack densities needed for GenAI, though many data centre operators are not yet changing their baseline designs, which they find can meet current AI requirements. As GenAI becomes more embedded into our daily lives, we expect to see an increased focus on cooling and retrofitting of older facilities.

As a result, the total addressable market (TAM) for data centre liquid thermal management could increase by some five times in the next five years (Fig. 2). Companies providing cooling systems – or relevant equipment – could thus make interesting investments.

The importance of effective cooling systems for data security was highlighted by the 2021 data centre fire in Strasbourg, which disrupted millions of websites, including government ones, and resulted in substantial data loss.

A tailwind for the industry

We already saw a glimpse of the growing demand in 2023, an unprecedented year in the history of data centres. Over 6 gigawatts (GW) of incremental capacity was leased globally, with the majority, around 4 to 5 GW, in North America.8 The leasing volume was twice the amount recorded in 2022, which was already an all-time high, and nearly eight times the volume observed in 2019.9

The supply constraints and strong demand have in turn driven up rents for the data centre players. Following an 18.6 per cent year-over-year rise in rents in 2023, real estate experts CBRE expect another double-digit percentage increase in 2024 (Fig. 3).

Fig. 3 - Rental income

Average asking rental rate for data centres in primary markets, USD/kW/month, and y/y % change

Source: CBRE Research, 2023. Rental rates are quoted asking rates for 250-500 kW for Tier 3 data centres (with multiple paths for power and cooling systems) the following year. 

The data centre sector is set to experience exponential growth in the coming years, driven mainly by the widespread adoption of AI technologies and the need for extensive data processing capabilities and high-performance computing infrastructure. Additionally, demand for effective cooling solutions, such as liquid cooling, will rise to ensure the reliability and efficiency of AI hardware.

Data centre companies that can adapt and scale their infrastructure to meet these evolving demands will be well-positioned to harness the deployment of AI. By embracing these changes and seizing this opportunity, they can play a pivotal role in shaping the future of innovation – and in ensuring the security of the data they store.

Data centres play a vital role in the protection of critical assets for companies, making them an attractive segment in the investment universe of Pictet Asset Management’s Security strategy.

1 Wells Fargo, Generative AI Brings Opportunity to Data Centre Future
2 Morgan Stanley Research estimates
3 Energy density refers to the amount of power consumed or required per unit of space within the facility. It measures how much energy is needed to operate the servers, cooling systems, and other equipment in relation to the available floor area.
4 Vertiv Capital Market Day 2023
5 DigitalBridge Group CEO Marc Ganzi, Q2 2023 earnings call
6 JLL Data Centres 2024 Global Outlook
7 The Green Grid
8 Wells Fargo, Data Centre Demand Hits New Heights in Q4 2023 Industry Flash
9 Wells Fargo, Data Centre REITs: 2024 Outlook
