Modus

Cool runnings: making data centres more energy efficient

Designers and engineers are finding new ways to keep the world’s ever-growing data centres from overheating, without drastically raising their emissions

Author:

  • Stuart Watson

16 March 2023


Our hunger for computing power is endless. In the digital age, every new application of technology means more data gathered, stored, and transmitted. Every byte of that data has a physical home somewhere within a server in a data centre, and this is an increasingly pressing issue because data centres use a lot of power.

Estimates vary on just how much of the world's total energy supply is consumed by data centres. The consensus among many experts is between 1% and 1.5%, and in some regions it may be more. A 2020 EU Commission study estimated that data centres accounted for 2.7% of the EU's total electricity demand in 2018 and predicted that the figure would rise to 3.2% by 2030.

As artificial intelligence, cloud services, edge computing, the internet of things (IoT), and other transformational digital technologies take hold, the amount of power needed, and consequently the carbon footprint of data centres, is likely to grow. To keep power demand from spiralling out of control, data centre operators and developers have increasingly concentrated on making their facilities more efficient.

“The industry is hugely focused on carbon reduction,” says Andrew Jay MRICS, head of data centre solutions for EMEA at CBRE. “The big occupiers of data centres, the ‘hyperscalers’ like Amazon Web Services, Google and Microsoft, have very strong emissions reduction commitments. Then the cost side is hugely important to them as well, and not just because energy prices have gone up in the past 12 months. We have been on this path for years.”

Using power effectively

The crucial measure of data centre efficiency is Power Usage Effectiveness (PUE) – the ratio of the total power consumed by a data centre to the power consumed by the IT equipment. The more efficient the facility, the closer the PUE is to 1. Typically, around 40% of a data centre’s total power consumption is used for cooling. Servers produce a lot of heat and must be kept cool to avoid malfunction. Cooling is therefore a key battleground in the struggle to improve efficiency.
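The PUE ratio described above is simple to compute. A minimal sketch, using illustrative figures rather than data from any specific facility:

```python
def pue(total_power_kw: float, it_power_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power.

    A perfectly efficient facility, where every watt goes to the IT
    equipment, would score exactly 1.0.
    """
    if it_power_kw <= 0 or total_power_kw < it_power_kw:
        raise ValueError("total power must be >= IT power > 0")
    return total_power_kw / it_power_kw

# A facility drawing 1,500 kW in total to power 1,000 kW of IT load:
print(pue(1500, 1000))  # 1.5 - in line with what the article calls common today
```

The 500 kW gap between the two figures is the overhead (cooling, power distribution, lighting) that efficiency efforts aim to shrink.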

The older generation of data centres used traditional air conditioning to cool the air around the server racks. However, this type of equipment is very power-hungry and occupies a lot of space. Over the past decade, designers and engineers have introduced a range of innovations that have gradually pushed PUE downwards. The European Commission report noted an improvement in the average PUE value across the EU from 2.03 in 2010 to 1.75 in 2018. A PUE of 1.5 or 1.6 is now common, and the most efficient data centres are achieving a standard equal to or better than 1.2.

As server technology has advanced, data centres have been able to run at higher temperatures. This has opened up more opportunities to optimise air temperatures and use ‘free cooling’, essentially letting cold air from outside into the data centre. Many modern data centres employ adiabatic cooling, a system which combines evaporative cooling – using a fine mist of water to cool the ambient air – with air cooling. The system uses a lot of water though, typically more than 500,000 litres per megawatt per year, which can be a drawback in water-scarce environments.
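The water figure quoted above adds up quickly at scale. A rough sketch, assuming a hypothetical 20 MW facility and the article's lower-bound consumption rate:

```python
# Lower-bound adiabatic cooling water use cited in the article.
LITRES_PER_MW_PER_YEAR = 500_000

# Illustrative facility size - not a real site.
capacity_mw = 20

annual_litres = capacity_mw * LITRES_PER_MW_PER_YEAR
print(f"{annual_litres:,} litres/year")  # 10,000,000 litres/year
```

Ten million litres a year for one mid-sized site illustrates why operators in water-scarce regions look for alternatives.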

Improving cooling efficiency

There are three main ways to improve cooling efficiency, says John Nicolaou, a UK-based partner at consultant Rider Levett Bucknall, who has overseen cost management on several data centre projects: allowing the temperature to rise slightly, improving the air flow through computational fluid dynamics analysis, or decreasing the density of the servers. The trend within the industry is for increased power density, however. “It used to be two, three or four kilowatts per server cabinet,” says Nicolaou. “Now you’re looking at up to 10 kilowatts, and it will continue to increase. High-density racks and more power equals more cooling. The trend is for PUE to go down. But cooling still represents a significant chunk of the energy used, and it is a large contributor to the overall carbon footprint of the digital environment.”
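Nicolaou's point that denser racks mean more cooling can be made concrete with a back-of-the-envelope sketch. The rack count, PUE, and cooling share below are illustrative assumptions (the ~40% cooling share echoes the figure quoted earlier in the article):

```python
PUE = 1.6            # assumed facility-wide ratio, typical of current builds
COOLING_SHARE = 0.40  # cooling's rough share of total facility power
racks = 200           # illustrative rack count

for kw_per_rack in (4, 10):  # older vs newer cabinet densities from the quote
    it_kw = racks * kw_per_rack
    total_kw = it_kw * PUE
    cooling_kw = total_kw * COOLING_SHARE
    print(f"{kw_per_rack} kW/rack: IT {it_kw} kW, cooling ~{cooling_kw:.0f} kW")
```

Holding PUE constant, moving from 4 kW to 10 kW cabinets multiplies the absolute cooling load by 2.5, which is why cooling remains the key battleground even as the ratio improves.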

Data centre operators are seeking colder environments within which to locate their facilities to take advantage of the opportunity for free cooling. Within Europe, the Nordic countries are increasingly popular, says Ben Stirk MRICS, co-head of global data centres at Knight Frank. “You can open up the sides of the building and have adiabatic cooling all year round, which reduces PUE to a really good level. Norway, Sweden, Finland and Denmark have the cooling advantages, and also cheap renewable power, which builds a good model for the hyperscale data centres. Iceland is also a particularly interesting new market. But they are not suitable for every user.”

While remote and cold areas are favourable for large regional cloud data centres, some facilities need to be located close to the companies or individuals accessing the data. On occasion that is because of data protection and data sovereignty regulations; governments, in particular, like to keep their data within their own borders. Another issue is latency: the time it takes for data to pass from one point to another. Fractions of a second can be crucial for trading floors, as well as for gaming and video streaming services.

It can also be difficult to run construction projects in far-flung locations, says Nicolaou. “They can be a challenge, not just because of the climate, but because you have to get contractors out there, and they must be able to work in those environments. It can be expensive, so clients have to balance the extra capital expenditure with the saving on operational expenditure.”

“The big occupiers of data centres have very strong emissions reduction commitments” Andrew Jay MRICS, CBRE

Aquatic data centres

An even more extreme solution is to immerse the entire data centre in the ocean. In 2018 Microsoft trialled the concept by sinking a pod to the bottom of the sea off the Orkney Islands, retrieving the still-working servers two years later. It concluded that the consistently cool subsurface seas allow for energy-efficient data centre designs, for example by employing heat-exchange plumbing such as that found on submarines.

The cutting edge of data centre temperature regulation is liquid cooling technology. Immersion cooling involves submerging the computer hardware in a tub of non-conductive, non-flammable liquid, which absorbs heat much more efficiently than air. Direct-to-chip cooling uses pipes to deliver liquid coolant to a cold plate sitting above a motherboard’s chips.

Neither is yet in widespread use, and the techniques are regarded by some as currently too expensive for all but the most intensive science-related data crunching. Nevertheless, technology company Itrium is currently developing what it claims will be the world’s first fully immersed data centre, together with connected offices, in western Paris. Jessica Starey, an architect at Trace International, is designing the scheme, which she says will feature an ultra-low PUE of just 1.02.

“The advantages of this technology are we don't need water anymore, we don't need a big surface area – it reduces the building footprint by 30% – and there is no noise or dust, so that allows us to bring the data centre next door to other building types,” she says.

Stephane Duclaux MRICS is technical director at public-private economic development body Île de France Investissement et Territoires, which is supporting the project. He says proximity to other uses will allow such new-model data centres to supply waste heat to housing or district heating systems. “In future, instead of a few big buildings, there will be lots of smaller data centres, constructed by lots of different builders, not just specialists,” he predicts. “They will not only be better in terms of energy usage but having more of them will also make the infrastructure more secure.”

Stirk concludes that the data centre sector can perform a similar test-bed role for real estate to that which Formula 1 does for the automotive industry. “There is so much capital expenditure and technology involved that you have to be pushing the boundaries,” he says. “Why shouldn’t highly efficient cooling technology be replicated for industrial use? You have extremely clever people coming up with ideas and putting them into practice that can be used in other areas of real estate as well.”

“In future, instead of a few big buildings, there will be lots of smaller data centres” Stephane Duclaux MRICS

 
