How Can Water Be Conserved While Meeting AI Innovation Demand?

Earlier this year, we blogged about how essential data centres are in facilitating the level of connectivity required for 21st-century life to continue as we know it, as well as in driving the emergence of innovative new technology.


Servers, storage devices, networking equipment and the like are all absolute necessities these days, managing and distributing the vast swathes of data transferred between devices and locations around the globe every second of every day.


But this level of connectivity certainly doesn’t come without a price. Data centres generate huge amounts of heat as they operate, which means they need to be equipped with advanced cooling systems, such as industrial air conditioning, ventilation and liquid cooling, to maintain constant temperatures and prevent overheating.


Not only does all this cooling consume a significant amount of energy, but billions of cubic metres of water are required to keep these centres up and running… and with the advent of artificial intelligence (AI), data centre water usage and consumption are only going to increase, at a time when natural resources are already under mounting pressure from climate change, population growth, water mismanagement and pollution.


A recent report from the non-profit organisation China Water Risk found that, as AI adoption continues to increase and services like chatbots become de rigueur, these facilities and others like them could require more than 20 times the amount of water they currently use.


As such, the question must now be asked: How exactly can water be conserved during data centre operations while ensuring that growing customer demand for AI innovation and cloud technology is met?


Does Microsoft have the answers?


At the start of June, tech giant Microsoft published its data centre community pledge to build and operate digital infrastructure that tackles societal challenges head-on, while creating benefits and opportunities for local communities.


The pledge focuses on three key areas to contribute to a sustainable future: running carbon-negative, water-positive and zero-waste data centres by 2030; advancing community prosperity and wellbeing; and partnering closely with communities to operate in a way that respects the local environment.


To help it achieve this promise, Microsoft is now asking just how water can be saved even as AI adoption continues to grow.


It explains that the last few years have seen significant growth in AI applications and a surge in demand for high-performance cloud capabilities, which has increased the power requirements for silicon chips in data centres.


As these chips use more power, they also generate more heat, which means that more intensive cooling is required – and more water is consumed.

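To put rough numbers on that relationship, here is a quick back-of-envelope calculation. The figures are standard physics values plus illustrative assumptions, not data from Microsoft or any particular facility; the point is simply that when heat is rejected by evaporating water, as in cooling towers, every kilowatt-hour of chip heat carries a water cost:

```python
# Illustrative physics only; not figures from Microsoft or any real facility.
heat_to_reject_kwh = 1.0        # one kilowatt-hour of chip heat
kwh_to_mj = 3.6                 # 1 kWh equals 3.6 megajoules
latent_heat_mj_per_kg = 2.26    # energy absorbed per kg of water evaporated

# Mass of water evaporated to carry away the heat (1 kg of water ~ 1 litre).
water_litres = heat_to_reject_kwh * kwh_to_mj / latent_heat_mj_per_kg
print(f"~{water_litres:.1f} litres evaporated per kWh of heat rejected")
# Roughly 1.6 L/kWh, before blowdown and other system losses are counted.
```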

Now that Microsoft has pledged to become water positive by 2030 – which means that it will need to put more water back into the natural environment than it abstracts – it is looking to see what innovations can be embraced to reduce the water required for its data centres.


Since its first generation of owned data centres fired up back in the early 2000s, the company has succeeded in reducing its water intensity (water consumed per kilowatt-hour) by more than 80 per cent.

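As a minimal sketch of what that metric means in practice (using entirely hypothetical figures, not Microsoft’s published numbers), water intensity is simply the litres of water consumed divided by the kilowatt-hours of energy used:

```python
# Hypothetical figures for illustration; not Microsoft's published data.
annual_water_litres = 60_000_000    # cooling water consumed in a year
annual_energy_kwh = 120_000_000     # energy used over the same period

intensity = annual_water_litres / annual_energy_kwh    # litres per kWh
print(f"Water intensity: {intensity:.2f} L/kWh")       # 0.50 L/kWh

# A reduction of more than 80 per cent means the same energy use now
# consumes less than a fifth of the water it once did.
improved = intensity * (1 - 0.80)
print(f"After an 80% reduction: {improved:.2f} L/kWh") # 0.10 L/kWh
```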

To achieve this, Microsoft has worked to minimise the amount of water required for cooling across all its locations, including operating data centres at temperatures that allow outdoor air to be used for cooling for the majority of the year, which drives down the need for water-intensive evaporative cooling and helps save water day by day.


Regular audits of its centres are carried out to identify weak and inefficient areas, with its 2022 review leading to targeted improvements that successfully eliminated 90 per cent of instances where excess water was being consumed.


Furthermore, advanced prediction models are now being built that will help Microsoft anticipate its water requirements based on incoming operational and weather data in real time.

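Microsoft hasn’t published the details of those models, but the general shape of the problem is a familiar forecasting task: estimate tomorrow’s cooling-water demand from weather conditions and planned IT load. A minimal sketch, assuming hypothetical sensor readings and using the widely available scikit-learn library, might look like this:

```python
# A minimal sketch of the general idea, not Microsoft's actual models.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical history: [outdoor temp (C), humidity (%), IT load (MW)]
features = np.array([
    [12.0, 70.0, 38.0],
    [18.0, 55.0, 40.0],
    [27.0, 60.0, 42.0],
    [33.0, 40.0, 45.0],
])
daily_water_m3 = np.array([120.0, 210.0, 480.0, 690.0])  # observed usage

model = LinearRegression().fit(features, daily_water_m3)

# Feed in tomorrow's weather forecast and planned load to anticipate demand.
tomorrow = np.array([[30.0, 45.0, 44.0]])
print(f"Predicted water demand: {model.predict(tomorrow)[0]:.0f} m3")
```

In practice such models would be trained on far richer telemetry, but even this toy version shows how live weather feeds can translate into an operational water forecast.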

Tailoring its conservation strategies to specific locations is another tactic that’s proving beneficial: in Texas, Singapore, Washington and California, for example, the use of reclaimed and recycled water has been expanded, while in Ireland, Sweden and the Netherlands, rainwater harvesting is the priority.


Elsewhere, innovative cooling technologies are now being adopted as a key part of the brand’s water strategy, such as cold plates, which use direct-to-chip cooling to provide heat exchange in a closed-loop system.


This approach dissipates heat more effectively than traditional air cooling, chilling the silicon directly and then recirculating the cooling fluid, much as a car radiator does.

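As a back-of-envelope illustration of how such a loop might be sized (the chip wattage and temperature rise below are assumptions for illustration, not vendor specifications), the required coolant flow follows from the basic heat-transfer relation Q = flow × specific heat × temperature rise:

```python
# Illustrative sizing for a closed-loop cold plate; all figures are assumed.
chip_heat_w = 700.0    # heat output of a high-power AI accelerator (assumed)
cp_water = 4186.0      # specific heat of water, J/(kg*K)
delta_t = 10.0         # coolant temperature rise across the plate, K

# Q = m_dot * cp * dT, rearranged for the mass flow needed to remove the heat.
mass_flow_kg_s = chip_heat_w / (cp_water * delta_t)
litres_per_min = mass_flow_kg_s * 60    # 1 kg of water is roughly 1 litre
print(f"Required coolant flow: {litres_per_min:.1f} L/min per chip")
# Crucially, the fluid circulates in a sealed loop: it is reused, not evaporated.
```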

To help make this way of working even more efficient, a new generation of data centre designs is now being developed that is optimised for direct-to-chip cooling. This involves changing the layout of servers and racks to make room for new thermal and power management methods.


In existing centres, ‘sidekick’ liquid cooling systems are now being retrofitted to circulate fluid that draws heat away from the cold plates attached to the surface of the chips.


As Microsoft observes: “Our newest data centre designs are optimised to support AI workloads and consume zero water for cooling. To achieve this, we’re transitioning to chip-level cooling solutions, providing precise temperature cooling only where it’s needed and without requiring evaporation.


“With these innovations, we can significantly reduce water consumption while supporting higher rack capacity, enabling more compute power per square foot within our data centres.”


Are you inspired to reduce your water footprint?


If you’d like to follow in Microsoft’s footsteps and do more to become water positive, get in touch with the SwitchWaterSupplier.com team to find out more about water efficiency and what can be achieved.