Water Wars: Cooling the Data Centres

Water.  Data centres.  The continuous, pressing need to cool the latter, which house servers to store and process data, with the former, which is becoming ever more precious in the climate crisis.  Hardly a good commingling of factors.

Like planting cotton in drought-stricken areas, decisions to place data hubs in various locations across the globe are becoming increasingly contentious from an environmental perspective, and not merely because of their carbon-emitting propensities.  In the United States, which houses 33% of the globe’s data centres, the problem of water usage is becoming acute.

As the Washington Post reported in April this year, residents in Mesa, Arizona were concerned that Meta’s decision to build another data centre was bound to cause more trouble than it was worth.  “My first reaction was concern for our water,” claimed city council member Jenn Duff.  (The state already has approximately 49 data centres.)

The move from air cooling to liquid cooling for increasingly complex IT processes has been relentless.  As the authors of a piece in the ASHRAE Journal from July 2019 explain, “Air cooling has worked well for systems that deploy processors up to 150 W, but IT equipment is now being manufactured with processors well above 150 W where air cooling is no longer practical.”  Liquid cooling was not only more efficient than air cooling at transferring heat, but also “more energy efficient, reducing electrical energy costs significantly.”  The authors, however, showed little concern about the water supplies needed in such ventures.

The same cannot be said of a co-authored study on the environmental footprint of US-located data centres published two years later.  During their investigations, the authors identified a telling tendency: “Our bottom-up approach reveals one-fifth of data center servers’ direct water footprint comes from moderately to highly stressed watersheds, while nearly half of servers are fully or partially powered by power plants located within water stressed regions.”  And to make things just that bit less appealing, the study found that roughly 0.5% of total US greenhouse gas emissions could also be attributed to such centres.

Google has proven particularly thirsty in this regard, not to mention secretive about the amount of water it uses at its data hubs.  In 2022, The Oregonian/Oregon Live reported that the company’s water use in The Dalles had almost tripled over five years.  The increased usage was enabled, in no small part, by expanded access to the municipal water supply, granted in return for an upgrade to the city’s water system and a transfer of certain water rights.  Since establishing its first data centre in The Dalles in 2005, Google has also received tax breaks worth $260 million.

The city officials responsible for the arrangement were in no mood to answer questions posed by the inquisitive paper about Google’s water consumption.  A prolonged 13-month legal battle ensued, with the city arguing that the company’s water use constituted a “trade secret” exempt from Oregon’s disclosure rules.  Disclosing such details, Google argued, would have revealed to eager competitors how the company cooled its servers.

In the eventual settlement, The Dalles agreed to provide public access to 10 years of historical data on Google’s water consumption.  The city also agreed to pay $53,000 to the Reporters Committee for Freedom of the Press, which had agreed to represent The Oregonian/Oregon Live.  The city’s own costs had run to $106,000.  But most troubling in the affair, leaving aside the lamentable conduct of public officials, was the willingness of a private company to bankroll a state entity in preventing access to public records.  Tim Gleason, former dean of the University of Oregon’s School of Journalism and Communication, saw this distortion as more than just a touch troubling.  “To allow a private entity to essentially fund public advocacy of keeping something out of the public domain is just contrary to the basic intent of the law.”

Instead of conceding that the whole enterprise had been a shabby affront to local residents concerned about the use of a precious communal resource, one compromising both the public utility and Google, the company’s global head of infrastructure and water strategy, Ben Townsend, proved benevolent.  “What we thought was really important was that we partner with the local utility and actually transfer those water rights over to the utility in a way that benefits the entire community.”  That’s right, dear public, they’re doing it for you.

John Devoe, executive director of the advocacy group WaterWatch, also issued a grim warning in the face of Google’s ever-increasing water use, which will burgeon further with two more data centres promised along the Columbia River.  “If the data center water use doubles or triples over the next decade, it’s going to have serious effects on fish and wildlife on source water streams, and it’s potentially going to have serious effects for other water users in the area of The Dalles.”

Much of the policymaking in this area is proving increasingly shoddy.  With global demand for ever more complex information systems, including AI, continuing to grow, the Earth’s environment promises to be stripped further.  Information hunger risks becoming a form of ecological licence.

 

Dr. Binoy Kampmark was a Commonwealth Scholar at Selwyn College, Cambridge.  He currently lectures at RMIT University.  Email: bkampmark@gmail.com
