- AI computing has a sustainability problem.
- Tech is still considered environmentally sound for investors. The truth is much dirtier.
- A UW professor presents solutions from mesh networks to indigenous data sovereignty.
Big Tech's reputation as a safe bet for environmental, social, and governance investors and sustainability-minded consumers is clashing with a new reality as industry giants Microsoft, Amazon, Google, and Meta develop and deploy AI capabilities.
AI is becoming more of an environmental hazard, from the power-hungry GPUs used to train models to new data centers that consume huge amounts of electricity and water.
Amazon's data center empire in northern Virginia uses more electricity than the entire grid that serves the company's hometown of Seattle. Google data centers consumed 5.2 billion gallons of water in 2022, up 20% from a year earlier. Meta's Llama 2 model is thirsty, too.
Major players in the race for AI tout how they are offsetting this increased environmental burden with programs such as Microsoft's commitment that its Arizona data centers will use no water for more than half the year, or Google's pledge, announced alongside a recent partnership with AI chip giant Nvidia, to replenish 120% of the freshwater its offices and data centers use by 2030.
These efforts may be mostly clever marketing, according to Adrienne Russell, co-director of the Center for Journalism, Media, and Democracy at the University of Washington.
"There has been this long and concerted effort by the tech industry to make digital innovation seem compatible with sustainability and it's just not," she told Insider.
She cited the shift to cloud computing and the marketing of Apple's products as examples of tech companies seeking to be associated with counterculture, freedom, digital innovation, and sustainability.
This marketing spin is already being used to present AI in a better environmental light.
During Nvidia's second-quarter earnings report in August, CEO Jensen Huang presented AI-led "accelerated computing" (what his company sells) as cost- and energy-efficient compared with "general-purpose computing," which he suggested was more expensive and worse for the environment.
The data suggests the opposite is true. A recent Cowen research report estimated that AI data centers could require more than five times the power of traditional facilities. The GPUs that power AI servers, usually supplied by Nvidia, each consume up to about 400 watts, so a single AI server can draw around 2 kilowatts, while a regular cloud server uses 300 to 500 watts, according to Shaolei Ren, a researcher at UC Riverside who has studied how modern AI models use resources.
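Those server-level figures can be sanity-checked with simple arithmetic. The sketch below uses only the wattages quoted above; the GPU count per server is an assumption inferred from the roughly 2-kilowatt figure, not a number from the reporting:

```python
# Back-of-the-envelope comparison of AI vs. conventional server power draw,
# using the approximate figures cited in the article.

GPU_WATTS = 400          # per-GPU draw cited by researcher Shaolei Ren
GPUS_PER_AI_SERVER = 5   # assumption: implied by the ~2 kW AI-server figure

ai_server_watts = GPU_WATTS * GPUS_PER_AI_SERVER   # ~2,000 W per AI server
cloud_server_watts = (300, 500)                    # typical cloud server range

# How many times more power an AI server draws than a conventional one
ratios = [ai_server_watts / w for w in cloud_server_watts]
print(f"AI server: ~{ai_server_watts} W")
print(f"Roughly {min(ratios):.1f}x to {max(ratios):.1f}x a cloud server")
```

The resulting four- to seven-fold gap at the individual server level is consistent in direction with Cowen's facility-level estimate of more than five times the power of a traditional data center.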
"There are things that come carted along with this, not true information that sustainability and digital innovation go hand-in-hand, like 'you can keep growing' and 'everything can be scaled massively, and it's still fine' and that one type of technology fits everyone," Russell said.
The momentum around AI, and its environmental footprint, is likely to grow as companies look to weave large language models into more of their operations.
Russell thinks a better approach would be to focus on other, more sustainable innovations, such as mesh networks and indigenous data-sovereignty initiatives, in which communities set up data privacy controls and internet connectivity on their own terms, in ways that rely less on big tech companies.
"If you can pinpoint the examples, however small, of where people are actually designing technology that's sustainable then we can start to imagine and critique these huge technologies that aren't sustainable both environmentally and socially," she said.