[Adam Wierman and Shaolei Ren in IEEE Spectrum]
An interesting finding on the energy use involved in training and serving AI. I do think some of these principles could apply to all of cloud computing - it’s out of sight and out of mind, but certainly uses a great deal of power. Still, there’s no doubt that AI isn’t exactly efficient, and as detailed below, it is a significant contributor to increased energy use and its downstream effects.
“[…] Many people haven’t made the connection between data centers and public health. The power plants and backup generators needed to keep data centers working generate harmful air pollutants, such as fine particulate matter and nitrogen oxides (NOx). These pollutants take an immediate toll on human health, triggering asthma symptoms, heart attacks, and even cognitive decline.
According to our research, in 2023, air pollution attributed to U.S. data centers was responsible for an estimated $6 billion in public health damages. If the current AI growth trend continues, this number is projected to reach $10 to $20 billion per year by 2030, rivaling the impact of emissions from California’s 30 million vehicles.”
These costs need to be taken into account. It’s not that we should simply stop using technology, but we should endeavor to make the software, hardware, and infrastructure that support it much more efficient and much lower impact.
[Link]
Werd I/O © Ben Werdmuller. The text (without images) of this site is licensed under CC BY-NC-SA 4.0.