Power, Constrained
Irina Raicu is the director of the Internet Ethics program at the Markkula Center for Applied Ethics. Views are her own.
Among the earliest actions of his new term, President Trump pulled the United States out of the Paris climate agreement (though the withdrawal will not take effect until January 2026), rescinded the Biden administration’s executive order on AI, and announced (or re-announced, since there had been a similar announcement earlier) a major initiative aimed at building new AI-related infrastructure—primarily data centers, and the power plants to supply them with energy.
At the same time, industry analysts predicted that “40% of existing AI data centers will be operationally constrained by power availability by 2027” and that “short-term power shortages are likely to continue for years as new power transmission, distribution and generation capacity could take years to come online and won’t alleviate current problems.”
At a November conference on AI and the environment co-hosted by the ethics center last year, researchers noted that current efforts to expand clean energy capacity are not sufficient to meet the massively growing demand. Investment in such efforts is much needed, but it doesn’t match the investment being poured into AI models and tools themselves.
Building massive new data centers, which require ever more energy at a time when “surging demand”—as one recent report put it—is “forcing suppliers to increase production by any means possible,” will add to the challenges already posed by climate change.
These decisions will also have negative consequences for researchers who are attempting to assess the environmental impact of AI development and deployment—as well as for broader efforts to track climate change.
Much has been written about the role that climate change has played in the massively destructive recent L.A. fires (some of which continue to burn). And climate change, of course, doesn’t impact only people. At least one Pasadena animal shelter was affected as well. If the danger to all living creatures isn’t incentive enough, won’t somebody think of the data centers?
Building massive new data centers, and the power plants to provide them with energy, while ignoring the climate change implications of such development and pushing back against disclosure requirements, is remarkably short-sighted; it relies on the still-limited public awareness of the consequences of those choices for all of us.
In September, Governor Newsom’s office published an announcement touting California’s leadership in both AI development and AI regulation. The announcement noted that the governor
signed legislation requiring California’s Office of Emergency Services to expand their work assessing the potential threats posed by the use of GenAI to California’s critical infrastructure, including those that could lead to mass casualty events…. At the Governor’s direction, Cal OES is working with frontier model companies to analyze energy infrastructure risks and convened power sector providers to share threats and security strategies.
Two days after rescinding the Biden executive order, President Trump signed a new executive order on AI, which states that “It is the policy of the United States to sustain and enhance America's global AI dominance in order to promote human flourishing, economic competitiveness, and national security.” The order references a definition of “artificial intelligence,” but does not offer a definition of “human flourishing.”
The November conference held at Santa Clara University was titled “AI and the Environment: Sustaining the Common Good.” What does leadership look like, in both government and business, if the goal is optimizing for the common good?
Image: Catherine Breslin / Better Images of AI / Silicon Closeup / CC-BY 4.0