
Responding to the climate impact of generative AI

Explosive growth of AI data centers is expected to increase greenhouse gas emissions. Researchers are now seeking solutions to reduce these environmental harms.

Press Contact:

MIT Media Relations
Phone: 617-253-2700
A data center overlooking a green, sunny landscape
Caption: “We are on a path where the effects of climate change won’t be fully known until it is too late to do anything about it,” says Jennifer Turliuk MBA ’25, who is working to help policymakers, scientists, and enterprises consider the multifaceted costs and benefits of generative AI. “This is a once-in-a-lifetime opportunity to innovate and make AI systems less carbon-intense.”
Credit: iStock


In part 2 of our two-part series on generative artificial intelligence’s environmental impacts, MIT News explores some of the ways experts are working to reduce the technology’s carbon footprint.

The energy demands of generative AI are expected to continue increasing dramatically over the next decade.

For instance, an April 2025 report from the International Energy Agency predicts that the global electricity demand from data centers, which house the computing infrastructure to train and deploy AI models, will more than double by 2030, to around 945 terawatt-hours. While not all operations performed in a data center are AI-related, this total amount is slightly more than the energy consumption of Japan.

Moreover, an August 2025 analysis from Goldman Sachs Research forecasts that about 60 percent of the increasing electricity demands from data centers will be met by burning fossil fuels, increasing global carbon emissions by about 220 million tons. In comparison, driving a gas-powered car for 5,000 miles produces about 1 ton of carbon dioxide.

These statistics are staggering, but at the same time, scientists and engineers at MIT and around the world are studying innovations and interventions to mitigate AI’s ballooning carbon footprint, from boosting the efficiency of algorithms to rethinking the design of data centers.

Considering carbon emissions

Talk of reducing generative AI’s carbon footprint is typically centered on “operational carbon” — the emissions generated by the powerful processors, known as GPUs, inside a data center. It often ignores “embodied carbon,” the emissions created by building the data center in the first place, says Vijay Gadepally, senior scientist at MIT Lincoln Laboratory, who leads research projects in the Lincoln Laboratory Supercomputing Center.

Constructing and retrofitting a data center, built from tons of steel and concrete and filled with air conditioning units, computing hardware, and miles of cable, consumes a huge amount of carbon. In fact, the environmental impact of building data centers is one reason companies like Meta and Google are exploring more sustainable building materials. (Cost is another factor.)

Plus, data centers are enormous buildings — the world’s largest, the China Telecom-Inner Mongolia Information Park, engulfs roughly 10 million square feet — with about 10 to 50 times the energy density of a normal office building, Gadepally adds.

“The operational side is only part of the story. Some things we are working on to reduce operational emissions may lend themselves to reducing embodied carbon, too, but we need to do more on that front in the future,” he says.

Reducing operational carbon emissions

When it comes to reducing operational carbon emissions of AI data centers, there are many parallels with home energy-saving measures. For one, we can simply turn down the lights.

“Even if you have the worst lightbulbs in your house from an efficiency standpoint, turning them off or dimming them will always use less energy than leaving them running at full blast,” Gadepally says.

In the same fashion, research from the Supercomputing Center has shown that “turning down” the GPUs in a data center so they consume about three-tenths the energy has minimal impacts on the performance of AI models, while also making the hardware easier to cool.
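As a rough illustration of the idea, here is a minimal Python sketch of capping a GPU’s power limit through NVIDIA’s NVML management library (pynvml). The article does not describe the Supercomputing Center’s actual tooling, the 70 percent cap below is only an example, and setting a limit typically requires administrator privileges.

```python
# Minimal sketch (not the laboratory's tooling): cap a GPU's power draw
# using NVIDIA's NVML bindings. Requires the nvidia-ml-py package, and
# setting limits usually requires root privileges.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# The supported power-limit range, in milliwatts.
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

# Request a cap at 70 percent of the maximum (an illustrative setting;
# the right level depends on the workload), clamped to the hardware minimum.
target_mw = max(min_mw, int(max_mw * 0.7))
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)

print(f"Power limit set to {target_mw / 1000:.0f} W "
      f"(supported range {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")
pynvml.nvmlShutdown()
```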

Another strategy is to use less energy-intensive computing hardware.

Demanding generative AI workloads, such as training new reasoning models like GPT-5, usually need many GPUs working simultaneously. The Goldman Sachs analysis estimates that a state-of-the-art system could soon have as many as 576 connected GPUs operating at once.

But engineers can sometimes achieve similar results by reducing the precision of computing hardware, perhaps by switching to less powerful processors that have been tuned to handle a specific AI workload.
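One common form of reduced precision is running a model’s arithmetic in a smaller number format. Here is a minimal sketch, assuming PyTorch and a GPU that supports the bfloat16 format; the toy model below is hypothetical, and the article does not name specific frameworks or processors.

```python
# Illustrative sketch: reduced-precision inference in PyTorch. Running
# eligible operations in bfloat16 instead of float32 cuts memory traffic
# and energy use on supporting GPUs. The model here is a stand-in.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU(), nn.Linear(1024, 10))
x = torch.randn(8, 1024)

if torch.cuda.is_available():
    model, x = model.cuda(), x.cuda()
    with torch.no_grad(), torch.autocast(device_type="cuda", dtype=torch.bfloat16):
        y = model(x)
else:  # CPU fallback at full precision
    with torch.no_grad():
        y = model(x)

print(y.dtype, y.shape)
```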

There are also measures that boost the efficiency of training power-hungry deep-learning models before they are deployed.

Gadepally’s group found that about half the electricity used for training an AI model is spent to get the last 2 or 3 percentage points in accuracy. Stopping the training process early can save a lot of that energy.

“There might be cases where 70 percent accuracy is good enough for one particular application, like a recommender system for e-commerce,” he says.
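A minimal sketch of what such energy-aware early stopping might look like in code, with illustrative thresholds rather than values from the MIT study; `train_one_epoch` and `evaluate` are hypothetical caller-supplied functions.

```python
# Illustrative sketch: stop training once accuracy is "good enough" for
# the application, or once per-epoch gains flatten out, instead of
# spending the bulk of the energy budget on the last few points.

def train_with_early_stop(train_one_epoch, evaluate,
                          target_acc=0.70, min_gain=0.002,
                          patience=3, max_epochs=100):
    best_acc, stale_epochs = 0.0, 0
    for epoch in range(max_epochs):
        train_one_epoch()              # caller-supplied training step
        acc = evaluate()               # caller-supplied validation accuracy
        if acc >= target_acc:
            print(f"Epoch {epoch}: reached target accuracy {acc:.3f}; stopping.")
            return acc
        stale_epochs = 0 if acc - best_acc >= min_gain else stale_epochs + 1
        best_acc = max(best_acc, acc)
        if stale_epochs >= patience:
            print(f"Epoch {epoch}: gains below {min_gain:.1%} for "
                  f"{patience} epochs; stopping at {best_acc:.3f}.")
            return best_acc
    return best_acc

# Toy demo with a fake accuracy curve that plateaus below the target.
curve = iter([0.40, 0.55, 0.62, 0.655, 0.664, 0.6655, 0.666, 0.6662]
             + [0.6663] * 92)
train_with_early_stop(lambda: None, lambda: next(curve), target_acc=0.70)
```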

Researchers can also take advantage of efficiency-boosting measures.

For instance, a postdoc in the Supercomputing Center realized the group might run a thousand simulations during the training process to pick the two or three best AI models for their project.

By building a tool that allowed them to avoid about 80 percent of those wasted computing cycles, they dramatically reduced the energy demands of training with no reduction in model accuracy, Gadepally says.

Leveraging efficiency improvements

Constant innovation in computing hardware, such as denser arrays of transistors on semiconductor chips, is still enabling dramatic improvements in the energy efficiency of AI models.

Even though energy efficiency improvements have been slowing for most chips since about 2005, the amount of computation that GPUs can do per joule of energy has been improving by 50 to 60 percent each year, says Neil Thompson, director of the FutureTech Research Project at MIT’s Computer Science and Artificial Intelligence Laboratory and a principal investigator at MIT’s Initiative on the Digital Economy.

“The still-ongoing ‘Moore’s Law’ trend of getting more and more transistors on chip still matters for a lot of these AI systems, since running operations in parallel is still very valuable for improving efficiency,” says Thompson.

Even more significant, his group’s research indicates that efficiency gains from new model architectures, which can solve complex problems faster while consuming less energy to achieve the same or better results, are doubling every eight or nine months.

Thompson coined the term “negaflop” to describe this effect. Just as a “negawatt” represents electricity saved due to energy-saving measures, a “negaflop” is a computing operation that doesn’t need to be performed due to algorithmic improvements.

These could be things like “pruning” away unnecessary components of a neural network or employing compression techniques that enable users to do more with less computation.
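As an illustration of pruning, here is a minimal sketch using PyTorch’s built-in pruning utilities to zero out the smallest-magnitude weights of a toy model; the article does not prescribe a particular method, and the 30 percent sparsity level is only an example.

```python
# Illustrative sketch: magnitude pruning with torch.nn.utils.prune.
# Zeroed-out weights correspond to operations a sparsity-aware runtime
# can skip entirely ("negaflops," in Thompson's terminology).
import torch.nn as nn
from torch.nn.utils import prune

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

for module in model.modules():
    if isinstance(module, nn.Linear):
        # Zero the 30 percent of weights with the smallest absolute value.
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # make the zeros permanent

linears = [m for m in model.modules() if isinstance(m, nn.Linear)]
zeros = sum(int((m.weight == 0).sum()) for m in linears)
total = sum(m.weight.numel() for m in linears)
print(f"{zeros}/{total} weights pruned ({zeros / total:.0%})")
```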

“If you need to use a really powerful model today to complete your task, in just a few years, you might be able to use a significantly smaller model to do the same thing, which would carry much less environmental burden. Making these models more efficient is the single most important thing you can do to reduce the environmental costs of AI,” Thompson says.

Maximizing energy savings

While reducing the overall energy use of AI algorithms and computing hardware will cut greenhouse gas emissions, not all energy is the same, Gadepally adds.

“The amount of carbon emissions in 1 kilowatt hour varies quite significantly, even just during the day, as well as over the month and year,” he says.

Engineers can take advantage of these variations by leveraging the flexibility of AI workloads and data center operations to maximize emissions reductions. For instance, some generative AI workloads don’t need to be performed in their entirety at the same time.

Splitting computing operations so some are performed later, when more of the electricity fed into the grid is from renewable sources like solar and wind, can go a long way toward reducing a data center’s carbon footprint, says Deepjyoti Deka, a research scientist in the MIT Energy Initiative.
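A minimal sketch of this kind of carbon-aware scheduling, assuming a deferrable job and an hourly forecast of grid carbon intensity; the forecast numbers below are invented, and a real deployment would pull them from a grid-data service.

```python
# Illustrative sketch: pick the start hour that minimizes the total carbon
# emitted by a deferrable, contiguous job, given an hourly forecast of
# grid carbon intensity in grams of CO2 per kilowatt-hour.

def best_start_hour(forecast, job_hours):
    """Return the start index with the lowest summed carbon intensity."""
    candidates = range(len(forecast) - job_hours + 1)
    return min(candidates, key=lambda h: sum(forecast[h:h + job_hours]))

# Hypothetical 24-hour forecast: dirtier overnight, cleaner at midday
# when solar generation peaks.
forecast = [450, 460, 470, 465, 440, 400, 350, 300,
            240, 190, 160, 150, 155, 170, 210, 270,
            330, 390, 430, 455, 470, 475, 468, 455]

start = best_start_hour(forecast, job_hours=4)
avg = sum(forecast[start:start + 4]) / 4
print(f"Run the 4-hour job starting at hour {start} "
      f"(average {avg:.0f} gCO2/kWh)")
```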

Deka and his team are also studying “smarter” data centers where the AI workloads of multiple companies using the same computing equipment are flexibly adjusted to improve energy efficiency.

“By looking at the system as a whole, our hope is to minimize energy use as well as dependence on fossil fuels, while still maintaining reliability standards for AI companies and users,” Deka says.

He and others at MITEI are building a flexibility model of a data center that considers the differing energy demands of training a deep-learning model versus deploying that model. Their hope is to uncover the best strategies for scheduling and streamlining computing operations to improve energy efficiency.

The researchers are also exploring the use of long-duration energy storage units at data centers, which store excess energy for times when it is needed.

With these systems in place, a data center could use stored energy that was generated by renewable sources during a high-demand period, or avoid the use of diesel backup generators if there are fluctuations in the grid.

“Long-duration energy storage could be a game-changer here because we can design operations that really change the emission mix of the system to rely more on renewable energy,” Deka says.

In addition, researchers at MIT and Princeton University are developing a software tool for investment planning in the power sector, called GenX, which could be used to help companies determine the ideal place to locate a data center to minimize environmental impacts and costs.

Location can have a big impact on reducing a data center’s carbon footprint. For instance, Meta operates a data center in Lulea, a city on the coast of northern Sweden, where cooler temperatures reduce the amount of electricity needed to cool computing hardware.

Thinking farther outside the box (way farther), some governments are even exploring the construction of data centers on the moon, where they could potentially be operated with nearly all renewable energy.

AI-based solutions

Currently, the expansion of renewable energy generation here on Earth isn’t keeping pace with the rapid growth of AI, which is one major roadblock to reducing its carbon footprint, says Jennifer Turliuk MBA ’25, a short-term lecturer, former Sloan Fellow, and former practice leader of climate and energy AI at the Martin Trust Center for MIT Entrepreneurship.

The local, state, and federal review processes required for a new renewable energy project can take years.

Researchers at MIT and elsewhere are exploring the use of AI to speed up the process of connecting new renewable energy systems to the power grid.

For instance, a generative AI model could streamline interconnection studies that determine how a new project will impact the power grid, a step that often takes years to complete.

And when it comes to accelerating the development and implementation of clean energy technologies, AI could play a major role.

“Machine learning is great for tackling complex situations, and the electrical grid is said to be one of the largest and most complex machines in the world,” Turliuk adds.

For instance, AI could help optimize the prediction of solar and wind energy generation or identify ideal locations for new facilities.

It could also be used to perform predictive maintenance and fault detection for solar panels or other green energy infrastructure, or to monitor the capacity of transmission wires to maximize efficiency.

By helping researchers gather and analyze huge amounts of data, AI could also inform targeted policy interventions aimed at getting the biggest “bang for the buck” from areas such as renewable energy, Turliuk says.

To help policymakers, scientists, and enterprises consider the multifaceted costs and benefits of AI systems, she and her collaborators developed the Net Climate Impact Score.

The score is a framework that can be used to help determine the net climate impact of AI projects, considering emissions and other environmental costs along with potential environmental benefits in the future.

At the end of the day, the most effective solutions will likely result from collaborations among companies, regulators, and researchers, with academia leading the way, Turliuk adds.

“Every day counts. We are on a path where the effects of climate change won’t be fully known until it is too late to do anything about it. This is a once-in-a-lifetime opportunity to innovate and make AI systems less carbon-intense,” she says.
