Responding to the climate impact of generative AI | MIT News

In part 2 of our two-part series on generative artificial intelligence's environmental impact, MIT News examines some of the ways experts are working to reduce the technology's carbon footprint.

The energy demands of generative artificial intelligence are expected to continue growing significantly over the next decade.

For example, an April 2025 report from the International Energy Agency projects that global electricity demand from data centers, which house the computing infrastructure used to train and deploy AI models, will more than double by 2030, to about 945 terawatt-hours. While not all data center operations involve AI, that total is slightly more than the entire energy consumption of Japan.

In addition, an August 2025 analysis from Goldman Sachs Research forecasts that about 60 percent of the growing electricity demand from data centers will be met by burning fossil fuels, increasing global carbon dioxide emissions by about 220 million tons. By comparison, driving a gas-powered car for 5,000 miles produces about 1 ton of carbon dioxide.

These statistics are staggering, but at the same time, scientists and engineers at MIT and around the world are studying innovations and interventions to mitigate AI's ballooning carbon footprint, from boosting the efficiency of algorithms to rethinking the design of data centers.

Considering carbon dioxide emissions

Conversation about reducing generative AI's carbon footprint is typically centered on "operational carbon," the emissions from the powerful processors, known as GPUs, inside a data center. It often ignores "embodied carbon," which is the emissions created by building the data center in the first place, says Vijay Gadepally, senior scientist at MIT Lincoln Laboratory, who leads research projects in the Lincoln Laboratory Supercomputing Center.

Constructing and retrofitting a data center, built from tons of steel and concrete and filled with air conditioning units, computing hardware, and miles of cable, consumes a huge amount of carbon. In fact, the environmental impact of building data centers is one reason companies like Meta and Google are exploring more sustainable building materials. (Cost is another factor.)

In addition, data centers are enormous buildings. The world's largest, the China Telecom-Inner Mongolia Information Park, engulfs about 10 million square feet. And a data center typically has about 10 to 50 times the energy density of a normal office building, Gadepally adds.

"The operational side is only part of the story. Some things we are working on to reduce operational emissions may lend themselves to reducing embodied carbon as well, but we need to do more on that front in the future," he says.

Reducing operational carbon emissions

When it comes to reducing the operational carbon emissions of AI data centers, there are many parallels with home energy-saving measures. For starters, we can simply turn down the lights.

"Even if you have the worst lightbulbs in your house from an efficiency standpoint, turning them off or dimming them will always use less energy than leaving them running at full blast," Gadepally says.

In the same fashion, research from the Supercomputing Center has shown that "turning down" the GPUs in a data center, so they consume about three-tenths less energy, has minimal impacts on the performance of AI models, while also making the hardware easier to cool.
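As a rough illustration of that lever, the sketch below caps a GPU's power limit through NVIDIA's management library (pynvml, the interface behind nvidia-smi). The 30 percent reduction is an illustrative figure, not the lab's tuned setting, and changing power limits typically requires administrator privileges.

```python
# Sketch: cap GPU power draw via NVML. Requires the nvidia-ml-py package,
# an NVIDIA GPU, and admin rights; the ~30% cut below is illustrative only.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

    target_mw = max(min_mw, int(default_mw * 0.7))  # cut draw by roughly 30%
    pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
    print(f"Power limit set to {target_mw / 1000:.0f} W "
          f"(default {default_mw / 1000:.0f} W)")
finally:
    pynvml.nvmlShutdown()
```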

Another strategy is to use less energy-intensive computing hardware.

Demanding generative AI workloads, such as training new reasoning models like GPT-5, usually require many GPUs working simultaneously. The Goldman Sachs analysis estimates that a state-of-the-art system could soon have as many as 576 connected GPUs operating at once.

But engineers can sometimes achieve similar results by reducing the precision of computing hardware, perhaps by switching to less powerful processors that have been tuned to handle a specific AI workload.
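One common form of precision reduction can be sketched with PyTorch's autocast, which runs matrix operations in 16-bit rather than 32-bit arithmetic. The model and batch below are stand-ins, and bfloat16 support depends on the GPU; this is a minimal sketch, not a tuned deployment.

```python
# Sketch: run inference in reduced precision. Assumes a CUDA-capable GPU
# with bfloat16 support; "model" and "batch" are placeholder stand-ins.
import torch

model = torch.nn.Linear(512, 10).cuda().eval()   # stand-in for a real model
batch = torch.randn(32, 512, device="cuda")

with torch.no_grad(), torch.autocast(device_type="cuda", dtype=torch.bfloat16):
    logits = model(batch)   # matmuls execute in 16-bit, cutting energy per op
```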

There are also measures that boost the efficiency of deep-learning models before they are even deployed.

Gadepally's group found that about half the electricity used for training an AI model is spent getting the last 2 or 3 percentage points of accuracy. Stopping the training process early can save much of that energy.

"There might be cases where 70 percent accuracy is good enough for one particular application, like a recommender system for e-commerce," he says.
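As a sketch of that idea, the loop below halts training as soon as a "good enough" validation accuracy is reached; the 70 percent target echoes the e-commerce example above. The accuracy curve is simulated purely so the snippet runs as written.

```python
# Sketch: stop training once accuracy is "good enough" instead of spending
# half the energy budget chasing the last 2-3 percentage points.
import math

TARGET_ACCURACY = 0.70   # e.g., sufficient for an e-commerce recommender
MAX_EPOCHS = 100

def simulated_val_accuracy(epoch: int) -> float:
    """Saturating curve standing in for a real evaluation pass."""
    return 0.85 * (1 - math.exp(-0.15 * epoch))

for epoch in range(1, MAX_EPOCHS + 1):
    accuracy = simulated_val_accuracy(epoch)   # replace with a real evaluate()
    if accuracy >= TARGET_ACCURACY:
        print(f"Early stop at epoch {epoch}: {accuracy:.1%} meets the target")
        break
```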

Researchers can also take advantage of other efficiency-boosting measures.

For instance, a postdoc in the Supercomputing Center realized the group might run a thousand simulations during the training process, only to use the two or three best AI models for their project.

By building a tool that allowed them to avoid about 80 percent of those wasted computing cycles, they dramatically reduced the energy demands of training with no reduction in model accuracy, Gadepally says.
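The article doesn't describe that tool's internals, but a successive-halving loop captures the spirit of the technique: terminate underperforming candidates early and spend full training budgets only on the survivors. The scoring function here is simulated so the sketch is runnable.

```python
# Sketch: successive halving over candidate configurations, cutting weak
# runs early. partial_score simulates "train briefly, then evaluate".
import random

def partial_score(config: float, budget: int) -> float:
    """Stand-in for training `config` for `budget` steps and evaluating it."""
    return config + random.gauss(0, 0.05 / budget)

candidates = [random.random() for _ in range(1000)]   # 1,000 configurations
budget = 1

while len(candidates) > 3:
    scored = sorted(candidates, key=lambda c: partial_score(c, budget),
                    reverse=True)
    candidates = scored[: max(3, len(scored) // 4)]   # keep the top quarter
    budget *= 4                                       # train survivors longer

print(f"Finalists: {[round(c, 3) for c in candidates]}")
```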

Leveraging efficiency improvements

Continued innovation in computing hardware, such as denser arrays of transistors on semiconductor chips, is still enabling dramatic improvements in the energy efficiency of AI models.

Even though energy efficiency improvements have been slowing for most chips since about 2005, the amount of computation GPUs can perform per joule of energy has been improving by 50 to 60 percent each year, says Neil Thompson, director of the FutureTech Research Project at MIT's Computer Science and Artificial Intelligence Laboratory.
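To put that rate in perspective, a 50 to 60 percent annual gain means compute per joule doubles roughly every year and a half, as this quick back-of-the-envelope calculation shows.

```python
# Sketch: doubling time implied by 50-60% annual compute-per-joule gains.
import math

for annual_gain in (0.50, 0.60):
    doubling_years = math.log(2) / math.log(1 + annual_gain)
    print(f"{annual_gain:.0%}/year -> doubles every {doubling_years:.1f} years")
```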

"The still-ongoing 'Moore's Law' trend of getting more and more transistors onto a chip still matters for many of these AI systems, since running operations in parallel is still very valuable for improving efficiency," Thompson says.

Even more significant, his group's research indicates that efficiency gains from new model architectures, which can solve complex problems faster while consuming less energy to achieve the same or better results, are doubling every eight or nine months.

Thompson coined the term "negaflop" to describe this effect. The same way a "negawatt" represents electricity saved due to energy-saving measures, a "negaflop" is a computing operation that doesn't need to be performed due to algorithmic improvements.

These could be things like "pruning" away unnecessary components of a neural network or employing compression techniques that enable users to do more with less computation.
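As one concrete illustration, PyTorch ships utilities for exactly this kind of magnitude pruning. The 60 percent sparsity level below is an arbitrary illustrative choice, not a recommendation from the researchers; real deployments tune it per layer.

```python
# Sketch: L1 magnitude pruning with PyTorch's built-in utilities.
import torch
import torch.nn.utils.prune as prune

layer = torch.nn.Linear(1024, 1024)               # stand-in for a real layer
prune.l1_unstructured(layer, name="weight", amount=0.6)  # zero smallest 60%
prune.remove(layer, "weight")                     # make the pruning permanent

sparsity = (layer.weight == 0).float().mean().item()
print(f"Layer sparsity: {sparsity:.0%}")  # ops on zeros can be skipped: negaflops
```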

"If you need a really powerful model today to complete your task, in just a few years you might be able to use a significantly smaller model to do the same thing, which would carry much less environmental burden. Making these models more efficient is the single most important thing you can do to reduce the environmental costs of AI," Thompson says.

Maximizing energy savings

While reducing the overall energy use of AI algorithms and computing hardware will cut greenhouse gas emissions, not all energy is the same, Gadepally adds.

"The amount of carbon emissions in 1 kilowatt-hour varies quite significantly, even just during the day, as well as over the month and the year," he says.

Engineers can take advantage of these variations by leveraging the flexibility of AI workloads and data center operations to maximize emissions reductions. For instance, some generative AI workloads don't need to be performed in their entirety at the same time.

Splitting computing operations so that some are performed later, when more of the electricity fed into the grid comes from renewable sources such as solar and wind, can go a long way toward reducing a data center's carbon footprint, says Deepjyoti Deka, a research scientist at the MIT Energy Initiative.
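A minimal sketch of that kind of carbon-aware scheduling picks the lowest-intensity window for a deferrable job. The hourly intensity numbers below are invented for illustration; a real scheduler would pull them from a grid operator or a service such as Electricity Maps.

```python
# Sketch: schedule a deferrable 4-hour AI job into the day's lowest-carbon
# window. Hourly grid intensities (gCO2/kWh) are made-up illustrative values.
hourly_intensity = [420, 410, 400, 390, 380, 350, 300, 250,   # midnight-8am
                    200, 170, 150, 140, 145, 160, 200, 260,   # solar midday dip
                    330, 400, 450, 470, 460, 450, 440, 430]   # evening peak

JOB_HOURS = 4   # a training job that needs 4 contiguous hours

best_start = min(
    range(24 - JOB_HOURS + 1),
    key=lambda h: sum(hourly_intensity[h : h + JOB_HOURS]),
)
print(f"Run the job starting at hour {best_start}:00")
```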

Deka and his team are also studying "smarter" data centers where the AI workloads of multiple companies using the same computing equipment are flexibly adjusted to improve energy efficiency.

"By looking at the system as a whole, our hope is to minimize energy use as well as dependence on fossil fuels, while still maintaining reliability standards for AI companies and users," Deka says.

He and others at MITEI are building a flexibility model of a data center that considers the differing energy demands of training a deep-learning model versus deploying that model. Their hope is to uncover the best strategies for scheduling and streamlining computing operations to improve energy efficiency.

The researchers are also exploring the use of long-duration energy storage units at data centers, which store excess energy for times when it is needed.

With these systems in place, a data center could use stored energy that was generated by renewable sources during a high-demand period, or avoid the use of diesel backup generators if there are fluctuations in the grid.

"Long-duration energy storage could be a game-changer here, because we can design operations that really change the emissions mix of the system to rely more on renewable energy," Deka says.
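A toy dispatch loop illustrates the idea under stated assumptions: charge the storage unit when grid carbon intensity is low and serve the data center's load from storage when it is high. The capacities, threshold, and flat load below are all made-up simplifications.

```python
# Sketch: greedy storage dispatch against hourly grid carbon intensity.
# All numbers are illustrative assumptions, not measured values.
CAPACITY_KWH = 500.0
THRESHOLD = 300.0   # gCO2/kWh: charge below this, discharge above it
DEMAND_KWH = 100.0  # flat hourly data-center load, for simplicity

stored = 0.0
for hour, intensity in enumerate([380, 350, 290, 220, 180, 240, 330, 410]):
    if intensity < THRESHOLD and stored < CAPACITY_KWH:
        stored = min(CAPACITY_KWH, stored + DEMAND_KWH)  # charge from clean grid
        source = "grid (charging storage)"
    elif intensity >= THRESHOLD and stored >= DEMAND_KWH:
        stored -= DEMAND_KWH                             # serve load from storage
        source = "storage"
    else:
        source = "grid"
    print(f"hour {hour}: {intensity} gCO2/kWh -> load served by {source}")
```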

In addition, researchers at MIT and Princeton University are developing a software tool for investment planning in the power sector, called GenX, which could be used to help companies determine the ideal place to locate a data center to minimize environmental impacts and costs.

Location can have a big impact on reducing a data center's carbon footprint. For instance, Meta operates a data center in Luleå, a city on the coast of northern Sweden where cooler temperatures reduce the amount of electricity needed to cool computing hardware.

Thinking even farther outside the box (much farther), some governments are exploring the construction of data centers on the moon, where they could potentially be operated with nearly all renewable energy.

AI-based solutions

Currently, the expansion of renewable energy generation here on Earth is not keeping pace with AI's rapid growth, which is one major roadblock to reducing its carbon footprint, says Jennifer Turliuk MBA '25, a short-term lecturer, former Sloan Fellow, and former climate and energy AI practice leader at the Martin Trust Center for MIT Entrepreneurship.

Local, state and federal review processes required for new renewable energy projects can take years.

Researchers at MIT and elsewhere are exploring the use of artificial intelligence to speed up the process of connecting new renewable energy systems to the power grid.

For instance, a generative AI model could streamline interconnection studies that determine how a new project will impact the power grid, a step that often takes years to complete.

And when it comes to accelerating the development and deployment of clean energy technologies, AI could play a major role.

"Machine learning is great for tackling complex situations, and the electric grid is one of the largest and most complex machines in the world," Turliuk adds.

For instance, AI could help optimize the forecasting of solar and wind energy generation, or identify ideal locations for new facilities.

It could also be used to perform predictive maintenance and fault detection for solar panels and other green energy infrastructure, or to monitor the capacity of transmission wires to maximize efficiency.

By helping researchers gather and analyze huge amounts of data, AI could also inform targeted policy interventions aimed at getting the biggest "bang for the buck" from areas such as renewable energy, Turliuk says.

To help policymakers, scientists, and enterprises consider the multifaceted costs and benefits of AI systems, she and her collaborators developed the Net Climate Impact Score.

The score is a framework that can be used to help determine the net climate impact of AI projects, considering emissions and other environmental costs along with potential future environmental benefits.
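The article doesn't spell out the score's methodology, so the sketch below only illustrates the general netting idea, offsetting estimated climate costs against estimated benefits, with entirely made-up numbers. It is not the actual Net Climate Impact Score formula.

```python
# Sketch: netting illustrative climate costs against benefits for an AI
# project (all values invented, in tons of CO2-equivalent per year).
costs = {
    "training_emissions": 120.0,
    "inference_emissions": 300.0,
    "embodied_hardware": 80.0,
}
benefits = {
    "grid_optimization_savings": 450.0,
    "avoided_maintenance_trips": 60.0,
}

net_impact = sum(benefits.values()) - sum(costs.values())
verdict = "net positive" if net_impact > 0 else "net negative"
print(f"Net climate impact: {net_impact:+.0f} tCO2e/year ({verdict})")
```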

At the end of the day, the most effective solutions will likely result from collaborations among companies, regulators, and researchers, with academia leading the way, Turliuk adds.

"Every day counts. We are on a path where the effects of climate change won't be fully known until it is too late to do anything about them. This is a once-in-a-lifetime opportunity to innovate and make AI systems less carbon-intensive," she says.
