
Communications of the ACM


The Carbon Footprint of Artificial Intelligence

hand on knob of CO2 scale

Credit: Olivier Le Moal

The growing utilization of artificial intelligence (AI) is apparent across all facets of society, from the models used to enable semi-autonomous cars, to models that serve up recommendations on streaming or e-commerce sites, to the language models used to create more natural, intuitive human-machine interaction. However, these technological achievements come with costs, namely the massive amounts of electrical power required to train AI algorithms, to build the hardware on which these algorithms run, and to operate and maintain that hardware throughout its life cycle.

The cost of the electricity is not the only impact; traditional power plants that use fossil fuels (as well as some geothermal processes) to create power emit relatively high amounts of carbon dioxide (CO2) as they generate electricity, compared with low-carbon energy sources such as solar, wind, or nuclear plants, which do not. That emitted CO2 has a direct impact on the environment. (See Andrew Chien's column on p. 5.)


Why It Is Difficult to Quantify Carbon Footprints

While all software has a carbon footprint (the amount of CO2 directly related to its use), large and complex AI models have a significant environmental cost and are increasingly coming under scrutiny. AI algorithms are trained and run in AI datacenters, which are responsible for a certain amount of CO2 emissions based on the amount of compute time, the type of graphics processing units (GPUs) used (different GPUs consume different amounts of energy, depending on data parallelism, memory use, and performance levels), the source of the electricity, and the amount of electricity used in the construction and deployment of the compute infrastructure. In addition to the electricity consumed and CO2 generated while developing or using an algorithm, the processes used to extract from the Earth the raw materials used in the hardware and datacenters on which AI algorithms run; the manufacturing processes used to build, assemble, transport, and install this hardware; and the cost of disposing of this hardware once it has exhausted its lifespan all contribute to an AI algorithm's overall carbon footprint.

"The safest way to consider AI's carbon footprint is simply to consider it as a subset of the overall ICT (information and communications technology) carbon footprint, which is broadly estimated to be between 1.8% and 2.8% as of 2020," says Matt Warburton, principal consultant and sustainability lead with global technology research and advisory firm ISG, who referenced a 2021 Patterns article that also noted that other estimates of ICT's carbon footprint range from 2.1% to 3.9% of the total share of greenhouse gas (GHG) emissions. "The impacts are therefore modest in comparison with more polluting industries such as manufacturing and transportation," Warburton says. The U.S. Environmental Protection Agency (EPA) notes that transportation (27%) and industry (24%) accounted for significant portions of total GHG emissions in 2020.

This aggregate approach does not consider each individual model's carbon footprint, which can be more difficult to accurately pin down, largely due to a lack of detailed data on the energy consumption of many large AI models. However, AI startup Hugging Face released a paper that calculated the carbon footprint of one of its own AI models, taking into consideration the emissions produced during the model's whole life cycle, rather than just during training.

The Hugging Face paper estimated overall emissions for its own large language model, BLOOM, which involved calculating or estimating a series of variables, including the amount of energy used to train the model on a supercomputer, the energy needed to manufacture the supercomputer's hardware and maintain its computing infrastructure, and the energy used to run BLOOM once it had been deployed. The researchers calculated that final part using a software tool called CodeCarbon, which tracked the CO2 emissions BLOOM produced in real time over a period of 18 days.

Hugging Face estimated BLOOM's training resulted in 25 metric tons of CO2 emissions, a figure that doubled to 50 metric tons of CO2 emissions when accounting for the emissions produced by the manufacturing of the computer equipment used for training, the broader computing infrastructure, and the energy required to run BLOOM once it was trained.

While it is possible to use faster or more efficient hardware to reduce the training time for AI models, the overall size and complexity of a model being trained has the greatest impact on the amount of CO2 emissions. For one, large models generally consume more electricity than smaller models because they are more complex and require a greater amount of compute training time. For example, GPT-1 (June 2018) includes about 0.12 billion parameters; GPT-2 (February 2019) includes about 1.5 billion; and GPT-3 (May 2020) includes about 175 billion, with the carbon impact rising as the complexity increases. However, this can be mitigated through several techniques.

The overall size and complexity of an AI model being trained has the greatest impact on the amount of CO2 emissions.

Compression, a technique that reduces the bit width of each parameter included in the model, can reduce the model's size and its energy consumption. Other techniques include data quantization and pruning (which removes redundant parameters and connections from a model), distillation (which trains a small model with the knowledge of a larger model to create a smaller, more efficient model), and transfer learning (in which a large model's training is jump-started by overlaying learnings from a smaller model). All these techniques were designed to reduce training time, which reduces the amount of electricity consumed.
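The bit-width reduction described above can be sketched in a few lines. The following toy example (not from the article; the layer size and random weights are illustrative) maps 32-bit float weights onto signed 8-bit integers plus a single scale factor, cutting storage to a quarter while keeping each weight within half a quantization step of its original value:

```python
import random

def quantize_int8(weights):
    """Map floats onto signed 8-bit integers (symmetric quantization).

    Returns the integer codes plus the scale needed to recover
    approximate float values: w ~= q * scale.
    """
    scale = max(abs(w) for w in weights) / 127.0
    return [round(w / scale) for w in weights], scale

random.seed(0)
weights = [random.gauss(0.0, 1.0) for _ in range(1000)]  # toy "layer"
codes, scale = quantize_int8(weights)

# Stored at 8 bits instead of 32, the layer shrinks to a quarter of
# its size; rounding error is bounded by half a quantization step.
max_err = max(abs(w - q * scale) for w, q in zip(weights, codes))
assert all(-128 <= q <= 127 for q in codes)
assert max_err <= scale / 2 + 1e-12
```

In practice frameworks apply this per-layer or per-channel and may fine-tune afterward to recover accuracy, but the energy argument is the same: fewer bits mean less memory traffic and cheaper arithmetic.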

The biggest factor in determining CO2 emission levels is the source of the electricity. Hugging Face's BLOOM model was trained on a French supercomputer that is mostly powered by nuclear energy, which does not produce carbon dioxide emissions. Models that are trained in regions where the energy grids largely rely on fossil fuels are likely to be far more polluting and will have much larger carbon footprints. In comparison, OpenAI's GPT-3 and Meta's OPT algorithms were estimated to emit more than 500 and 75 metric tons of carbon dioxide, respectively, during training (although GPT-3's far higher CO2 emissions can be partially explained by the fact the algorithm was trained on older, less-efficient hardware).

Other researchers also have published papers highlighting the relatively large carbon footprints of AI models. The Allen Institute for AI and Microsoft, working with colleagues from Hebrew University, Carnegie Mellon University, and Hugging Face, measured the operational carbon emissions of Azure AI workloads by multiplying the energy consumption of a cloud workload by the carbon intensity of the grid that powers the datacenter. The researchers found a six-billion-parameter language model emitted more CO2 than the average U.S. home does in a year, even though it was trained for only 13% of the time it would take to reach full training capacity. If the model were trained to completion, the Allen Institute for AI estimates, a full training run would "emit about an order of magnitude more emissions." The research community, along with commercial AI developers, is more focused on performance than on sustainability, which worries some industry observers.
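The accounting method described above (energy consumption multiplied by grid carbon intensity) lends itself to a back-of-envelope estimate. The figures below are all hypothetical, chosen only to show the arithmetic, not drawn from any of the studies mentioned:

```python
def training_emissions_kg(gpu_count, gpu_power_watts, hours,
                          pue, grid_intensity_kg_per_kwh):
    """Back-of-envelope operational CO2 for a training run: energy
    drawn by the GPUs, inflated by the datacenter's power usage
    effectiveness (PUE), times the grid's carbon intensity.
    """
    energy_kwh = gpu_count * gpu_power_watts * hours / 1000.0
    return energy_kwh * pue * grid_intensity_kg_per_kwh

# Hypothetical run: 64 GPUs at 300 W for 1,000 hours, PUE of 1.2,
# on a grid emitting 0.4 kg CO2 per kWh (all figures illustrative).
kg = training_emissions_kg(64, 300, 1000, 1.2, 0.4)
print(round(kg / 1000, 1), "metric tons of CO2")  # -> 9.2 metric tons of CO2
```

Note that this captures only operational emissions; as the BLOOM study showed, embodied emissions from manufacturing the hardware can roughly double the total.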

"When AI developers create new systems, they're trying to push performance, accuracy, or the model's capabilities, and they focus less on efficiency," says Jesse Dodge, a research scientist at the Allen Institute for AI. "With some of the more recent models that have come out of for-profit AI research and development companies, there is a purposeful lack of transparency. They don't want to tell the public about what they've done to train those models and the number of GPU hours that have gone into them, because that provides them a competitive advantage."

Dodge says the biggest impediment to understanding AI's carbon footprint is this lack of transparency, which he believes will require some sort of legislation around openness or emission reduction to overcome. "I think it's going to take a bit of time, but I do expect we will see some regulation at some point," Dodge says. "The European AI Act considered including information like requirements on transparency around CO2 emissions, but [that] did not end up in the final published bill."

In addition, other than the scientific research community, there appears to be a lack of general awareness of the impact of AI on carbon emissions. "I think that people just see AI to be this intangible thing that's often in the cloud," says Sasha Luccioni, a research scientist at Hugging Face. "I think it's like a cognitive bias aspect as well; 'Siri doesn't have a physical form, so how could it have an environmental footprint?'"

Luccioni highlights a key difficulty in assessing an algorithm's true footprint, noting that beyond the opacity of data scientists, the suppliers of computing infrastructure, including GPU manufacturers, datacenter hardware providers, and all their component suppliers, tend to pass the buck on CO2 emission responsibility.

"I've spoken to people [at hardware vendors] who say, 'Well, you know, we've got thousands of suppliers making hundreds of products'," Luccioni says. "So it's really hard for them to track down each one and say, 'Where's your sustainability report?'"

Dodge says the biggest impediment to understanding AI's carbon footprint is the lack of transparency of for-profit AI development companies.

Beyond tracking or calculating the carbon footprint of AI models, the experts interviewed for this story agreed that it is more important to consider the steps that should be taken to reduce whatever impact AI will have on the environment.

Buying carbon offset credits, while useful in reducing a model's carbon footprint on paper, does not truly reduce the amount of carbon emitted, Luccioni says, noting that, "if you're using a very carbon-intensive energy center to train a model for two million hours and then you're buying a bunch of renewable energy credits, it's not quite the same."

Warburton says the end-use of an AI model, along with whether it is focused on sustainability and the public good, should be considered before building and training a model. "Assessing the sustainability of AI should be considered in context of the outcome it produces," Warburton says. "AI that accelerates discovery of chemical compounds that halve emissions of concrete, without unacceptable compromises, is clearly more sustainable than using AI to mine cryptocurrencies."

Dodge agrees, noting that some of the big oil companies use AI systems to increase their rates of oil extraction. "You can train it as efficiently as you want," Dodge says, but "if you're then using that AI to extract more oil from the ground, then that's going to lead to more emissions in the long run."

The solutions that are perhaps easiest to enact are choosing to train an algorithm on a cloud service that is in a region with low carbon intensity, and right-sizing the number of parameters used in the algorithm. Luccioni says some AI developers will simply choose to train their models at the most powerful datacenters available, even if their model does not need that kind of compute horsepower. Many of these datacenters are in the U.S. and other countries that rely heavily on fossil fuels for power generation, and as such, will result in a model with a large carbon footprint.

"Currently, people don't necessarily think about that when they're launching like a training job," Luccioni says. In addition, she says, "there is some research that shows that the bigger the algorithm is, the better it is, but that's not conclusive. It's not like adding a billion parameters is going to add a 1% improvement in accuracy."

Dodge concurs, noting that in his research, a lot of the CO2 emissions were calculated from the electricity consumed in training the model. "Choosing the region that you're training your model in or putting your model into production can have a pretty big impact," Dodge says. "We found that training in the least carbon-intense region, compared with the most carbon-intense region, could reduce your emissions by about two-thirds, to one-third of what the full emissions would've been."
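The region-selection point Dodge makes can be made concrete: for a fixed workload, emissions scale linearly with the grid's carbon intensity, so only the choice of region changes the result. The intensity figures below are made-up round numbers for illustration; real values vary by grid, hour, and year:

```python
# Illustrative grid carbon intensities in kg CO2 per kWh
# (made-up round numbers; real values vary by hour and year).
GRID_INTENSITY = {
    "coal-heavy region": 0.80,
    "mixed-fuel region": 0.40,
    "hydro/nuclear region": 0.05,
}

def emissions_by_region(energy_kwh):
    """Same workload, same energy draw: only the grid changes."""
    return {region: energy_kwh * kg for region, kg in GRID_INTENSITY.items()}

for region, kg in emissions_by_region(10_000).items():
    print(f"{region}: {kg:,.0f} kg CO2")
```

With these illustrative numbers, the same 10,000-kWh job emits 8,000 kg of CO2 on the coal-heavy grid but only 500 kg on the hydro/nuclear one, which is why region choice is among the cheapest levers available to developers.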

* Further Reading

Luccioni, A.S. et al.
Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model. November 2022.

Freitag, C. et al.
The real climate and transformative impact of ICT: A critique of estimates, trends, and regulations. Patterns, Volume 2, Issue 9, September 2021.

Buchanan, W. and Dodge, J.
Measuring and Mitigating Carbon Intensity. Allen Institute for AI, June 13, 2022.

Video: Chip companies are more concerned with raw speed than carbon footprint.



Keith Kirkpatrick is principal of 4K Research & Consulting, LLC, based in New York, NY, USA.

©2023 ACM  0001-0782/23/8

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and full citation on the first page. Copyright for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or fee. Request permission to publish from or fax (212) 869-0481.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2023 ACM, Inc.

