
Wednesday, June 5, 2024

AI/Machine Learning is the New Impending Energy Consumer: How Much Will It Affect Electricity Demand?

     Artificial Intelligence and machine learning are here to stay, whether we like it or not, simply because they are useful in many ways, and new uses are still being found. Machine learning can quickly find hidden trends in data, pointing scientists, engineers, and others to important relationships and discoveries. AI and ML can diagnose diseases, decipher and translate ancient languages, optimize energy and mineral exploration, and streamline manufacturing and industry. AI/ML can make art, animation, and precision devices. It can edit text and summarize articles. Some even claim it has a 99.9% chance of destroying humanity; I’ve got my doubts about that!

 

 

A 2022 Study of Google’s Data Center Machine Learning Training Energy Use and Emissions

 

     Around 2020 or a bit before, researchers began estimating the energy use and carbon footprints of AI and ML in earnest. We already knew that the data centers required would be big energy users and emitters. In a 2022 paper in Computer, ‘The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink’, researchers set out to model the carbon footprint of machine learning training and find ways to minimize it. Training the models, a necessary and central step, uses the most energy. The paper classifies ML emissions as follows:

 

Operational, the energy cost of operating the ML hardware including data center overheads, or

Lifecycle, which additionally includes the embedded carbon emitted during the manufacturing of all components involved, from chips to data center buildings.

 

The paper focuses on operational emissions, and in particular on Google’s machine-learning data centers. In a larger scope, AI/ML emissions are part of Information and Communications Technology (ICT) emissions, which I wrote about in 2016 here. The paper identified four best practices that contribute to lower emissions, which the authors call the four M’s: model, machine, mechanization, and map, explained as follows:

 

1. Model. Selecting efficient ML model architectures while advancing ML quality, such as sparse models versus dense models, can reduce computation by factors of ~5–10.

 

 2. Machine. Using processors optimized for ML training such as TPUs or recent GPUs (e.g., V100 or A100), versus general-purpose processors, can improve performance/Watt by factors of 2–5.

 

 3. Mechanization. Computing in the Cloud rather than on premise improves datacenter energy efficiency, reducing energy costs by a factor of 1.4–2.

 

 4. Map. Moreover, Cloud computing lets ML practitioners pick the location with the cleanest energy, further reducing the gross carbon footprint by factors of 5–10.
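These four factors compound multiplicatively. A minimal back-of-envelope sketch in Python, assuming the ranges quoted above (with the map factor taken as 5–10x), shows how they can combine into very large overall reductions:

```python
# Back-of-envelope sketch: the four "M" improvement factors compound
# multiplicatively. Ranges follow the list above; the products are
# illustrative, not the paper's measured results.
model = (5, 10)           # efficient architectures (sparse vs. dense models)
machine = (2, 5)          # ML-optimized processors (TPUs, recent GPUs)
mechanization = (1.4, 2)  # cloud datacenters vs. on-premise
map_siting = (5, 10)      # choosing locations with the cleanest energy

# The first three factors reduce energy use...
energy_low = model[0] * machine[0] * mechanization[0]    # 14x
energy_high = model[1] * machine[1] * mechanization[1]   # 100x

# ...while siting (map) further reduces carbon, not energy.
carbon_low = energy_low * map_siting[0]                  # 70x
carbon_high = energy_high * map_siting[1]                # 1000x

print(f"Energy reduction: {energy_low:.0f}x to {energy_high:.0f}x")
print(f"Carbon reduction: {carbon_low:.0f}x to {carbon_high:.0f}x")
```

Google’s reported 83x energy and 747x emissions reductions fall within these illustrative ranges.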

 

The following graph from the paper shows how Google applied these best practices to reduce energy consumption by a factor of 83 and CO2 emissions by a factor of 747!
[Figure from the paper: reductions in energy use and CO2e from applying the four M’s at Google.]

     The standard metric of data center efficiency is Power Usage Effectiveness (PUE), which is “a ratio that describes how efficiently a computer data center uses energy; specifically, how much energy is used by the computing equipment (in contrast to cooling and other overhead that supports the equipment).” PUE became a global standard in 2016. Some issues with PUE can complicate comparing one facility to another, such as local climate and the completeness of the energy accounting (i.e., not omitting something like lighting). An ideal PUE is 1. The basic formula for PUE is as follows:

PUE = Total Facility Energy / IT Equipment Energy

     “The average industry datacenter PUE in 2020 was 1.58 (58% overhead) while cloud providers have PUEs of ~1.10.”

     “The average datacenter carbon emissions in 2020 was 0.429 tCO2e per MWh but the gross CO2e per MWh can be 5x lower in some Google datacenters.”
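As a concrete illustration of the PUE arithmetic, here is a minimal sketch with invented facility numbers chosen to reproduce the 2020 industry average:

```python
def pue(total_facility_mwh: float, it_equipment_mwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by the
    energy delivered to the computing equipment. The ideal value is 1."""
    return total_facility_mwh / it_equipment_mwh

# Hypothetical facility: 1,580 MWh total, 1,000 MWh to the IT equipment.
ratio = pue(1580, 1000)       # 1.58, the 2020 industry average
overhead = (ratio - 1) * 100  # 58% spent on cooling, lighting, etc.
print(f"PUE = {ratio:.2f} ({overhead:.0f}% overhead)")
```

By the same arithmetic, a cloud provider at PUE ~1.10 spends only about 10% of its energy on overhead.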

The number of processors running and the time they are running to perform the training tasks make up the bulk of energy use. This is calculated as follows:

MWh = Hours to train x Number of Processors x Average Power per Processor x PUE

tCO2e = MWh x tCO2e per MWh
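Plugging hypothetical numbers into these formulas (the run size, power draw, and PUE below are invented for illustration; 0.429 tCO2e/MWh is the 2020 average quoted above) gives a feel for the magnitudes:

```python
def training_energy_mwh(hours, num_processors, avg_power_w, pue):
    """Energy for a training run, including datacenter overhead (PUE)."""
    return hours * num_processors * avg_power_w * pue / 1e6  # Wh -> MWh

def training_emissions_tco2e(mwh, tco2e_per_mwh):
    """Operational CO2e for the run, given the grid's carbon intensity."""
    return mwh * tco2e_per_mwh

# Hypothetical run: 1,000 accelerators at 300 W average for 240 hours,
# in a cloud datacenter with PUE 1.10, on a grid at 0.429 tCO2e/MWh.
mwh = training_energy_mwh(hours=240, num_processors=1000,
                          avg_power_w=300, pue=1.10)
tco2e = training_emissions_tco2e(mwh, 0.429)
print(f"{mwh:.1f} MWh, {tco2e:.1f} tCO2e")
```

Such a run would use about 79 MWh and emit about 34 tCO2e on this hypothetical grid; a cleaner grid scales the second number down directly.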

The development of machine learning is ongoing and energy efficiency is improving in many areas. Thus, it is reasonable to assume that energy use per unit of work will continue to drop. Big Tech companies that run cloud services and data centers, like Google, Amazon, Microsoft, and Meta, are the main players. The paper notes that global data center energy use increased by only 6% between 2010 and 2018, even as the number of data centers and the processors in them grew vastly; predictions of a 70% increase over this period did not come true, so these improvements are having an effect. These companies are also known for powering their data centers with renewable energy as much as possible, though some facilities use more energy than the surrounding land could supply with solar panels.

     The paper also notes two other concerns about ML energy use and emissions: “the impact of Neural Architecture Search (NAS), which may run thousands of training runs as part of a single search—potentially exploding overall energy consumption—and ML’s impact on client-side energy usage.” NAS uses compute to search for models with higher quality or efficiency than humans can find. Client-side energy use involves mobile phones with built-in ML accelerators for tasks like bar code reading, OCR, and face recognition. This is estimated at roughly 5% of total phone energy use at most; with billions of phones around the world this adds up, but global client-side usage from phones was still estimated at 0.4 TWh or less in 2021. The authors give several recommendations to reduce energy consumption and emissions, and note in their conclusion:

 

“Machine Learning (ML) workloads have rapidly grown in importance, raising legitimate concerns about their energy usage. Fortunately, the real-world energy usage trend of ML is fairly boring. While overall energy use at Google grows annually with greater usage, the percentage for ML has held steady for the past three years, representing <15% of total energy usage. Inference represents about ⅗ of total ML energy usage at Google, owing to the many billion-user services that use ML. GLaM, the largest natural language model trained in 2021, improved model quality yet produced 14x less CO2e than training the previous state-of-the-art model from 2020 (GPT-3) and used only 0.004% of Google’s annual energy.”

 

“Furthermore, we illustrated that in large scale production ML deployments, minimizing emissions from training is not the ultimate goal. Instead, the combined emissions of training and serving need to be minimized. Approaches like neural architecture search increase emissions but lead to more efficient serving and a strong overall reduction of the carbon footprint of ML. Another perspective is that some consider the carbon footprint to be erased entirely if the cloud provider matches 100% of their energy consumption with renewable energy, as Google and Facebook have done and as Microsoft will soon do.”

 

     The location of data centers also matters for emissions. A 2022 paper in the Proceedings of the ACM Conference on Fairness, Accountability, and Transparency showed this in the following graph, which compares energy use for the same task in 16 different regions.

[Figure from the paper: energy use for the same task across 16 cloud regions.]
That paper also showed that the time of day when the computations run matters as well, likely due to the availability of solar and wind power on the grid at certain hours.

     Not all AI processes are equally energy-intensive: generating an image takes far more energy than generating text. AI also threatens to delay the aggressive climate goals of the Big Tech companies; the concrete and steel needed to build the required new data centers will alone add significantly to their carbon footprints.

 

AI/ML Can Also Lead to Better Energy Efficiency in Many Industries

     While AI/ML is set to use more power and put more CO2 in the atmosphere, it can also be used to find ways to increase energy efficiency. According to a 2023 article by Microsoft:

“The World Economic Forum underscores the role AI plays in the energy transition and estimates that every 1 percent additional efficiency in demand creates USD1.3 trillion in value between 2020 and 2050 due to reduced investment needs.”

It is no easy task to calculate and account for the energy use and avoided energy use provided by AI and ML. The article gives several examples where Microsoft Azure data, ML, and AI services are helping power generators, mining companies, telecommunications companies, and oil & gas companies optimize their processes for efficiency, improve inspection capabilities, and improve safety. Approaches like digital twins have driven many of these improvements.

 

The Coming AI Boom

     Remote servers processing away in data centers account for the bulk of AI/ML energy use. The International Energy Agency reported that data centers and data transmission networks were responsible for 1% of energy-related GHG emissions in 2023, and data centers as a whole account for 1-1.5% of global electricity use. That usage is set to grow as AI booms. Elizabeth Kolbert, in her New Yorker article, noted that U.S. data centers now account for about four percent of electricity consumption, a share expected to climb to six percent by 2026. It is estimated that NVIDIA will ship 1.5 million AI server units per year by 2027, resulting in 85.4 TWh per year of power consumption. Data scientist Alex de Vries, a Ph.D. candidate in the Netherlands, has developed ways to calculate and track data center energy use. He did the same for cryptocurrency energy use, creating the Bitcoin Energy Consumption Index, and has also been tracking cryptocurrency water use and e-waste. I should point out here that AI/ML is a useful and net-beneficial technology, much more important to society than cryptocurrencies, which have many problems in addition to energy use; but both consume energy the same way, through processing power. De Vries points out that simply replacing Google’s search engine with ChatGPT would involve a massive increase in energy use.
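The 85.4 TWh figure can be sanity-checked with simple arithmetic. The per-server power below (~6.5 kW) is the value implied by the estimate, not a number stated in this post:

```python
# Rough check of the projected NVIDIA AI server fleet energy use.
servers = 1.5e6            # units shipped per year by 2027 (from the text)
power_per_server_kw = 6.5  # assumed average draw per server, running 24/7
hours_per_year = 8760

twh_per_year = servers * power_per_server_kw * hours_per_year / 1e9  # kWh -> TWh
print(f"~{twh_per_year:.1f} TWh per year")
```

At ~6.5 kW per server running around the clock, 1.5 million units comes out to about 85 TWh per year, matching the cited estimate.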

     The two phases of AI/ML energy use are training and inference. Training is the initial, compute-intensive process of fitting a model to its data; inference is when the model goes live, is fed prompts, and gives responses. At Google the ratio was about 60% inference and 40% training, but that could vary among companies and task focuses. ChatGPT is powered by large language models trained on huge datasets with billions of parameters. Cooling the servers is expected to add 10-50% to total energy usage. Efficiency improvements enabled by AI/ML will offset a portion of this energy use, though those numbers are not easy to predict, and efficiency improvements can also increase demand for the service, which has to be accounted for as well. The 2022 paper above suggests that AI/ML energy use will grow, plateau, and then shrink, but gives no time frame. ChatGPT is estimated to be responding to about 200 million requests per day, consuming as much energy as about 17,000 households. De Vries noted that it took a long time to require cryptocurrency companies to disclose their energy use, and he is disappointed that disclosure requirements for AI/ML energy use have not developed faster, considering that we already know they are needed.
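Working backward from those two figures, and assuming an average U.S. household uses roughly 29 kWh of electricity per day (an assumption not stated in the sources), gives the implied energy per request:

```python
requests_per_day = 200e6    # ChatGPT requests per day (from the text)
households = 17_000         # household equivalents (from the text)
household_kwh_per_day = 29  # assumed U.S. average (~10,500 kWh/year)

total_kwh_per_day = households * household_kwh_per_day  # ~493,000 kWh
wh_per_request = total_kwh_per_day * 1000 / requests_per_day  # kWh -> Wh
print(f"~{wh_per_request:.1f} Wh per request")
```

That works out to roughly 2.5 Wh per request, in the same ballpark as published per-request estimates of about 3 Wh, and many times the energy of a conventional web search.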

 

What Will Produce the Electricity that Powers the AI Boom?

     It is true that the Big Data companies have been exemplary in using as much renewable energy as they can to power their data centers, but there are limitations they cannot overcome; land availability is a big issue with solar and wind. Some could utilize or sell the waste heat generated in their operations to offset energy use elsewhere. Cryptocurrency miners sought out cheap energy to mine coins; that is less likely with AI/ML, since Big Tech has already shown commitments to sustainable energy use. It has been estimated that a new data center is built every three days, and it is acknowledged that renewable energy alone will not be enough to power the AI/ML boom. Once again, good old natural gas is emerging as a less-than-ideal but pragmatic energy source. New data centers mean more electricity demand wherever they are built, and power authorities and grid operators need to plan for that. This new demand is leading some older coal and gas plants to delay their scheduled retirements in the name of maintaining grid reliability.

     Governments are actively working to develop reporting standards for AI/ML impacts. The International Organization for Standardization is involved, as a Yale Environment 360 article points out:

 

“Those will include standards for measuring energy efficiency, raw material use, transportation, and water consumption, as well as practices for reducing A.I. impacts throughout its life cycle, from the process of mining materials and making computer components to the electricity consumed by its calculations. The ISO wants to enable A.I. users to make informed decisions about their A.I. consumption.”

 

As DeVries suggested, this should have been done a few years ago, so we are a bit behind.

     While energy demand in the U.S. has been flat for a decade, that is expected to change: a 20% increase is expected by 2030 due to ICT and AI demand, electrification, and EVs. AI data centers alone are expected to add about 323 TWh by 2030, and all data centers could represent up to 8% of U.S. electricity demand by 2030. That is quite a lot. As mentioned, natural gas is emerging as a solution; while it is more carbon-intensive than renewables, it is both more dispatchable and cheaper. Estimates are that natural gas use could climb 28% by 2030, adding about 10 BCF/day of new demand. The U.S., as the largest producer of natural gas, is well situated to provide it. The U.S. Southeast is expected to be a major AI data center hub, and pipelines can bring gas from the south or the north. Obama’s energy secretary Ernest Moniz noted that renewables will not be able to keep up:

 

“We’re not going to build 100 gigawatts of new renewables in a few years,” Moniz said.
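The projected 323 TWh of AI data center demand can be cross-checked against total U.S. generation, assumed here at roughly 4,050 TWh per year (a figure not given in the sources):

```python
ai_datacenter_twh = 323   # projected added AI data center demand by 2030
us_generation_twh = 4050  # assumed annual U.S. electricity generation

share_pct = ai_datacenter_twh / us_generation_twh * 100
print(f"~{share_pct:.0f}% of U.S. electricity")
```

That lands at about 8% of today’s generation, consistent with the upper-end estimate for all data centers cited above.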

 

Indications are that tech companies and natural gas companies are consulting with each other on these matters. The inability of tech companies to power their facilities entirely with renewables, due to constraints like land availability, has led them to rely more and more on carbon offsets to meet their goals. Competition to develop AI is also pushing them to move quickly.

 

 

What Can We Do to Mitigate These New Emissions Sources?

 

     As noted, the tech companies continue to work on process efficiencies for decreasing AI/ML energy consumption. The IEA lists eight recommendations going forward:

 

1) Improve data collection and transparency

2) Enact policies to encourage energy efficiency, demand response and clean energy procurement

3) Support the utilisation of waste heat from data centres

4) Collect and report energy use and other sustainability data

5) Commit to efficiency and climate targets and implement measures to achieve them

6) Increase the purchase and use of clean electricity and other clean energy technologies

7) Invest in RD&D for efficient next-generation computing and communications technologies

8) Reduce life cycle environmental impacts

 

References:

Power usage effectiveness. Wikipedia.

Natural Gas: A Natural Bridge to Fuel AI’s Electric Demand. Paul Hoffman. TipRanks, via MSN Money Markets.

AI is an energy hog. This is what it means for climate change. Casey Crownhart. MIT Technology Review. May 23, 2024.

The AI Boom Could Use a Shocking Amount of Electricity. Lauren Leffer. Scientific American. October 13, 2023.

The growing energy footprint of artificial intelligence. Alex de Vries. Joule, Volume 7, Issue 10, October 18, 2023, Pages 2191-2194.

The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink. David Patterson, Joseph Gonzalez, Urs Hölzle, Quoc Le, Chen Liang, Lluis-Miquel Munguia, Daniel Rothchild, David So, Maud Texier, Jeff Dean. Computer, 2022. arXiv:2204.05149.

The Obscene Energy Demands of A.I. How can the world reach net zero if it keeps inventing new ways to consume energy? Elizabeth Kolbert. The New Yorker. March 9, 2024.

As Use of A.I. Soars, So Does the Energy and Water It Requires. David Berreby. Yale Environment 360. February 6, 2024.

The era of AI: Transformative AI solutions powering the energy and resources industry. Darryl Willis, Corporate Vice President, Worldwide Energy and Resources Industry. Microsoft Industry Blogs. September 28, 2023.

Data Centres and Data Transmission Networks. International Energy Agency.

Measuring the Carbon Intensity of AI in Cloud Instances. Dodge et al. FAccT ’22: Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency, June 2022, Pages 1877-1894.

AI companies eye fossil fuels to meet booming energy demand. Mack DeGeurin. Popular Science. March 25, 2024.

 
