Dr. Ross

Artificial Intelligence Energy Bill: Can We Pay It?

Energy Management with AI

Time will tell whether people can sculpt Artificial Intelligence (AI) and Large Language Models (LLMs) into a net "positive" for humanity and the planet; meanwhile, the debate continues. If you want to weigh in yourself, check out the open-access, peer-reviewed commentary below:

https://doi.org/10.1016/j.joule.2023.09.004

The article discusses the rapid expansion of AI during 2022 and 2023, fueled by the success of OpenAI's ChatGPT. Major tech companies like Alphabet and Microsoft increased their support for AI, raising concerns about its electricity consumption and environmental impact. The commentary delves into the energy consumption of AI, particularly focusing on the energy-intensive training phase and the relatively less-explored inference phase.


The commentary quantifies the energy demands of AI. During the training phase, AI models consume significant electricity; for example, Hugging Face's BLOOM model used 433 MWh and GPT-3 used 1,287 MWh for training. The inference phase, where models generate responses to user queries, may contribute substantially to an AI model's life-cycle costs as well. Google reported that 60% of its AI-related energy consumption from 2019 to 2021 came from inference.


The article explores potential future energy footprints, projecting scenarios where popular applications like Google Search integrate generative AI, potentially leading to a substantial (roughly tenfold) increase in electricity consumption. Worst-case scenarios suggest that Google's AI alone could consume as much electricity as a country like Ireland (29.2 TWh annually – enough power to fully charge more than 580,000,000 electric cars). Are cities equipped to deliver that much power? PLUS, are we equipped to cool down all those CPUs and GPUs? Practical constraints such as production capacity and investment costs are likely to prevent such rapid adoption – a potential bottleneck to AI growth.
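The electric-car comparison above is easy to sanity-check with back-of-envelope arithmetic. The sketch below assumes a full EV charge of roughly 50 kWh (a typical battery capacity; this figure is my assumption, not stated in the commentary):

```python
# Back-of-envelope check: how many full EV charges fit in 29.2 TWh?
annual_ai_twh = 29.2            # worst-case annual Google AI demand (TWh)
kwh_per_full_charge = 50        # ASSUMED typical EV battery capacity (kWh)

annual_ai_kwh = annual_ai_twh * 1e9   # 1 TWh = 1,000,000,000 kWh
full_charges = annual_ai_kwh / kwh_per_full_charge

print(f"{full_charges:,.0f} full EV charges per year")  # ≈ 584,000,000
```

That works out to roughly 584 million charges, consistent with the "more than 580,000,000 electric cars" figure quoted above.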


In summary, the commentary cautions against overly optimistic or pessimistic expectations regarding AI-related electricity consumption. It emphasizes the need to balance AI advancements with considerations of necessity, suggesting that improvements in hardware and software efficiencies may not fully offset the long-term environmental impact. The article also advocates for enhanced transparency in the AI supply chain to better understand the emerging technology's environmental costs.
