The following essay is reproduced with permission from The Conversation, an online publication covering the latest research.
Generative AI is the hot new technology behind chatbots and image generators. But how hot is it making the planet?
As an AI researcher, I often worry about the energy costs of building AI models. The more powerful the AI, the more energy it takes. What does the emergence of increasingly powerful generative AI models mean for society’s future carbon footprint?
“Generative” refers to the ability of an AI algorithm to produce complex data. The alternative is “discriminative” AI, which chooses between a fixed number of options and produces just a single number. An example of a discriminative output is deciding whether or not to approve a loan application.
Generative AI can create much more complex outputs, like a sentence, paragraph, image, or even a short video. It has long been used in applications like smart speakers to generate audio responses, or in auto-complete to suggest a search query. However, it is only recently that it has acquired the ability to generate human language and realistic photos.
Using more power than ever
The exact energy cost of a single AI model is difficult to estimate, and it includes the energy used to manufacture the computing equipment, create the model and use the model in production. In 2019, researchers found that creating a generative AI model called BERT with 110 million parameters consumed the energy of a round-trip transcontinental flight for one person. The number of parameters refers to the size of the model, with larger models generally being more capable. The researchers estimated that creating the much larger GPT-3, which has 175 billion parameters, consumed 1,287 megawatt-hours of electricity and generated 552 metric tons of carbon dioxide equivalent, as much as 123 gasoline-powered passenger vehicles driven for one year. And that’s just to get the model ready to launch, before any consumers start using it.
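As a rough sanity check on those figures, a few lines of arithmetic recover the comparisons quoted above. The per-car number used here (about 4.6 metric tons of CO2 per year, a typical U.S. average) is an assumption for illustration, not a figure from the study itself:

```python
# Back-of-the-envelope check of the GPT-3 training figures quoted above.
# Assumption: an average passenger car emits ~4.6 metric tons of CO2 per
# year (an illustrative U.S. average, not a number from the study).
training_energy_mwh = 1287      # reported electricity used to train GPT-3
emissions_tonnes_co2eq = 552    # reported emissions, metric tons of CO2eq

# Implied carbon intensity of the electricity used, in kg CO2eq per kWh
kwh = training_energy_mwh * 1000
intensity_kg_per_kwh = emissions_tonnes_co2eq * 1000 / kwh
print(f"Implied grid intensity: {intensity_kg_per_kwh:.2f} kg CO2eq/kWh")

# Equivalent number of passenger cars driven for one year
car_tonnes_per_year = 4.6
cars = emissions_tonnes_co2eq / car_tonnes_per_year
print(f"Roughly {cars:.0f} cars driven for a year")
```

The result, roughly 120 car-years, matches the study’s figure of 123 to within the rounding of the assumed per-car emissions.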
Size is not the only predictor of carbon emissions. The open-access BLOOM model, developed by the BigScience project in France, is similar in size to GPT-3 but has a much lower carbon footprint, consuming 433 MWh of electricity and emitting 30 metric tons of CO2eq. A Google study found that, for the same size, using a more efficient model architecture and processor and a greener data center can reduce the carbon footprint by 100 to 1,000 times.
Larger models also consume more power when they are deployed. There is limited data on the carbon footprint of a single generative AI query, but some industry figures estimate it to be four to five times higher than that of a search engine query. As chatbots and image generators become more popular, and as Google and Microsoft integrate AI language models into their search engines, the number of queries they receive each day could grow exponentially.
AI chatbots for search
A few years ago, few people outside of research labs were using models like BERT or GPT. That changed on November 30, 2022, when OpenAI released ChatGPT. According to the latest available data, ChatGPT had more than 1.5 billion visits in March 2023. Microsoft incorporated ChatGPT into its search engine, Bing, and made it available to everyone on May 4, 2023. If chatbots become as popular as search engines, the energy costs of deploying the AIs could really add up. But AI assistants have many more uses than just search, such as writing documents, solving math problems and creating marketing campaigns.
Another problem is that AI models need to be continually updated. For example, ChatGPT was only trained on data from up to 2021, so it does not know about anything that happened since then. The carbon footprint of creating ChatGPT is not public information, but it is likely much larger than that of GPT-3. If it had to be recreated on a regular basis to update its knowledge, the energy costs would grow even larger.
One upside is that asking a chatbot can be a more direct way to get information than using a search engine. Instead of getting a page full of links, you get a direct answer as you would from a human, assuming issues of accuracy are mitigated. Getting to information quicker could potentially offset the increased energy use compared with a search engine.
Ways forward
The future is hard to predict, but large generative AI models are here to stay, and people will probably increasingly turn to them for information. For example, a student who needs help solving a math problem now might ask a tutor or a friend or consult a textbook. In the future, they will probably ask a chatbot. The same goes for other specialized knowledge, such as legal advice or medical expertise.
While a single large AI model is not going to ruin the environment, if a thousand companies develop slightly different AI bots for different purposes, each used by millions of customers, the energy use could become a problem. More research is needed to make generative AI more efficient. The good news is that AI can run on renewable energy. By bringing the computation to where green energy is more abundant, or by scheduling computation for times of day when renewable energy is more available, emissions can be reduced by a factor of 30 to 40, compared with using a grid dominated by fossil fuels.
Finally, societal pressure may help encourage companies and research labs to publish the carbon footprints of their AI models, as some already do. In the future, consumers might even use this information to choose a “greener” chatbot.
This article was originally published on The Conversation. Read the original article.
This is an opinion and analysis article, and the views expressed by the author or authors are not necessarily those of Scientific American.