Artificial intelligence (AI) has the potential to speed up coding, make driving safer, and reduce the time required for routine tasks. However, in a commentary published on October 10, 2023, in the journal Joule, the founder of Digiconomist warns that if the technology is adopted widely, its energy footprint could eventually exceed the power demands of some countries.
Looking at the growing demand for AI service, it is very likely that energy consumption related to AI will significantly increase in the coming years.
Alex de Vries, Study Author and Ph.D. Student, Vrije Universiteit Amsterdam
Generative AI, including OpenAI’s ChatGPT, which can produce text, images, or other types of data, has grown rapidly since 2022. Training these AI models is energy-intensive because enormous volumes of data must be fed into them.
Hugging Face, a New York-based AI startup, reported that training its multilingual text-generating AI tool consumed around 433 megawatt-hours (MWh), roughly the annual electricity use of 40 typical American homes.
The energy footprint of AI does not end with training, either. According to de Vries’ analysis, every time the tool generates text, an image, or other data from a prompt, it also draws a substantial amount of computing power and energy. Running ChatGPT, for instance, could consume 564 MWh of electricity each day.
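As a quick sanity check on these figures, the one-off training cost and the daily inference cost quoted above can be compared directly. The two input numbers come from the article; the derived comparisons (days to overtake training, annualized total) are back-of-envelope arithmetic, not figures from the commentary:

```python
# Back-of-envelope comparison of the energy figures quoted above.
# Input numbers are from the article; derived values are illustrative.

TRAINING_MWH = 433          # one-off training cost (Hugging Face example)
CHATGPT_MWH_PER_DAY = 564   # estimated daily inference cost for ChatGPT

# How quickly does daily inference overtake the one-off training cost?
days_to_exceed_training = TRAINING_MWH / CHATGPT_MWH_PER_DAY
print(f"Inference exceeds the training cost in {days_to_exceed_training:.2f} days")

# Annualized inference energy, converted from MWh to GWh
annual_gwh = CHATGPT_MWH_PER_DAY * 365 / 1_000
print(f"Annualized inference: about {annual_gwh:,.0f} GWh per year")
```

On these numbers, daily operation would surpass the entire training cost in under a day, which is why the commentary stresses the inference phase rather than training alone.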
While businesses all over the world are attempting to make AI hardware and software more energy-efficient, de Vries notes that improvements in machine efficiency often lead to an increase in demand instead. This is Jevons’ paradox: technological progress ultimately results in a net increase in resource consumption.
De Vries added, “The result of making these tools more efficient and accessible can be that we just allow more applications of it and more people to use it.”
For instance, Google has been integrating generative AI into its email service and is experimenting with using AI to run its search engine. The company currently processes up to 9 billion searches every day.
Based on these figures, de Vries calculates that if every Google search used AI, it would require around 29.2 TWh of electricity annually, roughly equal to Ireland’s yearly electricity consumption.
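Working backwards from the article’s own numbers (9 billion searches per day, 29.2 TWh per year), the implied energy per AI-assisted search can be sketched as follows. The per-query figure is derived here for illustration and is not stated in the source:

```python
# Implied energy per AI-assisted search, derived from the article's figures.

QUERIES_PER_DAY = 9e9    # Google searches handled daily (from the article)
ANNUAL_TWH = 29.2        # projected annual energy if every search used AI

annual_queries = QUERIES_PER_DAY * 365
wh_per_query = ANNUAL_TWH * 1e12 / annual_queries  # convert TWh to Wh
print(f"Implied energy per search: {wh_per_query:.1f} Wh")
```

The result, on the order of a few watt-hours per search, is many times the energy of a conventional keyword search, which is what drives the country-scale annual total.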
According to de Vries, the significant cost of additional AI servers and supply-chain constraints make this extreme scenario unlikely in the short term. However, the manufacture of AI servers is expected to grow rapidly, and that projected growth suggests that by 2027, worldwide AI-related electricity consumption could rise by 85 to 134 TWh annually.
That amount is comparable to the annual electricity consumption of countries such as Sweden, Argentina, or the Netherlands. Moreover, if AI becomes more efficient, engineers could repurpose some computer processing chips for AI use, which could drive AI-related power consumption even higher.
“The potential growth highlights that we need to be very mindful about what we use AI for. It is energy intensive, so we don’t want to put it in all kinds of things where we don’t actually need it,” de Vries concluded.
de Vries, A. (2023) The growing energy footprint of artificial intelligence. Joule. doi:10.1016/j.joule.2023.09.004