What is the energy impact of Artificial Intelligence?
Although the origins of Artificial Intelligence date back to the last century, the recent wide diffusion of disruptive AI solutions such as ChatGPT and Midjourney has made this technology increasingly popular and widely used. Artificial Intelligence has thus become one of the driving forces of technological innovation and a key factor in revolutionizing industries like healthcare, transportation and finance.
The constant growth of AI's capabilities across all its applications has opened several debates on the issues this technology could bring. Chief among them is the energy impact of AI and its cost in terms of sustainability: experts have highlighted that the computing power needed to run Machine Learning algorithms and processes could significantly contribute to climate change, given the greenhouse gas emissions associated with the large amount of electricity required.
Moreover, every online interaction relies on a scaffolding of information stored in remote servers and data centers located around the world, which consume large amounts of energy to operate. According to the International Energy Agency, data centers currently account for 1 to 1.5 percent of global electricity consumption. Artificial Intelligence, which has not yet reached its peak, could significantly increase this share: as the so-called Large Language Models (LLMs), language models trained on vast text datasets to handle complex tasks, are used ever more heavily, the demand for servers to run them grows rapidly, requiring ever more energy. This is a clear challenge for the evolution of Artificial Intelligence, affecting both the use of energy resources and environmental sustainability.
The growing energy footprint of Artificial Intelligence
Alex De Vries, a PhD candidate at Vrije Universiteit Amsterdam who studies the energy costs of emerging technologies and is the founder of the digital sustainability blog Digiconomist, published a peer-reviewed analysis, 'The growing energy footprint of artificial intelligence', in the journal Joule in October 2023. In it he hypothesized that if every Google search in a year used Artificial Intelligence, it would consume roughly as much electricity as is needed to power a small country like Ireland (29.3 TWh per year). De Vries' study assumes certain parameters remain fixed, such as the growth rate of AI, its availability and the full utilization of servers. He illustrated his findings with a comparison: a single LLM interaction can consume as much energy as leaving an energy-saving LED light bulb on for an hour.
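The order of magnitude of the Ireland comparison can be checked with a quick back-of-envelope calculation. The search volume and per-query figures below are illustrative assumptions for this sketch, not the paper's exact inputs:

```python
# Rough sanity check of the "Ireland" estimate.
# Assumed illustrative figures (not taken from De Vries' paper):
SEARCHES_PER_DAY = 9e9      # assumed daily Google search volume
WH_PER_AI_SEARCH = 8.9      # assumed energy per AI-assisted query, in Wh

wh_per_year = SEARCHES_PER_DAY * WH_PER_AI_SEARCH * 365
twh_per_year = wh_per_year / 1e12  # 1 TWh = 10^12 Wh

print(f"{twh_per_year:.1f} TWh/year")  # ~29.2 TWh, in line with the 29.3 TWh cited
```

With these assumed inputs the result lands close to the 29.3 TWh figure quoted for Ireland, which shows how sensitive the headline number is to the per-query energy estimate.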
De Vries also cites the case of an American AI company that consumed about 433 megawatt hours (MWh) to train its multilingual AI text-generating system, enough to power 40 average US homes for one year. Given the growing worldwide demand for AI services, it is very likely that AI-related energy consumption will increase significantly over the next few years.
Furthermore, the researcher highlights that if current trends in AI capabilities and adoption continue, one of the leading companies in the graphics card sector, which today supplies 95% of the AI processors demanded by the market (effectively a monopoly), could ship 1.5 million servers per year by 2027. If all these servers operated at full capacity, they could consume at least 85.4 TWh of electricity per year: more than the average annual energy consumption of many small countries.
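The figures quoted above also imply a per-server power draw, which can be derived directly from them; this is a sketch using only the numbers in the text, with the per-server result being our own arithmetic rather than a figure from the study:

```python
# Per-server implication of the 2027 projection: 1.5 million servers
# drawing at least 85.4 TWh per year in total (figures from the text).
SERVERS = 1.5e6
TWH_PER_YEAR = 85.4
HOURS_PER_YEAR = 365 * 24  # 8760

wh_per_server_year = TWH_PER_YEAR * 1e12 / SERVERS
avg_kw_per_server = wh_per_server_year / HOURS_PER_YEAR / 1000

print(f"~{avg_kw_per_server:.1f} kW continuous draw per server")  # ~6.5 kW
```

A continuous draw of roughly 6.5 kW per server is plausible for a multi-GPU AI system, which suggests the projection's internal numbers are consistent.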
How can Hybrid AI reduce the costs of Artificial Intelligence?
Hybrid AI can help reduce the costs related to implementing and running AI systems and thus minimize the impact of Artificial Intelligence in terms of energy use.
Here are some of the ways it can achieve this:
- Resource optimization: Hybrid AI makes it possible to use heterogeneous resources more efficiently, deploying less intensive workloads on edge devices or less powerful hardware and thus saving on computing and infrastructure costs;
- Scalability: the hybrid approach makes it possible to scale the infrastructure flexibly as needed; for example, during peak loads more cloud resources can be allocated and, conversely, released when demand decreases, enabling more dynamic and efficient cost management;
- Use of lean models: optimized AI models help reduce computational power requirements and the related costs; this is particularly important where simpler models still fully meet the application's needs;
- Edge computing: by moving some of the decision-making to edge devices, it is possible to reduce the need to transfer large amounts of data across networks, saving resources on data transmission and leveraging local computing capacity;
- Adaptability to available resources: Hybrid AI makes it possible to adapt computing resources to the specific characteristics of the workload and save on infrastructure costs; where less expensive local resources can replace costly cloud services, this is an effective way to reduce the energy impact;
- Use of managed services: adopting managed cloud computing services to deploy Artificial Intelligence systems reduces operational costs and makes infrastructure management lean and simple;
- Model lifecycle optimization: monitoring and optimizing AI models over time helps maintain good performance while keeping costs low; this includes periodic retraining, model compression, and parameter optimization.
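The edge/cloud split at the heart of several of these points can be sketched as a simple request router. This is a minimal illustration, not a production design: the complexity heuristic, thresholds, and names below are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Request:
    text: str

def estimate_complexity(req: Request) -> float:
    # Hypothetical heuristic: treat longer prompts as more complex.
    # Real systems might use a learned classifier instead.
    return min(len(req.text.split()) / 100, 1.0)

def route(req: Request, threshold: float = 0.5) -> str:
    """Send cheap requests to an edge model, expensive ones to the cloud."""
    if estimate_complexity(req) < threshold:
        return "edge"   # small, energy-lean local model
    return "cloud"      # large model, allocated only when demand requires it

print(route(Request("what time is it")))        # edge
print(route(Request(" ".join(["word"] * 80))))  # cloud
```

Routing most traffic to a lean local model and reserving large cloud-hosted models for genuinely complex requests is one concrete way the resource-optimization and edge-computing principles above translate into lower energy use.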
By implementing these principles, organizations looking to innovate their way of doing business, spread a pro-digital culture and increase digital trust can build an efficient Hybrid Artificial Intelligence strategy that balances two equally important goals: meeting the requirements of the AI application while managing AI-related costs and the energy-sustainability impact of this technology.