When Google published its Environmental Report 2024, it was already clear how far the company was from achieving its goals. This was not only due to the ambitious nature of the targets, but also to the company's emissions trend. The tech giant, which has one of the most committed green agendas, has pledged to use only clean energy by 2030.
For years, Google and other major tech companies committed to sustainability have made major efforts to reduce their carbon footprint. These efforts include an increasing reliance on renewable energy and constant attempts to improve energy efficiency. The challenge is so great that companies are exploring all kinds of solutions. Microsoft, for example, has experimented with underwater data centers to improve server cooling, while other companies have built facilities at or near the Arctic Circle to reduce the energy costs of cooling rows of racks.
In the midst of these efforts, the advent of generative AI has hit like a bombshell. The huge energy consumption of this technology has led to an increase in emissions at companies such as Microsoft and Google. According to this year's report, Satya Nadella's company has increased its greenhouse gas emissions by 29% since 2020. Google's report contains similarly worrying figures: in 2023, the company's emissions rose by 13% year over year, and they are up 48% since 2019.
Google itself cited the growing demand for artificial intelligence services as a contributing factor to the increased energy consumption of its data centers. Generating responses from AI models, especially the more complex ones, requires significant computing resources. However, the problem is more complex than it seems at first glance.
The role of AI
One clear fact is that queries to a chatbot like ChatGPT have a much higher computational cost than the tasks they might replace, such as a Google search. However, many variables come into play here, especially in a market that is only just getting started.
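To make that cost difference concrete, here is a back-of-the-envelope calculation. The per-query figures are rough public estimates often attributed to the International Energy Agency, not measurements from the article, so treat them as illustrative assumptions:

```python
# Rough comparison of estimated energy per query.
# Assumed figures (widely cited public estimates, not measurements):
#   - a conventional web search: ~0.3 Wh
#   - a ChatGPT-style generative-AI request: ~2.9 Wh
SEARCH_WH = 0.3
CHATBOT_WH = 2.9

# A chatbot query costs roughly an order of magnitude more energy.
ratio = CHATBOT_WH / SEARCH_WH
print(f"A chatbot query uses roughly {ratio:.0f}x the energy of a search")

# Scaled to one billion queries of each kind, expressed in MWh.
queries = 1_000_000_000
print(f"Searches: {queries * SEARCH_WH / 1e6:.0f} MWh")
print(f"Chatbot:  {queries * CHATBOT_WH / 1e6:.0f} MWh")
```

Under these assumptions, a billion chatbot queries would consume on the order of ten times the electricity of a billion searches, which is why per-query efficiency matters so much at the industry's scale.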
At the moment, the future status of many free services is uncertain, which will affect both their adoption and the resource consumption of data centers. It is also unclear how the industry will manage AI workloads on servers. So far, these tasks have mostly been performed on Nvidia chips, which are designed for graphics and AI processing. However, specialized processors have already been developed to handle these types of queries as efficiently as possible.
The major technology companies have developed their own chips to meet the requirements of the services they offer, and a new generation of processors is being designed specifically for generative AI workloads. Even OpenAI is considering developing its own chips, which shows how important the most efficient hardware is for achieving economies of scale in this area.
And this efficiency is exactly what Google and others are aiming for to combat the increasing energy consumption of AI. They also plan to increase the use of renewable energy to limit emissions. The priority is to rein in the runaway growth of electricity consumption.
A few figures suffice to convey the scale of data centers' energy use. In 2022, these facilities consumed around 460 TWh of electricity, more than the annual consumption of France. The International Energy Agency predicts that this figure will more than double to over 1,000 TWh by 2026, surpassing the annual consumption of Japan, the world's fourth-largest economy.
Water consumption, much of it used to cool servers in data centers, is another challenge for large technology companies. As artificial intelligence requires more computing resources, servers work harder, generate more heat, and require more intensive cooling. Because water is often used for this task, consumption rises, with a further impact on the environment.
The challenge for the AI industry is therefore to develop more efficient systems. This should go hand in hand with market maturity and the spread of pay-per-use initiatives that encourage users to optimize their resources.