A ChatGPT query demands 10 times more electricity than a traditional Google search

For all its brilliance, artificial intelligence drives heavy electricity consumption in the data centers whose computing power underpins tools like ChatGPT. The rise of AI is fundamentally redefining energy consumption: data centers are evolving from standard computing infrastructure into strategic assets for tech companies, becoming the energy-hungry giants of the digital economy.

Processing a single query through ChatGPT requires over ten times the electricity of a Google search. This staggering gap highlights a profound shift in computing needs, with AI setting new standards for power usage. A few years ago, a 30-megawatt data center was considered a substantial project; today, tech players routinely plan super-infrastructures of 300 megawatts or even 1 gigawatt to meet AI's growing demands.
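A back-of-the-envelope sketch puts these figures in perspective. The per-query numbers below (0.3 Wh for a Google search, 3 Wh for a ChatGPT query) are commonly cited estimates, not measurements from this article; they are assumptions chosen to match the "over ten times" ratio:

```python
# Illustrative per-query energy figures (assumptions, in watt-hours)
GOOGLE_SEARCH_WH = 0.3
CHATGPT_QUERY_WH = 3.0

ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"A ChatGPT query uses ~{ratio:.0f}x the electricity of a Google search")

# Scale of the data-center projects mentioned: 30 MW, 300 MW, 1 GW.
# Annual energy at full load, in gigawatt-hours (8,760 hours per year).
for megawatts in (30, 300, 1000):
    gwh_per_year = megawatts * 8760 / 1000
    print(f"{megawatts:>5} MW facility ≈ {gwh_per_year:,.0f} GWh/year at full load")
```

Even the smallest of these facilities, running at full load, would consume on the order of hundreds of gigawatt-hours per year.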

Looking ahead, the forecasts are even more striking: the largest global tech companies plan to invest $1 trillion in new data centers over the next five years, confirming the underlying trend. This energy revolution has become a global challenge, with data centers currently accounting for over 1% of global electricity consumption, a figure expected to double by 2030.
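The doubling claim implies a steep growth rate. A minimal calculation, assuming a roughly six-year horizon (2024 to 2030) and taking the article's figures of just over 1% today rising to about 2%:

```python
# Article's figures: data centers consume >1% of global electricity today,
# expected to double (~2%) by 2030. The 6-year horizon is an assumption.
current_share = 0.01
projected_share = 0.02
years = 6

# Implied compound annual growth rate of data centers' share
cagr = (projected_share / current_share) ** (1 / years) - 1
print(f"Implied growth of data-center share: {cagr:.1%} per year")
```

Doubling over six years works out to roughly 12% compound growth per year, far outpacing overall electricity demand.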