(Photo: The ChatGPT logo displayed on a smartphone in Washington, DC, on March 15, 2023. OLIVIER DOULIERY/AFP via Getty Images)

In a world that is fast becoming AI-driven, with chatbots, digital assistants, and smart programs everywhere, demand for AI services keeps growing. But there is a big problem: AI consumes a lot of energy.

A recent report by researcher Alex de Vries warns that AI could one day consume as much electricity as entire countries.

The Energy-Hungry AI Training Phase

The heart of AI development lies in the training phase. For AI models like ChatGPT, this process is incredibly energy-intensive.

Hugging Face, a prominent AI company, reported that its multilingual text-generating AI tool consumed a staggering 433 megawatt-hours (MWh) of electricity during training.

To put that into perspective, that is enough electricity to power roughly 40 average American homes for an entire year.
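As a sanity check, the arithmetic holds up. Here is a minimal sketch, assuming a ballpark of about 10,600 kWh of electricity per average US household per year (an EIA-style figure, not taken from the report):

```python
# Back-of-envelope check of the "40 homes" comparison.

TRAINING_ENERGY_MWH = 433    # reported for Hugging Face's multilingual model
HOME_ANNUAL_KWH = 10_600     # assumed average US household consumption per year

homes_powered = TRAINING_ENERGY_MWH * 1_000 / HOME_ANNUAL_KWH
print(f"~{homes_powered:.0f} homes powered for a year")  # ~41, matching "about 40"
```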

Hugging Face's tool is not alone in its voracious energy appetite; other large language models have been reported to consume even more energy during training.

This critical phase forms the backbone of AI's energy footprint.

The Power-Hungry Inference Phase

While the training phase indeed demands immense energy, AI's energy appetite does not stop there. The subsequent inference phase, where AI models generate real-time outputs based on user interactions, also comes at a significant energy cost. 

A prime example is Google, which processes up to 9 billion searches a day. De Vries notes that if every Google search were to utilize AI, it could gulp down a jaw-dropping 29.2 terawatt-hours (TWh) of power annually, roughly equivalent to the electricity consumption of a country like Ireland.
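It is worth noting what that figure implies per query. Back-solving from the article's own numbers, and taking the 9-billion-searches-a-day figure at face value, each AI-assisted search would cost roughly 9 watt-hours:

```python
# Implied per-search energy if every Google search used an LLM,
# using only the figures cited above.

SEARCHES_PER_DAY = 9e9       # searches per day, from the article
ANNUAL_ENERGY_TWH = 29.2     # projected annual consumption, from the article

searches_per_year = SEARCHES_PER_DAY * 365
wh_per_search = ANNUAL_ENERGY_TWH * 1e12 / searches_per_year
print(f"~{wh_per_search:.1f} Wh per AI-assisted search")  # ~8.9 Wh
```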

Challenges and Unlikely Scenarios

While these statistics might raise alarm bells, it's crucial to consider the practical challenges in implementing such extreme scenarios. 

Alex de Vries points out that the costs associated with additional AI servers and bottlenecks in the AI server supply chain make full-scale AI adoption in Google searches unlikely in the short term. 

In other words, AI is not about to devour as much electricity as an entire country.

Projected Global AI Electricity Consumption

The future, however, paints a different picture. The demand for AI chips has skyrocketed, with NVIDIA, a leading chip manufacturer, reporting record-breaking AI-driven revenue.

The company is expected to deliver 100,000 AI servers in 2023 alone. Running at full capacity, these servers could consume up to 8.9 TWh of electricity annually, though that is still a fraction of total global data center consumption.
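That 8.9 TWh ceiling can be roughly reconstructed if one assumes each server is a DGX-class machine drawing about 10.2 kW at full load, the rated maximum of NVIDIA's DGX H100 (the report's exact hardware assumptions are not spelled out here):

```python
# Rough reconstruction of the 8.9 TWh upper bound.

SERVERS = 100_000            # NVIDIA's expected 2023 deliveries, from the article
POWER_KW = 10.2              # assumed per-server draw at full load (DGX H100 max)
HOURS_PER_YEAR = 8_760

annual_twh = SERVERS * POWER_KW * HOURS_PER_YEAR / 1e9
print(f"~{annual_twh:.1f} TWh per year")  # ~8.9 TWh
```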

Nevertheless, Alex de Vries forecasts that by 2027, worldwide AI-related electricity consumption could surge by 85 to 134 TWh annually, equivalent to the power needs of countries like the Netherlands, Argentina, or Sweden.

Efficiency vs. Increased Demand

AI enthusiasts might argue that improving AI hardware and software efficiency is the answer to the growing energy problem.

However, de Vries cautions that this can lead to a paradox. As AI becomes more efficient, it opens the door to more applications and users, ultimately driving up demand and, consequently, total energy consumption.
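A toy calculation makes this rebound effect concrete: if efficiency gains halve the energy per query while cheaper queries triple usage, total consumption still rises by half. All numbers below are hypothetical:

```python
# Hypothetical illustration of the rebound effect de Vries describes.

energy_per_query_wh = 9.0    # hypothetical baseline cost per query
queries = 1e9                # hypothetical baseline query volume

baseline = energy_per_query_wh * queries

energy_per_query_wh /= 2     # hardware/software becomes 2x more efficient
queries *= 3                 # cheaper queries attract 3x the usage

after = energy_per_query_wh * queries
print(f"consumption changed by {after / baseline:.1f}x")  # 1.5x
```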

Stay posted here at Tech Times.

Related Article: OpenAI Plans Developing Its Own AI Chips Amid Global Shortages

Tech Times Writer John Lopez
