Gartner predicts that hyperscalers' energy demand for generative AI will soon exceed electricity providers' capacity, leading to shortages by 2026. These shortages will limit new data center construction and increase operational costs, impacting the sustainability goals of tech companies.

Data Centers Running Out of Energy
The rapid growth of AI and its associated data centers will drive exponential growth in worldwide electricity consumption. Gartner predicts that by 2027, 40% of AI data centers will be constrained by electricity availability. This could slow the rollout of the new infrastructure required, since expanding electricity generation capacity can take several years.
Risk of Shortages and Rising Costs
The explosive energy demand from data centers fueled by generative AI (GenAI) will strain energy providers. Gartner forecasts that the resulting shortages will limit the construction of new data centers by 2026. These shortages are also expected to raise the cost of operating large language models (LLMs), directly impacting providers of AI-based services.
Environmental Impact and Sustainability
The carbon-footprint reduction goals of hyperscalers may also be affected by this increased energy demand. Renewable sources alone will not be able to power these infrastructures, making additional capacity from hydroelectric, nuclear, and fossil-fuel plants necessary. Companies such as Microsoft, Oracle, and Amazon have already announced plans to use nuclear energy to meet this growing demand.
Reevaluating Goals and Optimizing Infrastructure
Given these energy challenges, Gartner advises companies to assess the potential impact of energy shortages on their generative AI projects. Data center operators will need to adjust their sustainability goals while exploring solutions to minimize energy consumption, such as edge computing and smaller language models.
Source: ICTjournal