Protecting the Power and the Water Supply
Discussions about the impact of artificial intelligence (AI) on the supply chain tend to focus on all the potential benefits it offers for improving operations. But those improvements come at a cost that is just now starting to receive more attention: Implementing AI places considerable demands on power and water resources.
The data centers required by AI to process data are power hungry and could disrupt the grid in some areas within the next few years.
“Until about a year ago, people saw this power problem coming, but they didn’t really understand that we’re going to run out of power,” Bob Johnson, VP analyst at Gartner, said. The firm estimates the power required for data centers to run AI servers will reach 500 terawatt‑hours (TWh) per year in 2027, which is 2.6 times the level required in 2023.
In its recently released Special Report on Energy and AI, the International Energy Agency (IEA) states that by 2030, power needs for data centers could reach 945 TWh, surpassing Japan’s total electricity consumption today.
But these numbers are still best guesses. “There is substantial uncertainty about the current and projected demand outlook for data centers and AI, and estimates from different sources in recent years have varied widely,” Alex Martinos, an energy analyst at the agency and one of the co‑authors of the report, said. “Accurately determining electricity demand for AI and data centers remains challenging for several reasons, including the limited data collection and reporting—to date—on data centers’ electricity consumption.”
This data won’t be easy to collect for several reasons. First, there are many different AI models with varying power needs. Second, data centers aren’t required to release information about their power usage, and countries have so far not demanded that owners supply it.
Another difficulty is the fast pace of AI evolution. The next generation of AI chatbots may require significantly less power than current systems. DeepSeek, a Chinese AI company, claims its new large language model could cut energy consumption by 45 percent compared to other chatbots.
There are also questions about how widespread the use of AI will eventually be. Companies that do not soon see the benefits of their investments in this technology may scale back that funding. That would reduce the demand for data centers and for the power they require.
With no assurance that power needs will turn out lower than projected, however, power companies and data center operators are seeking ways to deal with the anticipated shortages.
Bringing More Power Online
Demand for power in the U.S. has been relatively flat since the turn of the century, giving suppliers few incentives to expand the grid. Now those suppliers are scrambling to meet AI’s surging energy demands—but it won’t happen overnight.
“Grid connection queues for both supply and consumption projects, including data centers, are long and complex,” Martinos said. “Building new transmission lines can take up to eight years in advanced economies, and wait times for critical grid components have doubled in the past three years. Generation equipment is also in high demand, with turbine deliveries for new gas‑fired power plants now facing lead times of several years.”
Areas with the largest concentration of data centers, such as Northern Virginia, Dallas and Silicon Valley in the United States, are most likely to suffer the consequences of a power shortage. That could include rolling blackouts and other mandatory conservation efforts. Lack of available power could also put 20 percent of currently planned data center projects at risk of delay, Martinos said.
“If the electricity sector does not step up to meet these challenges, especially in the most impacted regions, there is a risk that meeting data center load growth could also entail trade‑offs elsewhere,” he added. “High electricity demand from data centers could impact priorities in other sectors, such as the electrification of homes or the manufacturing industry, or the affordability of electricity.”
To avoid these scenarios, data center owners are exploring options for adding power to the grid. Those solutions are not always realistic.
“Some of the major data center suppliers say they can use renewable energy,” Johnson said. “Well, the reality is that renewable energy doesn’t really work very well for data centers. You have to have 24/7 reliability, and wind and solar just aren’t that reliable.”
Currently, the only way to obtain the kind of power required for some of these data centers is through fossil fuels, he added. But using gas or hydrogen fuel cells for power generation has environmental consequences—more carbon dioxide released into the air. Hydropower isn’t an option, because little new capacity is available, and communities are often not receptive to the idea of building new hydropower plants.
Some promising new technologies could help. One effort involves using solar power to generate hydrogen through electrolysis; the hydrogen would then be used to run power generation 24/7. Other companies are trying to tap the geothermal energy (superheated steam) a mile or so below the Earth’s surface.
The Nuclear Option
Johnson considers small modular nuclear reactors (SMRs) the most viable long‑term solution for powering data centers. “SMRs ensure independence from grid power fluctuations by providing dedicated on‑site power for large data centers,” he said. “These advanced nuclear fission reactors offer reliable, uninterrupted power without emissions, making them a key solution for transitioning from fossil fuels.”
SMRs have components that can be factory‑made and transported for assembly on‑site. This gives data centers location flexibility without relying on local commercial power availability.
There are hurdles that must be overcome before SMRs can begin providing power. They produce radioactive waste, although one potential solution is reprocessing that waste into usable fuel, leaving material that is significantly less radioactive and easier to dispose of. High costs and long approval times, especially for first‑generation SMRs, could further slow their adoption.
But Johnson is convinced these issues can be overcome.
“SMRs are the best green power solutions you’ve got,” he said. “They’re reliable, they go 24/7 and you don’t have to worry about the weather. Just the basic physics of them makes them very good for grid stabilization.”
Is AI the Answer?
The IEA report found that 50 percent of data centers under development in the United States are in pre‑existing large clusters, such as Northern Virginia. That raises the risks of local bottlenecks. Grid operators may be able to avoid these outcomes by offering tech companies incentives to locate data centers in areas where grids are less congested.
MHI Solutions Improving Supply Chain Performance