Data centers and energy, Part 2: Truths and myths

Source: S&P Global Media Portal via S&P Global.

The first two years of the generative AI revolution have brought a whirlwind of change to the data center and energy sectors, raising many questions about how the industry will move forward as it reels from a shortage of power. Will nuclear or gas be able to meet the surging demand? Can data centers ease the power problem themselves by being more flexible with their loads? Leading experts at S&P Global Inc. address these concerns and more, working to clarify thinking on the subjects as the industry moves toward an uncertain future. S&P Global Market Intelligence 451 Research recently hosted a webinar in which a team of experts addressed these issues and weighed in on whether each is truth, myth or something in between.

The Take

The GenAI revolution has been a source of scrutiny and speculation. The hyperscalers are investing heavily in nuclear, hoping to procure large amounts of clean energy. Long lead times, however, make these projects difficult to execute in a timely manner. Gas-fired power generally takes less time to deploy, but high demand has created supply chain delays that prevent it from being a quick fix. While behind-the-meter power is getting more airtime, it is also not an ideal scenario — data center operators are in the business of data centers, not power generation. With increasing shortages of qualified workers across both industries, behind-the-meter power for data centers only compounds the competition among providers for skilled labor.

Ideally, data center operators will partner with experienced firms to support on-site generation. Some are spinning off sister companies, such as EdgeConneX’s PowerConneX, to support growth across multiple markets, but spinoffs into additional industries or segments are not necessarily ideal from funding and management standpoints. Even as operators find more workarounds to power shortages, policy will largely dictate what type of power is available, how much they can consume, where it is available and how much it will cost to build in those areas.

Truth or myth

Our team of experts was composed of Tony Lenoir, an associate director focused on the power and renewables space; Adam Wilson, a senior principal analyst specializing in wind, solar and energy storage research; and Dan Thompson, principal research analyst leading the data center services and infrastructure team, who is charged with keeping tabs on data center trends globally. In the webinar, they addressed the following statements and questions:

Nuclear is the answer to future data center energy needs

There is no question that nuclear is a good fit for data centers. Nuclear is clean and firm, meaning it is reliable and available 24/7. It can meet the growing energy needs for data centers and help to satisfy big-tech energy commitments. Significant investments have been flowing into the industry, largely driven by the hyperscalers.

As far as nuclear being a silver bullet for data center energy, unfortunately, it is not so easy due to the time it takes to deploy new nuclear capacity. Historically, nuclear has dealt with two major bottlenecks: permitting and financing. On the permitting front, the Nuclear Regulatory Commission is the primary authority for nuclear licenses and regulations in the US. All nuclear power plant applications are subject to an environmental review, a safety review and an antitrust review, a process that historically takes three to five years. On the investment front, nuclear projects have been challenging to finance because of their capital intensity. None of this accounts for the three to six years a project will spend in interconnection queues.

This all amounts to lengthy and highly variable timelines for the development of new nuclear power plants. Combining the planning, permitting, pre-build and build stages, the total time to bring a new plant online spans 11-25 years. This lengthy process shows a significant misalignment with current data center needs. Nuclear looks great on paper, but actually deploying new nuclear capacity is more like a marathon than a sprint. Most other forms of energy generation or storage take significantly less time to deploy, with battery and solar being the quickest at three and five years, respectively. Gas and onshore wind take about six years from planning to operation.

Nuclear is at the center of the new US administration’s power road map. Among other things, the US has reaffirmed its commitment to the COP 28 declaration to triple global nuclear capacity by 2050. For the US specifically, the goal is to expand nuclear capacity from about 90 GW today to roughly 400 GW. On May 23, 2025, four executive orders were signed in support of nuclear energy. The objectives of these orders are to encourage investment into new and existing nuclear projects, streamline the permitting process, speed up the deployment of advanced nuclear reactors, and to develop and strengthen the domestic nuclear industrial base by returning to domestic mining and enrichment of uranium.

At the beginning of the year, there were questions about the fate of tax incentives for nuclear development. At the time, public statements by government officials and actions being taken suggested cautious optimism. Fast forward to now, and nuclear incentives from the Inflation Reduction Act survived the recently approved US tax and spending bill. We can still be cautiously optimistic that the policy environment will continue to evolve in the right direction, particularly on the permitting front, which leads us to expect development timelines to compress over the next several years.

Will small modular reactors be better than traditional nuclear?

This technology is still in the early stages, but on paper, small modular reactors (SMRs) should take less time to develop. They also now have the full weight of the US government to make that a reality. The big difference between SMRs and traditional nuclear plants is that SMRs will come with a more prefabricated or modular approach to manufacturing, which in theory will accelerate construction (and lower costs over time). This all remains to be seen, however, as we don’t expect the first SMR to come online until the early 2030s.

Gas generation will save the day while we wait for nuclear

Data center owners and power producers like gas-fired power because it is scalable and dispatchable. Gas power also offers the ability to implement carbon capture technology. This makes gas plants a viable component in the overall transition to low-carbon energy. Interest in gas power generation has skyrocketed. This has led to a boom in gas turbine orders in the last 18 months. In 2024, orders for new turbines topped 14 GW, which is the highest amount since 2001. Momentum remains strong this year: In the first quarter, the amount of capacity ordered came in at 7 GW. The backlog of the 60-Hz turbines used in the US was about 40 GW as of March 31. Furthermore, if we look at interconnection queues as of June 2025, capacity for gas projects was up 159% in the US year over year.

With all that said, original equipment manufacturers are “suffering from their own success,” with growing backlogs and maxed-out supply chains stretching delivery times. All the demand is also pushing the price of gas turbines higher. This brings into question the competitiveness of gas-fired generation against other energy technologies, particularly solar and battery storage. Expediting the manufacturing capacity of the materials is a challenging proposition. Parts such as rotors and blades are produced by only a small number of global providers, with lead times of 12-24 months, and the specialty alloys they require, which depend on rare earth metals, are expensive and difficult to produce. Ultimately, gas does not seem to be the quick fix that will save the day.

Data centers can solve their power-generation limits by adjusting load profile or location to conform with power grid limitations

Historically, this would be considered a myth; however, it is beginning to evolve toward truth. Earlier this year, an article from Duke University suggested more capacity could be found on the grid if data centers and other large loads could be more flexible. The challenge with the world as it is today is that data centers have not been designed with flexibility in mind. They do not have systems in place to respond to the grid. Asking existing data centers to come off-grid today would amount to activating diesel generators, which, under US Environmental Protection Agency (EPA) regulations, is only allowed in the case of an emergency. While some states have defined an emergency as instances where the grid is short on power, most would agree that data centers running on diesel is not the best option.

Even so, hyperscalers are leaning into the idea. Google, for instance, has found that by shifting the load of its air-conditioning (cooling) systems, it could potentially shave enough demand to be beneficial to the power grid. Google’s next generation of data centers may be able to do this. If companies have time to design for it, future data centers could employ flexibility; as it stands, the design is not in place.

The notion of shifting workloads has its own challenges. Moving a workload to reduce power demand in one location means the company must invest in similar or duplicate infrastructure somewhere else. The hyperscalers, particularly Google, are already doing this to some extent. If you have an Android phone and wake up in the morning to find people tagged in your photos or a reminder of what you were doing years ago, there is a reason that processing happens overnight: the load is shifted to hours when power grids are less stressed. This, however, is quite context-specific, and many workloads cannot be shifted temporally because they are vital to operations. While this sort of flexibility on a large scale is currently a myth, companies are working toward a future where this is true.
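The temporal shifting described above can be illustrated with a minimal scheduling sketch. The grid-stress forecast and job below are hypothetical, purely for illustration; real operators would work from utility demand-response signals or day-ahead prices rather than a hard-coded list:

```python
from dataclasses import dataclass

# Hypothetical hourly grid-stress forecast for one day
# (0 = relaxed, 1 = strained); overnight hours are calmest.
GRID_STRESS = [0.2, 0.1, 0.1, 0.15, 0.3, 0.5, 0.7, 0.9,
               0.95, 0.9, 0.85, 0.8, 0.75, 0.8, 0.85, 0.9,
               0.95, 1.0, 0.9, 0.7, 0.5, 0.4, 0.3, 0.2]

@dataclass
class BatchJob:
    name: str
    duration_hours: int  # contiguous run time required

def schedule_off_peak(job: BatchJob, stress=GRID_STRESS) -> int:
    """Return the start hour minimizing total grid stress
    over the job's contiguous run window."""
    candidates = range(len(stress) - job.duration_hours + 1)
    return min(candidates,
               key=lambda h: sum(stress[h:h + job.duration_hours]))

job = BatchJob("photo-tagging", duration_hours=3)
start = schedule_off_peak(job)
print(f"Run {job.name} starting at hour {start:02d}:00")  # an overnight slot
```

A deferrable batch job (photo tagging, index rebuilds) lands in the calmest window; a latency-critical workload would simply never enter such a scheduler, which mirrors the article's point that only some workloads can be shifted.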

Data center companies can generate their own energy entirely rather than rely on the grid

Data centers are still largely reliant on the grid; however, there is a lot of movement toward behind-the-meter power. Hyperscalers don’t necessarily want to do this as it introduces more variables where things can go wrong, but it is becoming the fastest route to power in some places. Utility power is also generally viewed as reliable, and it is one less thing to have to deal with. It is possible for the facilities to be supported by behind-the-meter power — one example is xAI in Memphis. It doesn’t seem to be a great long-term or permanent solution. What the push in this direction shows is the willingness of these companies to do whatever it takes to get facilities online and servicing customers.

