Extrapolating AI Data Center Power Demand and Assessing Its Potential Impact on U.S. Competitiveness
Larger training runs and widespread deployment of future artificial intelligence (AI) systems may demand a rapid scale-up of computational resources (compute), requiring unprecedented amounts of power. In this RAND report, the authors extrapolate two exponential trends in AI compute to estimate AI data center power demand and assess its geopolitical consequences. They find that, globally, AI data centers could need 10 gigawatts (GW) of additional power capacity in 2025, more than the total power capacity of the state of Utah. If exponential growth in chip supply continues, AI data centers will need 68 GW in total by 2027, almost a doubling of global data center power requirements from 2022 and close to California's 2022 total power capacity of 86 GW.
Given recent growth in training compute, data centers hosting large training runs pose a particular challenge. If current scaling trends persist, a single training run could demand up to 1 GW in one location by 2028 and 8 GW, equivalent to eight nuclear reactors, by 2030.
The United States leads the world in data centers and AI compute, but exponential demand leaves the industry struggling to find enough power capacity to rapidly build new data centers. Failure to address bottlenecks may compel U.S. companies to relocate AI infrastructure abroad, potentially compromising the U.S. competitive advantage in compute and AI and increasing the risk of intellectual property theft.
More research is needed to assess bottlenecks for U.S. data center build-out and identify solutions, which may include simplifying permitting for power generation, transmission infrastructure, and data center construction.
Key Findings
Exponential growth in AI computation is driving unprecedented power demands that could overwhelm existing infrastructure
- Global AI data center power demand could reach 68 GW by 2027 and 327 GW by 2030, compared with total global data center capacity of just 88 GW in 2022.
- Individual AI training runs could require up to 1 GW in a single location by 2028 and 8 GW by 2030, although decentralized training algorithms could distribute this power requirement across locations.
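The projections above imply roughly constant exponential growth. As an illustrative plausibility check (a sketch based only on the figures quoted in this summary, not the report's actual extrapolation methodology), a few lines of code can back out the annual growth rates those endpoints imply:

```python
def implied_annual_growth(start_gw: float, end_gw: float, years: int) -> float:
    """Constant annual growth factor that takes start_gw to end_gw over `years` years."""
    return (end_gw / start_gw) ** (1 / years)

# Global AI data center demand: 68 GW (2027) -> 327 GW (2030)
global_rate = implied_annual_growth(68, 327, 3)

# Largest single training run: 1 GW (2028) -> 8 GW (2030)
training_rate = implied_annual_growth(1, 8, 2)

print(f"Implied global growth: ~{(global_rate - 1) * 100:.0f}% per year")
print(f"Implied training-run growth: ~{(training_rate - 1) * 100:.0f}% per year")
```

Under this constant-rate assumption, global AI data center demand grows roughly 70 percent per year, while the power drawn by the largest single training run nearly triples each year, consistent with the report's characterization of both trends as exponential.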
Permitting challenges for power infrastructure and data centers are causing significant delays to data center projects
- Insufficient power generation is lengthening queues for grid connections, with connection requests taking four to seven years in key regions such as Virginia.
- Transmission line projects face complex multistate permitting processes and local opposition, delaying power delivery to suitable sites.
- Data centers struggle with local and state permits, particularly for on-site backup generators and environmental impact assessments.
- Environmental commitments and regulations limit the use of readily available power sources, forcing reliance on harder-to-scale renewable options.
A lack of data center infrastructure in the United States could shift construction to other countries
- U.S. companies are exploring expansion in countries offering better power availability and faster permitting.
- Countries with more compute access can deploy AI at larger scale, potentially gaining economic and military advantages.
As AI models get more capable, securing compute becomes increasingly challenging, particularly abroad
- Infrastructure hosting advanced AI models will likely face sophisticated cyberattacks.
- These risks increase significantly when compute is located outside U.S. borders, where oversight is limited.
Recommendations
- Model future power grid supply against data center demand, factoring in reliability requirements and power usage effectiveness.
- Research efficiency improvements that could reduce power requirements for AI, such as more energy-efficient AI chips.
- Study compute scaling bottlenecks, including latency constraints and data scarcity effects.
- Analyze how environmental review processes and permitting requirements affect power generation and transmission infrastructure.
- Evaluate emerging power sources for AI workloads, including small modular reactors and geothermal energy.
- Assess how federal authorities, such as the Defense Production Act, could address energy shortfalls.
- Examine private-sector capacity to fund and develop necessary power infrastructure.