
OpenAI Warns the White House: Don't Power AI With Natural Gas

OpenAI warned the White House that powering AI data centers with natural gas would harm the environment and recommended renewable energy instead, outlining the enormous electricity demands behind modern AI systems.

Martin Holloway · Published 3 weeks ago · 5 min read · Based on 1 source
In October 2024, OpenAI sent a detailed letter to the White House's science and technology office about a growing problem: the enormous amount of electricity that artificial intelligence systems need. The core message was clear: using natural gas to power the data centers that run AI would create serious environmental damage, and there are better options.

Think of a data center as a giant warehouse filled with computer servers stacked tightly together, all working around the clock. AI systems require far more power than normal computer operations—they demand constant, intense computing across thousands of machines simultaneously. Right now, companies are struggling to keep up with this demand while still meeting their promises to reduce carbon emissions.

Why AI Uses So Much Power

Training an advanced AI system takes months of continuous computing work across thousands of specialized processors. Once the AI is built, answering user questions in real time also requires enormous computing power spread across different locations.

The power demand is strikingly high. A standard office data center might use 5 to 10 kilowatts per rack of servers. AI data centers routinely use 40 kilowatts per rack—and some can reach 100 kilowatts when using advanced cooling systems. This creates real challenges: the buildings need much stronger electrical systems, and cooling all that heat becomes a major engineering problem.
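The gap between the figures above is easiest to see with back-of-the-envelope arithmetic. A minimal sketch in Python, using the per-rack numbers from the article and a hypothetical rack count of 1,000 (the count is an illustration, not a figure from OpenAI's letter):

```python
# Back-of-the-envelope facility power estimate.
# Per-rack kilowatt figures come from the article; the rack count is hypothetical.
RACKS = 1_000

def facility_power_mw(kw_per_rack: float, racks: int = RACKS) -> float:
    """Total IT load in megawatts for a facility of identical racks."""
    return kw_per_rack * racks / 1_000  # convert kW to MW

office_mw = facility_power_mw(10)    # standard office data center, upper end
ai_mw = facility_power_mw(40)        # routine AI rack
dense_mw = facility_power_mw(100)    # AI rack with advanced cooling

print(f"Office: {office_mw:.0f} MW, AI: {ai_mw:.0f} MW, Dense AI: {dense_mw:.0f} MW")
```

At the same rack count, the AI facility draws 4 to 10 times the power of a conventional one, which is why the electrical and cooling systems have to be redesigned rather than upgraded in place.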

The Natural Gas Problem

Worth flagging: OpenAI's letter explicitly warned against using natural gas as a quick fix. The company argued that gas-powered data centers would create carbon emissions equivalent to entire countries' yearly output. This directly contradicts the broader goal—across the tech industry and globally—of reducing greenhouse gas emissions.

Companies are considering natural gas because electricity grids in major U.S. regions don't have enough renewable power available right now. But OpenAI pushed back: that's not a good enough reason.

The Better Path Forward: Renewables and Storage

OpenAI's proposal is to build dedicated renewable energy sources—solar farms and wind installations—directly connected to data centers through long-term contracts with power companies. This is more expensive and takes more time to set up than just connecting to existing natural gas plants, but OpenAI argued it's necessary.

The company also highlighted the role of battery systems and other energy storage devices. Since the sun doesn't always shine and the wind doesn't always blow, storage keeps AI systems running smoothly even when renewable sources aren't producing at full output.

Regulatory Roadblocks

OpenAI's letter raised another issue: getting all the necessary government approvals is slow. Federal agencies, state utility commissions, and local governments all have a say in whether a new data center and its power sources can be built. That can take years.

Analysis: This is a real tension. Renewable energy projects to power AI often wait in interconnection queues for three to five years before they can even connect to the electrical grid. Meanwhile, AI development timelines move faster. OpenAI suggested streamlining these approvals for AI infrastructure projects while still protecting the environment.

Why This Matters for America

OpenAI also touched on a competitive concern: if companies can't build enough data centers in the United States, they may move AI development overseas. That could mean losing economic advantage and reducing control over where sensitive artificial intelligence systems are built and trained.

Some government and enterprise customers need their AI systems to stay within U.S. borders for security or legal reasons, which requires even more domestic computing capacity.

Looking at the Bigger Picture

This debate echoes similar struggles from the past. When cloud computing took off in the 2000s, the first priority was speed—build it fast, worry about efficiency later. Eventually, as the industry matured, environmental and operational concerns became just as important. OpenAI's submission suggests that AI infrastructure is following the same arc, starting with raw capacity but gradually incorporating environmental responsibility.

Design and Supply Challenges

OpenAI also outlined technical recommendations: AI data centers should be purpose-built from scratch rather than adapted from existing buildings. Purpose-built facilities can use more efficient cooling systems and electrical design. The company emphasized the need for fast connections between servers during training and between data centers for answering user queries worldwide.

One constraint that limits how fast companies can expand: there simply aren't enough AI chips available. OpenAI recommended that the U.S. government support domestic chip manufacturing to ensure enough supply.

The Bigger Shift

In this author's view, OpenAI's message to the White House marks an important moment. A leading AI company is saying that building sustainable infrastructure isn't just good for the planet—it's essential for the business itself. That's a significant shift from earlier days of tech development, when environmental costs often came later, if at all.

By tying environmental responsibility to competitive advantage and national security, OpenAI's submission could influence how the federal government directs support for AI development. It may also push other AI companies toward the same sustainable approach, creating faster industry-wide change than government rules alone might achieve.