Data centers: you can either be for them, associating them with economic development and AI innovation, or against them, wary of environmental consequences and AI’s reach. Or maybe you’re somewhere in between.
One thing is certain: Data centers consume vast resources to power and cool the thousands of servers that run nonstop to store, process and transmit digital data. This demand strains local power grids and contributes to carbon emissions, water use and infrastructure challenges.
Data centers have powered the internet for decades (just ask anyone familiar with Northern Virginia’s “Data Center Alley”). But here’s the latest wrinkle: more AI means more computing power, greater demand for data centers and heavier strain.
If you’re worried, the answer is less AI, right? Cutting back can’t hurt, but simply dropping generative chatbots and going back to search engines won’t solve the problem. When people say, “Don’t use AI. Just Google it instead,” they overlook a key fact: Google searches need plenty of energy, too.
The oft-cited claim that an LLM query uses 10 times more energy than a Google search comes from a 2023 study by researcher Alex de Vries, backed up by a 2024 Goldman Sachs report.
But let’s take a closer look, with a grain of salt, because we’re relying on company claims, not third-party stats.
As of 2025, the average ChatGPT prompt uses about 0.34 watt-hours, or “about what an oven would use in a little over one second,” according to OpenAI’s Sam Altman.
The stat for a Google search, which comes from a 16-year-old company blog post, is about 0.3 watt-hours. If we trust the companies, that’s a pretty close tie, albeit an outdated one, since Google hasn’t released new stats since that post.
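For the curious, here’s a rough back-of-the-envelope check of those two figures. It’s only a sketch: it takes both companies’ numbers at face value, and the 1,000-watt oven is my own ballpark assumption, not anything OpenAI has specified.

```python
# Back-of-the-envelope check of the per-query energy figures cited above.
# Inputs are company claims, not independently verified measurements.

CHATGPT_WH_PER_PROMPT = 0.34  # OpenAI's 2025 figure, per Sam Altman
GOOGLE_WH_PER_SEARCH = 0.30   # Google's figure from its 2009 blog post
OVEN_WATTS = 1000             # assumed oven draw; real ovens vary widely

# How the two per-query figures compare (the oft-cited claim is 10x).
ratio = CHATGPT_WH_PER_PROMPT / GOOGLE_WH_PER_SEARCH
print(f"ChatGPT prompt vs. Google search: about {ratio:.1f}x")

# Seconds of oven runtime equal to one prompt's energy use.
oven_seconds = CHATGPT_WH_PER_PROMPT / OVEN_WATTS * 3600
print(f"Oven time per prompt at {OVEN_WATTS} W: about {oven_seconds:.1f} seconds")
```

Run it and you get a ratio of about 1.1, nowhere near 10, and roughly 1.2 oven-seconds per prompt, which lines up with Altman’s “a little over one second.”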
Google has come a long way since last updating its search stat. On one side, data centers have grown more efficient. On the other, the tech giant has leaned hard into AI. With no easy way to turn off AI Overviews, the tech is now part of every query.
Today, when you do a quick Google search, you also “do AI.” Other AI features are quietly built into tools you already use, like Apple Intelligence and Microsoft Copilot.
Data centers aren’t going away because computing isn’t going away, even if the AI bubble pops tomorrow. As for solutions to the energy conundrum, it depends on who you ask.
Supporters say the good outweighs the bad, pointing to job creation and private investment, though much of that growth comes from construction rather than long-term operations.
Residents, especially in rural areas, often see it differently, worrying about water use, resource strain and landscape erosion. Activists, meanwhile, argue that the focus should be on reducing AI use and ensuring that data centers run on renewable energy sources — not fossil fuels like natural gas.
Data centers are part of the AI story, but they aren’t only about AI. This isn’t a defense of AI, whose environmental impact extends far beyond what we’ve seen before, but a recognition that every digital action contributes to data centers’ growing footprint. And a new facility could soon appear closer to home than you might expect.
Is your region showing signs of data center expansion, such as zoning changes, industrial investment or energy infrastructure buildouts? Let me know.