An aerial view of the QTS data center under construction in Phoenix, Arizona.

How to power the AI economy while supporting local communities

July 9, 2025
Updated on July 23, 2025
Wirestock Creators // Shutterstock


American leadership in artificial intelligence promises massive economic opportunities and transformation across industries: AI systems are poised to drive breakthroughs in drug discovery, enable critical infrastructure, and revolutionize sectors from healthcare to finance.

To drive these shifts, AI needs a massive amount of energy. By 2028, AI data centers are expected to account for a substantially larger share of U.S. electricity demand. Data centers' median size will more than double over the next decade, according to a report from one power provider. Some will require more than a gigawatt of power, enough to supply over 800,000 average American homes.
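A quick back-of-the-envelope check makes the gigawatt comparison concrete. The sketch below, in Python, assumes an average U.S. household uses roughly 10,800 kWh per year (an approximate, illustrative figure, not from the article) and converts a 1-gigawatt continuous load into an equivalent number of homes.

```python
# Rough arithmetic: how many average U.S. homes a 1 GW data center load equals.
# Assumption: ~10,800 kWh of annual consumption per household (approximate,
# illustrative value; the article itself does not state this number).

ANNUAL_HOME_KWH = 10_800
HOURS_PER_YEAR = 8_760

avg_home_demand_kw = ANNUAL_HOME_KWH / HOURS_PER_YEAR   # ~1.23 kW continuous draw
data_center_demand_kw = 1_000_000                        # 1 gigawatt in kilowatts

homes_equivalent = data_center_demand_kw / avg_home_demand_kw
print(f"Average home draw: {avg_home_demand_kw:.2f} kW")
print(f"Homes equivalent to 1 GW: {homes_equivalent:,.0f}")   # roughly 810,000
```

Under those assumptions, a single gigawatt-scale facility draws as much power as roughly 810,000 homes, consistent with the "over 800,000" figure above.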

Balancing AI growth with local community well-being

Many Americans are concerned about how AI's hunger for energy will affect them. Communities around the country are pushing back against the construction of new data centers nearby, despite the facilities' potential to deliver local economic benefits.

Chief among communities' concerns are higher energy bills and how new data centers could strain the local grid. Growing pressure on the grid also increases the risk of power outages, and communities are alarmed about data centers' potential pollution, noise, and water and land use.

Powering the AI economy without local communities bearing the brunt of higher energy bills, power outages, and pollution is a challenge, to be sure. But specific types of onsite power can create a "yes, and": fueling AI's growth while also supporting communities.

What is onsite power?

Onsite power means physically producing electricity at the location where it's used. For example, fuel cells are a type of onsite power that converts natural gas, hydrogen, or biogas into electricity without using combustion. Other power sources that data centers can deploy onsite include natural gas engines and turbines.

When data centers use certain kinds of onsite power, it creates several advantages for both local communities and AI's growth.

  1. Lower impact on consumer energy costs. Some facilities use onsite power to go "off-grid," meaning they don't connect to the electricity grid at all, so their energy costs don't affect local residents' utility bills. Others use onsite power as a primary source while maintaining a connection to the grid.

Utilities can also work directly with onsite power providers to produce energy for data centers in their service areas and "fence off" that coverage, preventing data centers' costs from being passed on to consumers.

Approaches like these help keep consumer utility bills stable while providing reliable power for data centers to fuel AI.

  2. Less pollution. Some onsite power technologies like fuel cells produce significantly lower emissions than traditional energy sources because they don't use combustion. Fuel cells also cut certain air pollutants by more than 99% compared to traditional energy generation technologies, and use a fraction of the water that legacy combustion generators need.
  3. Smaller footprint. High power density, producing relatively large amounts of power in a given space, is a major benefit of some onsite power sources. Gas turbines and reciprocating engines are already compact, and fuel cells can deliver roughly double their power density: up to 100 megawatts on less than an acre (see the back-of-the-envelope sketch after this list). This small footprint can help ease communities' concerns about data centers' visual impacts and the loss of open space.
  4. Quieter. Excess noise is a common complaint among many communities near data centers. Onsite power sources like fuel cells run considerably quieter than combustion-based generators, since they produce electricity without combustion.
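To put the footprint point in perspective, here is a short Python sketch that converts the "up to 100 megawatts on less than an acre" figure for fuel cells into more familiar units. The halved combustion baseline is an assumption made only to illustrate the "double that" comparison, since the article's specific figure for turbines and engines is not reproduced here.

```python
# Rough power-density comparison. The fuel cell figure is from the article
# ("up to 100 MW on less than an acre", treated as 1 acre for an upper bound);
# the combustion baseline is an assumption for illustration only.

SQ_METERS_PER_ACRE = 4_046.86

fuel_cell_mw = 100.0
fuel_cell_acres = 1.0

fuel_cell_mw_per_acre = fuel_cell_mw / fuel_cell_acres
fuel_cell_kw_per_m2 = fuel_cell_mw * 1_000 / (fuel_cell_acres * SQ_METERS_PER_ACRE)

# Assumed baseline at half the density, consistent with the "double that" comparison.
combustion_mw_per_acre = fuel_cell_mw_per_acre / 2

print(f"Fuel cells: {fuel_cell_mw_per_acre:.0f} MW/acre "
      f"(~{fuel_cell_kw_per_m2:.1f} kW per square meter)")
print(f"Assumed combustion baseline: {combustion_mw_per_acre:.0f} MW/acre")
```

Under these assumptions, the generation for a gigawatt-scale campus could fit on roughly 10 acres of fuel cells, which is the kind of comparison behind the footprint argument above.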

Aiming for a win-win for AI and communities

Interest in onsite power generation for data centers is rising and is projected to keep growing through 2030.

This will help data centers deliver on their promised local economic benefits, and on AI's broad transformative potential, without the adverse effects of higher energy costs, power outages, and more pollution and noise in American communities.


