The area, already home to multiple data centers — a foundation for AI and other cutting-edge tools used by governments — is eager to attract more of those facilities. But there's a potential hurdle: Power.
Utah County Commissioner Amelia Powers Gardner is confident that the county, which includes the city of Provo and parts of the Salt Lake City metro area, is ready to meet that challenge, even if the mainstream has yet to fully grasp the vital role energy plays in the future of artificial intelligence.
“I still think AI is throttled somewhat because of the lack of servers,” Gardner told Government Technology, echoing a view common among backers of the quickly growing tech that state and local agencies are adopting at a dizzying pace.
Part of the reason is the massive power needs of AI. Some experts estimate that completing a task via generative AI can gobble up 33 times more energy than other “task-specific software.” Simply put, the ongoing growth of AI and data centers will require more than running power lines to the facilities and flipping a switch.
“All these things depend on power,” Gardner said.
But Gardner is optimistic. Utah’s new Operation Gigawatt calls for increased transmission capacity and the development of nuclear and geothermal energy sources, among other measures. The goal is to double the state’s power production within a decade, enabling it to handle not only AI but also electric vehicles and other innovations.
Such a rosy view, however, may soon face some harsh tests, assuming trends hold. As AI moves into daily life and tech companies rely more on giant data centers, the energy needs of all that progress are coming into clearer view, with potential hurdles that could influence both the use and reputation of some of the newest tools available to public agencies.
“While the environmental impacts of AI, especially generative AI, are starting to get a little more air time in mainstream outlets, many people, I think, are still unaware of the scale and the severity of the issue,” said Leila Doty, privacy and AI analyst for San Jose, Calif., at a recent AI conference.
She said that by 2030, AI in the U.S. could be using more electricity than all the refrigerators in the country.
The inescapable math of the issue will eventually demand more awareness. Various data points hammer home the reality:
- A single data center can consume “three to 15 times” the power needed by 100,000 homes. A single megawatt can power just over 600 homes in Utah, according to an Operation Gigawatt fact sheet; a single data center or AI facility, by contrast, consumes hundreds of megawatts.
- AI’s current rate of growth requires the doubling of computing power every 100 days.
- Thanks in large part to data center energy needs, Google’s greenhouse gas emissions were about 50 percent higher in 2023 than in 2019. Microsoft, meanwhile, reported a 30 percent increase in carbon dioxide emissions since 2020.
- By 2030, data centers could account for 9 percent of U.S. electricity use, according to the Electric Power Research Institute, a nonprofit.
POWER STRUGGLES — GREEN OR OTHERWISE
Some are looking to retro solutions: witness Microsoft’s plan to reopen the Three Mile Island Nuclear Generating Station so that it can power data centers and AI. A partial core meltdown at the plant in 1979 contributed to a backlash against nuclear power in the U.S. The effects are still felt today, particularly the long-term impact of the halt in new nuclear plant construction.
Amazon is also turning to nuclear. Last year the company reportedly signed a 10-year power purchasing agreement with the Susquehanna nuclear plant in Pennsylvania to bring energy to a data center predicted to need 960 megawatts.
Getting past the potential power-need hurdles — and quickly — means that utilities are “reinvigorating coal plants,” said Tamara Kneese, who directs a new climate and justice program at the Data and Society Research Institute.
“Companies powering data centers are not actually using green energy,” Kneese said, “in part, because there isn’t enough green energy, prompting the need for more traditional energy generation.”
Tech providers also are trying to get themselves closer to the front of the line when it comes to power, according to reports. That means deals with utilities that let tech providers — the companies behind data centers and AI — basically plug directly into power plants, raising questions about how much relatively cheap power is left over for regular consumers and other customers.
“The companies, they’re very frustrated because they have a business opportunity now that’s really big,” Bill Green, the director of the MIT Energy Initiative, recently told The Associated Press. “And if they’re delayed five years in the queue, for example — I don’t know if it would be five years, but years anyway — they might completely miss the business opportunity.”
All this matters to state and local governments because more of their employees and constituents are coming to rely on AI.
While the main AI-related worries of those government workers tend to focus on governance, ethics and failure to align with goals, some 30 percent of survey respondents were concerned about the “lack of technology infrastructure” for AI.
Further embrace of AI in the public sector will no doubt depend at least indirectly on how much power can be generated and transmitted — without taking away relatively cheap power from other sectors of the economy, which could anger constituents and even leaders of other businesses.
Intashan Chowdhury, former borough administrator of Prospect Park, N.J., said the borough uses AI as a customer service tool to answer a variety of residents' questions. He stressed the importance of transparency with residents, around both the costs and benefits of AI.
“Municipalities must clearly communicate the benefits of AI — such as improved traffic flow, enhanced public services and cost savings — while outlining efforts to address environmental concerns,” he said.
Indeed, transparency is a big AI concern among the public, at least according to a Gallup poll released in August.
PUBLIC TRUST ISSUES
In a sense, people in the U.S. are still making up their minds about AI. The poll found that 56 percent of Americans view AI as having a “net neutral effect — doing equal amounts of harm and good.”
But only 31 percent of those polled say that AI does more harm than good, down from 40 percent the previous year.
The energy needs of AI did not register as a concern among those who took part in the poll; worries focused instead on AI taking away jobs and whether businesses will use the technology in trustworthy ways. Still, transparency could go a long way toward making more people fans of AI.
But rather than call for energy-related transparency, poll respondents said they want businesses to be open about how they use the technology, what data is used to train AI models, what privacy protections are in place and other issues.
Translated to the public sector, government agencies would do well to educate themselves about AI, including its limits, strengths and costs, said Irina Raicu, director of the Internet Ethics program at the Markkula Center for Applied Ethics at Santa Clara University.
Not only that, but government has a strong card to play: procurement. Government could use its procurement power “to put things into contracts,” Raicu said at that recent AI conference. Doing so could help agencies bring more transparency to AI, or even set environmental standards anchored around AI tools.
It’s not only energy that matters. AI and data centers require massive amounts of water for electricity generation, cooling and chip production, as speakers at that conference noted.
A Google data center in Oregon, for instance, reportedly used more than 355 million gallons of water in 2021, triple the amount from five years earlier. The situation led to a lawsuit filed by the municipality where the center is located; the facility had consumed more than one-quarter of the area’s water supply.
More generally, a mid-sized data center needs 300,000 gallons of water each day, according to one estimate. That’s enough water for 1,000 households in the U.S. All told, the country may have more than 3,000 data centers, although some counts exceed 5,000. About one-fifth of them are located in areas with stressed watersheds.
THE SEARCH FOR THE ANSWER
The power issue so far has not seemed to discourage backers of AI and data centers, nor has it meant that further growth in AI is doomed.
Talk about using microgrids to deal with power needs, for example, seems to be building.
Microgrids are essentially small-scale power plants, according to one description, and they can help “balance load demand, integrate renewables, and reduce reliance on centralized grids, ensuring both reliability and sustainability.”
Such operations can be built more cheaply and quickly than traditional power plants, and offer flexibility as data centers expand or move to new energy sources.
Back in Utah, County Commissioner Gardner and her allies are putting their faith not only in nuclear power but also in the state’s geothermal and natural gas capabilities, to which more consumers are turning as coal consumption continues to decline.
“We need to be looking at clean energy,” she said, cognizant that many voters view clean energy and climate change as vital issues.
With solar power growing in the state — it accounted for 11 percent of Utah’s net generation of electricity in 2023, up from 0.1 percent in 2015, according to the U.S. Energy Information Administration — she can even imagine data centers running mainly on natural gas while using solar as a supplemental power source.
Some public officials and government technology vendors are looking beyond that, and they argue that AI can help solve its own power problems.
“The cost of not adapting AI tools could potentially cost more energy,” said Polimorphic CEO Parth Shah, whose company has a stake in the matter, as it sells AI-based constituent services, including those used in Prospect Park, N.J.
The basic logic is that AI will help energy companies, utilities and government agencies find ways to conserve energy and deliver services more efficiently.
“A single call to a trained AI model eliminates the need for extensive paperwork, employee hours spent navigating bureaucratic processes and other costs of manual work,” Shah said. “Multiply that by the thousands, or even millions, of phone calls received by an agency annually.”
Local government leaders such as Chowdhury also point to recent projects such as one in Tucson, Ariz., where AI-powered traffic management technology has reduced road delays by 46 percent.
“This innovation not only improved commute times but also cut vehicle emissions, aligning with sustainability goals,” he said.
The idea, of course, is that increasing use of AI will eventually help to significantly reduce the technology’s energy footprint over time — essentially, that AI will find a way to solve problems it is causing.
For now, though, skepticism remains, particularly from experts like Kneese who research climate impacts.
“The energy demands are not being balanced out by energy efficiency gains,” said Kneese.
Skip Descant contributed to this report.