As Texas confronts decades of water mismanagement and growing demands for electricity from data centers, the state’s top utility regulator, Public Utility Commission Chairman Thomas Gleeson, told a state House committee on Thursday that it’s critical to have a clear picture of how much water data centers use.
His testimony came as data center developers assured the House Committee on State Affairs that newly developed closed-loop cooling systems circulate cooling fluid in a sealed circuit: the fluid absorbs heat from electric equipment, vaporizes, and is then condensed back into liquid.
For Dallas-based Skybox Datacenters, using a closed-loop system means its average data center uses less water than five typical households, said Haynes Strader, Skybox’s chief development officer.
Much of the nearly five-hour public meeting was devoted to concerns about data centers' water consumption, even though the topic was not on the agenda.
Before the 2027 legislative session, the state Senate Business and Commerce Committee is charged with recommending ways to balance the economic benefits of the data centers boom with impacts on landowners, private property rights, water infrastructure and community integrity.
Thursday’s meeting also represented a check-in for developers on the implementation of a new law regulating data centers as the Electric Reliability Council of Texas (ERCOT), the state’s grid operator, works to select the first “batch” of large-load projects that will go to the head of a long line for interconnection with the grid.
The new law, Senate Bill 6, applies to electricity customers using at least 75 megawatts, roughly the output of a medium-sized power plant. The wide-ranging legislation seeks to govern the growing number of data centers being planned and built across the state, as Texas is now set to surpass Virginia, the jurisdiction with the most data centers in the world, in new data center construction.
Assurances on Water Use
While the standard for older data centers was to cool their banks of computer servers with millions of gallons of water every day, much of which evaporated, those are the facilities of yesterday, a handful of developers told the committee.
A building at one of Skybox’s data center sites draws up to 30,000 gallons of water once to charge the closed-loop system, Strader said, noting that his home’s pool in Dallas uses a little less than that.
Michael McNamara is CEO and co-founder of Lancium, a data center company based in Shenandoah, Texas, outside of Houston, and developer of OpenAI's Stargate Campus in Abilene, slated to be one of the largest data center projects in the world.
He said the city offered the project, which is under construction, 500 gallons of water a minute. Three of its eight buildings are up and running, and more than 8,500 people are working at the facility. The site, slated to cover 10 square miles when complete, consumed only 20 gallons a minute last month, McNamara said, less than 5 percent of the allocation from the city.
“We have a water shortage, but it’s a water shortage driven by shortages of engineering and money,” he said. “We can fix all of those.”
The data center developer hired an engineering firm to study the Panhandle’s water supply and is planning to share the results soon. At Lancium’s Childress, Texas, facility, the company plans to give back 1 million gallons of water a day to the Red River Water Authority, from which it pulls 11,000 gallons a day for its Childress data center.
McNamara said the company will access water through aquifers that farmers and authorities haven’t had the cash to acquire. “We have the budget and we’ll just give the water back,” he said.
The Texas Water Development Board reports that the share of the state in a drought has reached its largest extent since September 2023. While 85 percent of the state is in drought, extreme and exceptional drought is impacting almost a quarter of the state, according to the water board.
In Corpus Christi, years of severe drought coupled with increased water demands from the city's petrochemical industries have nearly drained the coastal area's reservoirs. Without significant rainfall, the city is months away from a disaster that, under worst-case scenarios, could lead to an industrial shutdown and partial evacuations.
While some new facilities continue to use evaporative cooling, said Dan Diorio, vice president of state policy with the Data Center Coalition, a trade group that represents major tech companies, it’s often in tandem with other cooling technologies.
Skybox has not used an evaporative cooling system since 2016, Strader said, and of the 800 megawatts of large data centers operational in Texas, only about 20 percent use evaporative cooling.
In water-stressed areas, the data center industry often uses less water than other manufacturers, Diorio said. In Arizona, for example, data centers use less water than the semiconductor industry, beverage production, sand and gravel mining and golf courses.
Water cooling is only considered at Google data center facilities if local water sources are healthy and resilient, said Liz Schwab, who leads advocacy and market development for Google data centers. In locations facing threats of scarcity or depletion, Google searches for alternatives like air cooling or using reclaimed water, Schwab said.
The Interconnection Queue
Even more pressing than water consumption for data center developers is grid interconnection under S.B. 6. They have become increasingly impatient with the state's reworking of the large-load interconnection queue as Texas transitions to studying data centers' transmission needs in batches rather than individually.
ERCOT CEO Pablo Vegas said the grid operator has seen significant challenges in the old ways of connecting large loads to the electric grid, especially given the increased volume and pace of requests.
The system in place before S.B. 6 was built for a large-load queue totaling 40 to 50 projects a year. But last year, ERCOT received 225 new interconnection requests, and the waitlist for large facilities to get online has grown to 410,000 megawatts, Vegas said. "That's a huge, huge change since the last time we talked about the growth of data centers," he said.
Under its previous planning process, by the time one data center finished its planning studies, the results would often have to be reconsidered almost immediately, as more projects joined the interconnection queue and changed the local transmission needs and reliability.
The consensus from early conversations with corporate stakeholders, including Google, Meta, CenterPoint, Amazon and OpenAI—all looking for grid capacity in Texas—was that the uncertainty in the current process creates undue risk for developers with existing interconnection requests. The proposed batch method aims to ameliorate that.
While the requests have continued to come in over the last year and a half, a big chunk of new projects entered the queue in the last six weeks, Vegas said. The jump exceeded 130,000 megawatts, around 87 percent of it data centers; the smaller share is more traditional industrial development.
As the interconnection process works now, ERCOT does not reserve transmission capacity for any customer. That has led to multiple projects trampling each other as their building timelines overlapped and changed what transmission assets were needed, Vegas said. Given the scale and magnitude of the queue and the projects' large investments and fast construction times, that standard no longer works for economic development, he said.
It put developers who were investing billions of dollars in Texas into an untenable situation, Vegas said. “They could get approval, get funding and financing, start to develop and then see the requirements changing in the middle of that cycle,” Vegas said. “So we recognized that we need to change that process.”
ERCOT and PUC leaders hope that the batch study will alleviate the uncertainty that has developed across some of the state’s priciest developments.
The developers with millions, if not billions, of dollars tied up in projects stuck in the interconnection queue are worried about how much longer their timelines to get online will be stalled.
Lancium’s McNamara said there’s increasing concern that selection for the first batch will be unfair—potentially excluding projects that have and would continue to demonstrate significant financial commitments.
ERCOT leaders have directed staff to announce admission criteria for the first batch by June, with PUC consideration as early as July. ERCOT would then run batch studies every year for projects to be considered.
The grid operator is looking for pathways that would place data centers in the first batch study. Criteria under consideration include whether a project is developing its own power generation, or whether the data center can curtail its draw from the grid when demand is high.
McNamara proposed that loads tied to financing transmission projects be fast-tracked into the first study group. Lancium has committed $600 million toward the state's 765-kilovolt transmission project, and McNamara said the company can contribute another $600 million to $1.3 billion in PUC deposits to demonstrate the certainty of its data center project and hold ratepayers harmless.
S.B. 6 aims to shift transmission costs to the large-load users so upgrades and new connection costs aren’t paid through residential and small commercial customer rates.
To do so, Gleeson said the state will be moving away from its current model that determines transmission charges for large industrial and commercial customers based on their energy use during four 15-minute peak times in summer months. Critics of the system have said it allowed industrial customers the unique privilege of being able to cut energy use during those periods, unlike residential users, and then pay a disproportionately small share for the state’s transmission services.
The utility commission is in the process of figuring out what changes need to be made to more fairly allocate costs, Gleeson said. They’re considering adding coincident peaks or establishing new rate classes for ultra-large loads.
“All of these rules really create a framework for us to be able to plan and interconnect large loads appropriately and reliably while also ensuring that the costs are borne by those who are putting them on the system,” Gleeson said.
The new law also empowers regulators to create rules about onsite power generation and establishes a required financial commitment earlier in the interconnection process to weed out unserious data center applications.
In the last few years, there have been concerns that the number of megawatts seeking to get online was exaggerated. That's why Texas regulators are implementing benchmarks to ensure that before infrastructure is built for prospective customers, the projects will actually pan out.
Stacey Doré, chief strategy and sustainability officer at energy company Vistra Corp., said the state needs to find a way to raise the bar on data center applications to verify that the projects have the maturity to develop.
“There’s no way there’s 400 gigawatts of data centers coming to the United States, much less Texas,” Doré said.
Criteria for large loads should require that developers have control over the site and even consider whether they’ve ordered equipment, Doré said.
But the switch has left developers who have been waiting years to get onto the grid anxious about whether they'll be among those receiving golden tickets, said Will McAdams, an energy lobbyist and regulatory consultant.
“There’s no clear path to power,” McAdams said. “Developers don’t know when they’ll get service, to what degree they’ll get service. They can’t plan.”
It's left a dizzying landscape for investors. Even if they're considered in the first batch, they won't know whether they'll receive only a portion of their requested load. "Which is incompatible with a lot of industrial processes," McAdams said. "They need to know. So the planning and interconnection processes don't align."