June 25, 2025
This article is the second in a series on the environmental impacts of data centers. Click here to read the first article on data center energy use.
Highlights:
Data centers have a thirst for water, and their rapid expansion threatens freshwater supplies. Only 3% of Earth’s water is freshwater, and only 0.5% of all water is accessible and safe for human consumption. Freshwater is critical for survival: on average, a human being can live without water for only three days. Increasing drought and water shortages are reducing water availability. Meanwhile, data center developers are increasingly tapping surface waters and underground aquifers to cool their facilities.
Data center water usage closely parallels energy usage and carbon emissions. As data centers use more energy for routine operations and to meet AI requests, they consume larger amounts of water to cool their processor chips and avoid overheating and potential damage. Similarly, as energy use increases in data centers, so do carbon emissions.
A medium-sized data center can consume up to roughly 110 million gallons of water per year for cooling purposes, equivalent to the annual water usage of approximately 1,000 households. Larger data centers can each “drink” up to 5 million gallons per day, or about 1.8 billion gallons annually, equivalent to the usage of a town of 10,000 to 50,000 people. Together, the nation’s 5,426 data centers consume billions of gallons of water annually. One report estimated that U.S. data centers consumed 449 million gallons of water per day, or 163.7 billion gallons annually, as of 2021. A 2016 report found that fewer than one-third of data center operators track water consumption. Water consumption is expected to continue increasing as data centers grow in number, size, and complexity.
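These equivalences can be checked with back-of-the-envelope arithmetic. The short Python sketch below uses only the figures quoted above; the per-household rate it derives is an implication of those figures, not an independent statistic:

```python
# Sanity-checking the water-usage equivalences quoted above.
medium_dc_gal_per_year = 110_000_000   # medium-sized data center, per year
households = 1_000                     # stated household equivalence

gal_per_household_per_day = medium_dc_gal_per_year / households / 365
print(f"~{gal_per_household_per_day:.0f} gallons per household per day")  # ~301

large_dc_gal_per_day = 5_000_000       # large data center, per day
large_dc_gal_per_year = large_dc_gal_per_day * 365
print(f"~{large_dc_gal_per_year / 1e9:.2f} billion gallons per year")     # ~1.83
```

The roughly 300 gallons per household per day implied by the article's own numbers is consistent with typical U.S. household usage, which lends the equivalence some plausibility.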
According to scientists at the University of California, Riverside, each 100-word AI prompt is estimated to use roughly one bottle of water (519 milliliters). This may not sound like much, but hundreds of millions of people worldwide use systems like ChatGPT, entering prompts around the clock, and those bottles add up quickly. Large language models require many energy-intensive calculations, necessitating liquid cooling systems.
The Water Cycle of Data Centers
A data center’s water footprint is calculated as the sum of three categories: on-site water usage, water use by power plant facilities that supply power to data centers, and water consumption during the manufacturing process of processor chips. Water can come from various sources, including blue sources (e.g., surface water and groundwater), piped sources such as municipal water, and gray sources (e.g., purified reclaimed water). Using recycled or non-potable water to meet a data center’s cooling needs is a well-established practice to conserve limited potable water resources, particularly in dry or drought-prone areas.
In the context of data centers, “water consumption” refers to the amount of water withdrawn from blue or gray sources minus the water discharged by the centers (primarily warm water left over from cooling the IT racks). The consumed water is generally water that evaporates or is otherwise taken out of immediate human usage. Withdrawing fresh water from local streams or underground aquifers can deplete these sources, particularly in water-stressed areas.
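A minimal sketch of how these accounting definitions combine, with hypothetical placeholder figures (none of these numbers describe a real facility):

```python
# Water accounting for a hypothetical data center, per the definitions above.
withdrawal_gal = 100_000_000   # drawn from blue/gray sources per year (placeholder)
discharge_gal = 20_000_000     # warm water returned to wastewater facilities (placeholder)

# "Consumption" = withdrawal - discharge: water that evaporates or is
# otherwise removed from immediate human use.
onsite_consumption_gal = withdrawal_gal - discharge_gal

# The full water footprint adds indirect water used at power plants and
# water embedded in chip manufacturing (both placeholders here).
power_plant_gal = 150_000_000
chip_manufacturing_gal = 30_000_000

water_footprint_gal = onsite_consumption_gal + power_plant_gal + chip_manufacturing_gal
print(f"On-site consumption: {onsite_consumption_gal:,} gallons/year")
print(f"Total water footprint: {water_footprint_gal:,} gallons/year")
```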
Researchers at The Green Grid, a nonprofit industry consortium, developed a metric called Water Usage Effectiveness (WUE) to measure water usage by data centers. Similar to the Power Usage Effectiveness (PUE) metric, which measures the energy efficiency of a data center, the WUE metric assesses the efficiency of a data center’s water use. WUE is reported in liters per kilowatt-hour (kWh): a data center’s total water consumption, measured in liters, is divided by the total energy consumed by that data center in kilowatt-hours over the same period. While “0” is the ideal WUE score, it can only be achieved in air-cooled data centers, and most data centers cannot meet this target due to their location’s climate conditions. The average WUE across data centers is 1.9 liters per kWh, a benchmark that efficient facilities aim to beat.
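The calculation itself is a single division; the sketch below uses hypothetical inputs chosen to land on the 1.9 L/kWh industry average:

```python
# Water Usage Effectiveness (WUE) = annual water consumption (liters)
# divided by annual energy consumption (kWh). Inputs are hypothetical.
annual_water_liters = 380_000_000   # ~100 million gallons
annual_energy_kwh = 200_000_000     # 200 GWh

wue = annual_water_liters / annual_energy_kwh
print(f"WUE = {wue:.2f} L/kWh")     # 1.90 L/kWh, the reported industry average

# Lower is better: a fully air-cooled facility can approach 0 L/kWh on site.
```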
Data centers' water usage depends on various factors, including location, climate, water availability, size, and IT rack chip densities. In hotter climates, such as the southwestern United States, data centers need more water to cool the building and equipment. With the increasing number of centers supporting AI requests, chip density is also growing, which raises room temperatures and necessitates more water chillers at the server level to maintain cool temperatures. Most data centers use a combination of chillers and on-site cooling towers to avoid chip overheating.
Cooling data centers is a complex operation. At the server level, water chillers cool IT rooms to maintain optimal temperatures and prevent damage to chips. This can be achieved through air cooling using water evaporation, an open-loop and more water-intensive method, or through server liquid cooling, a more expensive approach that delivers liquid coolant directly to the graphics processing units (GPUs) and central processing units (CPUs). Direct-to-chip liquid cooling and immersion liquid cooling are two standard server liquid cooling technologies that dissipate heat while significantly reducing water consumption. During immersion cooling, specialized non-conductive synthetic liquids submerge the chips, absorbing the heat. The difference between direct server liquid cooling and air cooling through evaporation can be compared to the difference between drip irrigation and flooding in agriculture.
In areas with limited water availability, server liquid cooling is the best choice, as it requires minimal water consumption. Conversely, in areas with a strained power grid, an evaporative air cooling tower is a suitable building design, as it requires minimal power usage.
Regardless of the approach chosen, a heat exchanger is necessary to capture the hot air or hot water produced as a byproduct of the cooling process. Hot water coming from the servers is cooled by water from either the air-cooled chiller or a cooling tower. Likewise, hot air is exchanged with cooler air. A heat exchanger transfers heat from the server room to the building's cooling system.
Approximately 80% of the water (typically freshwater) withdrawn by data centers evaporates, with the remaining water discharged to municipal wastewater facilities. The large volume of wastewater from data centers may overwhelm existing local facilities, which were not designed to handle such a high volume.
Besides on-site water consumption, a significant portion of data center water usage originates from the power plants that supply their electricity. Because 56% of the electricity used to power data centers nationwide comes from fossil fuels, much of this indirect water consumption occurs at steam-generating power plants. Fossil fuel power plants rely on large boilers filled with water that is superheated by natural gas or coal to produce steam, which in turn rotates a turbine and generates electricity. Water withdrawals by these power plants are a significant source of water stress, particularly in drought-prone areas and in the summer, when water levels are lower and electricity demands are higher.
A federal report estimated that the indirect water consumption footprint (from electricity use) of data centers in the United States was roughly 211 billion gallons in 2023. Given that 176 terawatt-hours (TWh) of electricity were consumed by data centers in 2023, the centers’ indirect water consumption can be estimated at 1.2 gallons per kWh on average nationally in 2023. As data centers are expected to consume up to 1,050 TWh annually by 2030, water usage will increase in parallel.
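The 1.2 gallons-per-kWh figure follows directly from the two reported totals, and the same intensity can be carried forward to the 2030 projection (assuming, as a simplification, that the grid's water intensity stays constant):

```python
# Indirect water intensity derived from the 2023 estimates cited above.
indirect_water_gal_2023 = 211e9    # 211 billion gallons
electricity_kwh_2023 = 176e9       # 176 TWh expressed in kWh

intensity_gal_per_kwh = indirect_water_gal_2023 / electricity_kwh_2023
print(f"{intensity_gal_per_kwh:.2f} gal/kWh")            # ~1.20 gal/kWh

# Holding that intensity constant (a simplifying assumption), the projected
# 1,050 TWh of annual demand by 2030 would imply:
electricity_kwh_2030 = 1_050e9
projected_gal_2030 = intensity_gal_per_kwh * electricity_kwh_2030
print(f"~{projected_gal_2030 / 1e9:.0f} billion gallons per year")  # ~1,259
```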
Chip and server manufacturing are significant sources of water consumption for data centers. Semiconductors and computer chips are integral to data center processing. Each server in a data center contains multiple CPUs, GPUs, and memory chips. Larger data centers and those that support AI requests can contain tens of thousands of servers, each with multiple chips. Ultrapure water is used for cleaning, etching, and rinsing chips during the manufacturing process. Creating ultrapure water is itself water-intensive, requiring approximately 1,500 gallons of piped water to produce 1,000 gallons of ultrapure water. An average chip manufacturing facility consumes approximately 10 million gallons of ultrapure water per day. A single chip installed in a data center has already consumed thousands of gallons of water by the time it reaches the site.
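The scale these figures imply is easy to work out with simple arithmetic on the numbers above:

```python
# Ultrapure water (UPW) arithmetic using the figures above.
piped_gal_per_upw_gal = 1_500 / 1_000   # 1.5 gallons of piped water per gallon of UPW
fab_upw_gal_per_day = 10_000_000        # average fab's daily UPW consumption

fab_piped_gal_per_day = fab_upw_gal_per_day * piped_gal_per_upw_gal
print(f"{fab_piped_gal_per_day:,.0f} gallons of piped water per day")  # 15,000,000
```

In other words, an average fab's ultrapure water demand translates to roughly 15 million gallons of piped water every day, several times the daily draw of even a large data center.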
Water-cooled high-performance computing systems in a data center. Credit: ECMWF Data Center.
Water Impacts in Nearby Communities
The water consumption of the 5,426 data centers nationwide is already impacting local communities. Northern Virginia is considered the world capital for data centers, with over 300 operational data centers spread across four counties: Fairfax, Loudoun, Prince William, and Fauquier. Collectively, Northern Virginia's data centers consumed close to 2 billion gallons of water in 2023, a 63% increase from 2019. Loudoun County, with approximately 200 operational data centers, used around 900 million gallons of water in 2023. This has led Loudoun Water, the county's water authority, to rely heavily on potable water for data centers rather than reclaimed water.
Making Data Centers More Water-Efficient
Data center developers' most common choice is to withdraw water from blue sources and employ water-intensive practices, such as air cooling through water evaporation. However, there are other options. To make a more sustainable choice for nearby communities and ecosystems, developers can instead use innovative water management techniques to reduce water consumption, including closed-loop cooling systems, immersion cooling, air cooling, and using non-potable water sources (e.g., recycled wastewater and captured water).
Closed-loop cooling systems enable the reuse of both recycled wastewater and freshwater, allowing water supplies to be used multiple times. A cooling tower can use external air to cool the heated water, allowing it to return to its original temperature. These systems can reduce freshwater use by up to 70%.
Free cooling is a method where outside cold air is drawn into the data center to cool the equipment. Data centers must be located in cooler climates for this strategy to be effective.
Air cooling involves air conditioning vents and tubes that remove heat generated by chips as they process data and AI requests. This method is most effective in areas where electricity is cheaper and water resources are limited.
Immersion cooling in data centers involves bathing servers, chips, and other components in a specialized dielectric (or non-conductive) fluid. Hardware is submerged in specially designed tanks filled with the coolant. The non-conductive liquid absorbs the heat from the chips and transfers it to a heat exchanger, where it is cooled down before flowing back into the tank. Immersion cooling is a novel process that entails higher upfront costs than conventional direct liquid cooling, but provides significant energy savings and space-optimization benefits for data center developers. Since the technology uses synthetic fluids, it requires significantly less water than other approaches.
Powering data centers with renewable energy sources, like solar or wind, requires significantly less water than obtaining energy from fossil fuel power plants. With approximately 56% of the electricity used to power data centers nationwide coming from fossil fuels, deploying more clean energy to power these facilities can significantly reduce water consumption. Coal plants are the most water-intensive facilities, requiring approximately 19,185 gallons of water per megawatt-hour (MWh) of power generation, while natural gas power plants consume approximately 2,800 gallons per MWh. In 2022, 40% of all U.S. annual water withdrawals, or about 48.5 trillion gallons, were made by coal and gas power plants. Of those 48.5 trillion gallons, 962 billion gallons were consumed and no longer available for direct downstream use. Meanwhile, rooftop solar panels and wind turbines need no cooling water at all, because they are not steam-based technologies like coal and natural gas.
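A quick comparison using the per-MWh figures above (the 1 TWh demand figure is a hypothetical example; the zero entries for wind and rooftop solar follow from the paragraph):

```python
# Cooling-water intensity by generation source, gallons per MWh (from the text).
water_intensity_gal_per_mwh = {
    "coal": 19_185,
    "natural gas": 2_800,
    "wind": 0,            # no steam cycle, no cooling water
    "rooftop solar": 0,
}

demand_mwh = 1_000_000    # hypothetical 1 TWh of annual data center demand
for source, intensity in water_intensity_gal_per_mwh.items():
    gallons = intensity * demand_mwh
    print(f"{source}: {gallons / 1e9:.1f} billion gallons per TWh")
```

By this arithmetic, every terawatt-hour shifted from coal to wind or solar avoids roughly 19 billion gallons of water use, and every terawatt-hour shifted from natural gas avoids nearly 3 billion gallons.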
If the United States moves toward 100% renewable energy generation and the retirement of fossil fuel plants, the water savings would be enormous, with billions of gallons of water saved, and more freshwater would be available for both human consumption and natural ecosystems.
Author: Miguel Yañez-Barnuevo