Since AI models like ChatGPT and Claude became the latest investment fad in Silicon Valley, outside observers have worried about the broader consequences. They consume tons of electricity, they are trained on trillions of original works without their authors’ consent, and if the most unhinged hype guys are to be believed, they will create mass unemployment in every industry any day now. But for some reason, the water use of these products has become one of the most common criticisms.

A slew of articles and videos argue that the water consumption of all the data centers powering AI systems is a threat to the environment. It is true that data centers use some water. But there is a great deal of missing context.

Even in highly water-stressed areas, all data centers combined are a rounding error compared to the real water wasters: farmers, especially those growing livestock feed. Consider one representative example. The authors spoke with a research scientist who estimated how much water it takes to use GPT-4, accounting for both direct cooling and the embedded water in the facilities.

One 100-word email generated with this program, they figure, would use up about 519 milliliters of water, roughly one standard water bottle.

The first problem with this framing is that in much of the country, water is plentiful. To a first approximation, east of the 98th meridian, which runs roughly from the border between North Dakota and Minnesota down to the southern tip of Texas, there is usually enough rain to farm on natural precipitation alone.
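To put the 519-milliliter figure in perspective, here is a rough back-of-envelope comparison. The per-email number comes from the estimate above; the beef water footprint is not from the article and is only an illustrative assumption (roughly 15,400 liters per kilogram is a commonly cited global-average estimate).

```python
# Back-of-envelope sketch: how many 100-word AI emails equal the water
# footprint of one kilogram of beef?
ML_PER_EMAIL = 519        # milliliters per 100-word GPT-4 email (figure from the article)
L_PER_KG_BEEF = 15_400    # liters per kg of beef -- illustrative assumption, not from the article

liters_per_email = ML_PER_EMAIL / 1000
emails_per_kg_beef = L_PER_KG_BEEF / liters_per_email

print(f"One kg of beef ~ {emails_per_kg_beef:,.0f} hundred-word emails")
# Prints roughly: One kg of beef ~ 29,672 hundred-word emails
```

Under these assumptions, a single kilogram of beef embodies as much water as tens of thousands of AI-generated emails, which is the scale mismatch the article is pointing at.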

To be fair, the.