The popular chatbots, OpenAI's ChatGPT and Google's Bard, are highly energy-intensive, requiring massive server farms to train their underlying models. Cooling those data centers makes AI chatbots incredibly thirsty: researchers estimate that training GPT-3 alone consumed 700,000 liters of water.

According to the study, an average user's conversation with ChatGPT is roughly equivalent to pouring a large bottle of fresh water onto the ground. Given the chatbot's unprecedented popularity, researchers fear it could strain the world's water supplies, especially amid a historic drought in the United States and looming environmental uncertainty.

According to researchers at the University of California, Riverside and the University of Texas at Arlington, training GPT-3 required an amount of fresh water comparable to what is needed to fill a nuclear reactor's cooling tower.

OpenAI has not disclosed how long it took to train GPT-3, which makes the researchers' estimates harder to pin down. Its partner Microsoft, however, has built supercomputers for AI training; the latest includes 10,000 graphics cards and over 285,000 processor cores, which gives a sense of the enormous energy demands behind the technology. The researchers estimate that a single conversation with ChatGPT consumes about half a liter of drinking water.
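To put that half-liter figure in perspective, here is a minimal back-of-envelope sketch. The per-conversation figure comes from the article; the daily conversation count is a purely hypothetical assumption for illustration, not a number from the study.

```python
# Back-of-envelope estimate of ChatGPT's aggregate water footprint.
# LITERS_PER_CONVERSATION comes from the article's reported estimate;
# ASSUMED_DAILY_CONVERSATIONS is a hypothetical illustration only.

LITERS_PER_CONVERSATION = 0.5
ASSUMED_DAILY_CONVERSATIONS = 10_000_000  # hypothetical assumption

daily_liters = LITERS_PER_CONVERSATION * ASSUMED_DAILY_CONVERSATIONS
annual_liters = daily_liters * 365

print(f"Daily:  {daily_liters:,.0f} liters")
print(f"Annual: {annual_liters:,.0f} liters")
```

Even under this modest assumption, half a liter per conversation compounds into millions of liters per day, which is why the researchers focus on aggregate usage rather than any single chat.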

According to the researchers, the water consumption of these data centers will only grow with each new generation of AI chatbot.

Source: Gizmodo