"Turn off the lights and fans when you leave home." This is a phrase that is ubiquitous in our country. From television and radio to newspapers, this message has long been part of public awareness campaigns aimed at reducing electricity consumption.
But did anyone ever imagine that such a message would one day be applicable to something called artificial intelligence (AI)?
Did anyone think that people would one day say, "Skip the small talk and get straight to the point. Your unnecessary searches are wasting water. Do you know some people in the world are dying of thirst?"
You wouldn't leave the lights and fans on when you leave home, would you? Don't use AI unnecessarily either.
It's not just a waste of water: AI is also driving up electricity consumption, increasing reliance on fossil fuels, and generating e-waste. A large portion of this waste ends up in poor and underdeveloped countries.
But there are ways to solve this crisis. Researchers say that it is possible to make AI more environmentally friendly through sustainable management.
Why does using AI require so much water?
According to a BBC report published on July 15, 2025, training AI models requires thousands of servers to run simultaneously, meaning a significant amount of water is used to keep them cool.
Training is not the only drain: AI systems also answer questions throughout the day. Handling this vast workload requires data centres, which likewise need water for cooling.
Shaolei Ren, a professor at the University of California, Riverside, told the BBC that asking 10-50 questions of a mid-range AI model like GPT-3 consumes nearly half a litre of water.
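The per-query water use implied by that estimate can be worked out with simple arithmetic. A minimal sketch, using the half-litre figure and the 10-50 query range from the report (the per-query breakdown is an illustrative calculation, not a measured value):

```python
# Figures as cited in the article; the per-query numbers below are derived, not measured.
WATER_LITRES = 0.5           # ~half a litre for a batch of queries
QUERIES_LOW, QUERIES_HIGH = 10, 50

# Implied water use per single query, in millilitres
per_query_high_ml = WATER_LITRES * 1000 / QUERIES_LOW    # fewest queries -> most water each
per_query_low_ml = WATER_LITRES * 1000 / QUERIES_HIGH    # most queries -> least water each

print(f"Each query implies roughly {per_query_low_ml:.0f}-{per_query_high_ml:.0f} ml of water")
```

In other words, every individual question to such a model plausibly costs on the order of 10-50 millilitres of cooling water.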
It is important to note that this water must be clean; otherwise there is a risk of bacterial contamination. Moreover, options for recycling this water are limited, because most of it evaporates during the process.
Humans or AI: Who needs more water?
According to a report published by the United Nations Environment Programme on October 31 last year, a quarter of the world's people lack access to clean drinking water.
Pengfei Li and his team from the University of California, Riverside conducted a study on the consumption of water by AI. One estimate from their research suggests that the amount of water used globally to operate AI technology could soon be six times higher than the total water usage of Denmark – a country with a population of nearly six million.
As of March 21 this year, the number of data centres in the US stood at 5,426, according to Statista. Germany holds the second-highest number globally, with just 529.
A Bloomberg report published in May 2025 revealed that nearly two-thirds of the new data centres built or under construction in the US since 2022 are located in areas already facing severe water shortages.
Veolia Water Technologies – a company specialising in water treatment – reports that a portion of the water needed for data centres comes from municipal or regional water companies.
Ordinary people rely on the same water sources, which makes AI a direct competitor with humans.
The use of fossil fuels is increasing
Earlier this year, a report by the Massachusetts Institute of Technology (MIT) on generative AI said that data centres across the world would consume close to 1.05 billion megawatt-hours of electricity by 2026.
If the projection holds true, data centres would become the world's fifth-largest consumers of electricity, ranking between Japan and Russia in terms of power consumption.
According to a report by the Bangladesh Power Development Board, the country generated nearly 96 million megawatt-hours of electricity in the 2023-24 fiscal year. Data centres alone, however, are projected to use about 11 times that amount by 2026.
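The comparison follows directly from the two figures already cited. A quick arithmetic check, using the MIT projection and the BPDB generation figure as reported:

```python
# Both figures as reported in the article.
data_centres_mwh = 1.05e9    # projected global data-centre consumption by 2026 (MIT report)
bangladesh_mwh = 96e6        # Bangladesh's generation in FY 2023-24 (BPDB report)

ratio = data_centres_mwh / bangladesh_mwh
print(f"Data centres would use about {ratio:.0f} times Bangladesh's annual generation")
```

Dividing 1.05 billion by 96 million gives roughly 10.9, which rounds to the "about 11 times" stated above.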
A report by the United Nations Environment Programme says that a significant portion of the electricity powering AI comes from fossil fuels, which emit greenhouse gases, accelerating the pace of climate change.
Noman Bashir, an artificial intelligence expert at MIT, said, "The demand created by new data centres can't be met in a sustainable way. The rapid pace at which tech companies are building new data centres means that a large portion of the electricity required to run these centres is being sourced from fossil fuels."
Rich countries' increased e-waste is affecting developing countries
An MIT Technology Review report published on October 28, 2024, stated that one of the major causes of excessive waste by AI companies is the rapid advancement of hardware technology.
Computer equipment typically has a lifespan of two to five years, but as new versions are released, it is quickly replaced. The older equipment is then discarded within a short period, creating additional strain on the environment.
According to market research firm TechInsights, the world's three largest chip makers – Nvidia, AMD and Intel – supplied a total of 3.85 million graphics processing units (GPUs) to data centres in 2023. In 2022, this number was nearly 2.67 million. This figure continues to rise every year.
A 2023 study on e-waste recycling found that e-waste is generally generated in rich and middle-income countries, including the US, Western Europe, Japan, South Korea, and Australia. This waste is later shipped to low-income countries and regions, such as China, India, Brazil, and several African countries.
According to a Harvard Business Review report published in July 2024, Google ran its Finnish data centres on 97 per cent carbon-free energy in 2022, while the rate for its Asian data centres was only 4 to 18 per cent.
An Investopedia report published in February 2025 said the US is the world's second-largest carbon emitter and also home to the largest number of data centres. In other words, it seems to have mastered every possible way to pollute the environment. The consequences, however, are borne by underdeveloped and developing countries.
What is the solution?
In an interview with the Yale School of the Environment, Yale University's Associate Professor Yuan Yao said, "For assessing the environmental impact of AI, a transparent and standardised method is required. Without adequate quantitative data, it is not possible to understand the severity of this issue or address it effectively."
In a Forbes report, Gavita Regunath, chief AI officer at data science consultancy Advancing Analytics, said, "A practical strategy to reduce AI training's harmful impact on the environment could be to focus on smart energy use. For example, on the software side, AI model training could be scheduled during off-peak hours when renewable energy is more readily available.
"Additionally, opting for smaller, less energy-intensive AI models instead of always running large ones can significantly cut electricity consumption. On the hardware side, emphasising energy-efficient chips and eco-friendly infrastructure can also be far more effective," she added.
A report by the International Energy Agency (IEA) stated that a single request to an AI tool like ChatGPT consumes nearly 10 times more electricity than a Google search. Given this situation, it's time for tech companies and the world's polluting countries to put their heads together to figure out a responsible way to use AI.