Artificial Intelligence (AI) is becoming part of our everyday lives – from voice assistants on our phones (like Alexa and Siri) to tools helping doctors to analyse medical scans. But every innovation has a downside. Whilst robot takeovers and ‘deepfakes’ of politicians are dominating media coverage, there’s another much more immediate problem that isn’t being reported on.
AI platforms require an enormous amount of power to function – from ‘training’ a platform in the first place to keeping it up and running. But just how big is their environmental impact? And what does this mean for the future of technology? It’s time to Dish The Dirt…
THE ABCS OF AI
So, to begin, here’s a super speedy overview of how it works… AI for dummies, if you will (with us here at Homethings being the dummies). AI uses computer programs that process large amounts of data to recognise patterns and make decisions. For example, just like your email account filters spam by spotting patterns in which messages you usually open or delete, AI can learn to spot which medical biopsies need a closer look by studying tens of thousands of previous biopsies and the action that was taken on each one.
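If you fancy seeing that idea in (very simplified) code, here’s a toy sketch of pattern-learning. It is not a real spam filter – the example emails, words, and scoring rule are all made up for illustration – but it shows the basic trick: count which words turn up more often in junk than in wanted mail, then score a new message against those counts.

```python
# Toy illustration of "learning a pattern" from labelled examples.
# All emails and words here are invented for the sake of the sketch.
from collections import Counter

junk = ["win a free prize now", "free money claim your prize"]
wanted = ["meeting moved to friday", "your invoice is attached"]

def word_counts(messages):
    # Tally how often each word appears across a set of messages.
    counts = Counter()
    for message in messages:
        counts.update(message.split())
    return counts

junk_counts = word_counts(junk)
wanted_counts = word_counts(wanted)

def spam_score(message):
    # Positive score = the words lean "junk"; negative = lean "wanted".
    score = 0
    for word in message.split():
        score += junk_counts[word] - wanted_counts[word]
    return score

print(spam_score("claim your free prize"))   # positive: looks like junk
print(spam_score("friday meeting invoice"))  # negative: looks wanted
```

Real systems use far more data and far cleverer maths, but the shape is the same: learn from past examples, then judge new ones.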
With AI scoring higher on speed and accuracy (to use the example above), doctors can make quicker, better-informed decisions, leading to faster treatment and, ultimately, better patient outcomes. This can be useful across all sorts of fields – from farming weather forecasts to financial fraud detection – and even climate change analysis.
AI’S HIDDEN TOLL ON THE PLANET
Whilst this tech is quite literally revolutionary, there is one massive downside that’s already starting to take effect – its carbon footprint. The phase where a computer searches through data to spot patterns and learn what to look for (the ‘training phase’) can produce as much CO₂ as five cars would over their entire lifespans. After the training phase, keeping these systems up and running uses a lot of electricity. All this means that AI data centres already account for about 1% of global electricity use each year.
On top of this, to avoid overheating, these data centres need to be cooled constantly (think how hot a laptop or the back of a TV can get and multiply that by a LOT). This has always been the case for search engines like Google, but whilst you wouldn’t usually search 20 things on Google in a row, a 20-question back-and-forth with ChatGPT (a question-answering AI chatbot) is relatively common. One average-length conversation with ChatGPT requires around 500ml of water to cool the system. And, with around 41,700 people using ChatGPT every minute, that adds up to a LOT of water – which, with climate change putting water scarcity on the horizon, really isn’t good news.
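To put those two figures together, here’s some back-of-the-envelope maths using only the numbers from this article. One assumption is ours, for illustration only: that each of those users has a single average-length conversation in that minute.

```python
# Rough water-use maths from the figures above.
# Assumption (ours): one average-length conversation per user per minute.
ml_per_conversation = 500    # ~500ml of cooling water per conversation
users_per_minute = 41_700    # ChatGPT users per minute

litres_per_minute = users_per_minute * ml_per_conversation / 1000
litres_per_day = litres_per_minute * 60 * 24

print(f"{litres_per_minute:,.0f} litres per minute")  # 20,850
print(f"{litres_per_day:,.0f} litres per day")        # 30,024,000
```

That’s over 20,000 litres a minute, or roughly 30 million litres a day on that (very rough) assumption – enough to make the point either way.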
In short, AI’s resource needs are as vast as the benefits it promises, and they pose a real environmental challenge if we want to use it for good.
HOW TECH IS CLEANING UP ITS ACT
The good news is that some companies are exploring more eco-friendly ways to develop AI. Companies like Microsoft and OpenAI have “Green AI” initiatives to minimise water and energy use, refining how their models are built so that they consume less power. The issue of water usage is also being tackled: Microsoft, for example, is working towards being "water positive" by 2030, meaning it will restore more water than its data centres consume.
Whilst AI is helping us find ways to improve energy efficiency in many industries, its own energy requirements are a growing issue. When it comes to this new green tech, we should look for transparency, responsible practices, and accountability to make sure these technologies help, rather than harm, our environment.
Asking questions is going to be the make-or-break over the next few years: How is this going to be monitored in tech companies? Who should have access to it, and how often? How can we make sure it’s not harming more than it helps? If we start asking the right questions now, we can help create a future where innovation and sustainability go hand in hand. It makes sense.