When you think of mega carbon emitters around the world, do you think of the global airline industry, mining, steel manufacturing or smart tech? When you think of water-hungry sectors, do your thoughts stray to Coca-Cola or ChatGPT?

Maybe you plugged this question about resource-hungry sectors into GPT-4 or Microsoft Bing and, if you did, chances are you got a response something like this:

  • “Based on my search results, the impact of artificial intelligence (AI) and machine learning on power and water resources is still a topic of research and debate. However, I found that the aviation industry is one of the most energy-intensive industries, accounting for more than 2% of global greenhouse gas emissions…. On the other hand, AI has the potential to reduce energy consumption and carbon emissions in various industries, including aviation…. Therefore, while the aviation industry has a significant impact on power and water resources, AI has the potential to mitigate the environmental impact of various industries, including aviation.”

It’s a well-reasoned response, and not bad for the mere cents’ worth of computational power it took to add to the debate. However, there is a bigger systemic cost which environmentalists, AI developers and researchers are only now starting to delve into and which – despite this AI’s cool and reasoned reply – is sparking concerns about the future of AI models and smart tech.

The not-so-hidden costs

Even though everyone can now dabble in AI using free applications such as Google’s Bard, Jasper and ChatGPT, there is still a financial cost associated with these technologies. An analysis by Dylan Patel and Afzal Ahmad from consulting firm SemiAnalysis estimates that keeping ChatGPT up and running costs around $694 444 a day in computing hardware. Considering that ChatGPT runs on 28 936 GPUs (graphics processing units), Patel and Ahmad estimate “the cost per query to be 0.36 cents”.
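To see how those figures hang together, here is a minimal back-of-envelope sketch in Python. The daily cost, GPU count and per-query cost are the SemiAnalysis estimates quoted above; the daily query volume is simply implied by dividing one by the other, not an independently reported number.

    # Back-of-envelope check of the SemiAnalysis estimates (reported figures, derived volume)
    daily_hardware_cost_usd = 694_444      # estimated daily cost of keeping ChatGPT running
    cost_per_query_usd = 0.0036            # the quoted 0.36 cents per query
    gpus_in_service = 28_936               # estimated number of GPUs serving the model

    implied_queries_per_day = daily_hardware_cost_usd / cost_per_query_usd
    implied_queries_per_gpu = implied_queries_per_day / gpus_in_service

    print(f"Implied queries per day: {implied_queries_per_day:,.0f}")          # roughly 193 million
    print(f"Implied queries per GPU per day: {implied_queries_per_gpu:,.0f}")  # roughly 6 700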

This, however, is not the only cost. The real cost runs deeper, says GIBS' Professor Manoj Chiba, an expert in AI, predictive analytics and data science. This view is supported by the likes of Alex de Vries, the founder of research company Digiconomist, who notes that if every old-school Google search were replaced by an interaction with a generative AI model, Google’s total electricity consumption would surge to the point where the “worst-case scenario suggests Google’s AI alone could consume as much electricity as a country such as Ireland”. Currently, Google’s AI technologies account for only around 10-15% of its total electricity consumption.

[Chart – Estimated energy use per query: AI search vs old-school Google]

In the years to come we’ll see more of these projections emerge, based on data harvested after the energy-intensive “training stage” that generative AI needs to undergo. Already, the numbers are enough for Chiba to declare that too little attention is being paid to the climate question.

“To run these AI algorithms on the computing power that we have takes a lot of energy,” he explains. “Right now the world’s temperatures have increased by 1.5°C. That’s official. But to create advanced smart cities of the future, we are going to need AI, and that potentially means increasing the Earth’s temperature even more through power demand.”

For AI and machine-learning tools to crunch the data needed to improve operational efficiencies, design new structures, and unlock innovative and futuristic thinking, Chiba says drawing on natural resources is inevitable. These machines must be kept cool, which requires water and cooling systems, as well as the power to run them. “All of this is contributing to heating the planet,” he acknowledges.

What are we talking?

Chiba’s is not the only local voice echoing these views. South African AI commentator Johan Steyn wrote in Business Day recently, “Each stride in advancing generative AI capabilities seemingly walks hand in hand with a rise in environmental ramifications, painting a complex picture of progress intertwined with ecological responsibility.”

Steyn shared insights from a recent study by Alexandra Luccioni, Sylvain Viguier and Anne-Laure Ligozat, which found that the training the popular ChatGPT large language model underwent before its launch “consumed 1 287 megawatt hours of electricity and generated 552 tonnes of carbon dioxide”. To put this into perspective, 1 287MWh of electricity would be able to power more than 400 UK households for a year, while the US Environmental Protection Agency equates the carbon dioxide produced during this training phase to the emissions from the electricity used by 107 US homes for a year.
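As a rough sanity check on those household comparisons, here is a minimal sketch. The UK figure below assumes a typical household uses about 2 700kWh of electricity a year; that average is an illustrative assumption, not a number from the study or the EPA.

    # Rough check of the household comparisons (the UK average below is an assumption)
    training_energy_kwh = 1_287_000          # the reported 1 287MWh training draw, in kWh
    training_co2_tonnes = 552                # reported carbon dioxide from the same training run

    uk_household_kwh_per_year = 2_700        # assumed typical annual UK household electricity use
    households_for_a_year = training_energy_kwh / uk_household_kwh_per_year
    co2_per_us_home = training_co2_tonnes / 107   # what the EPA comparison implies per US home

    print(f"UK households powered for a year: {households_for_a_year:.0f}")   # roughly 475, i.e. 'more than 400'
    print(f"Implied CO2 per US home per year: {co2_per_us_home:.1f} tonnes")  # roughly 5.2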

[Chart – Environmental impact of training four machine-learning models (2022)]

While the study by Luccioni et al focused on a snapshot in time in the development of a generative AI, this energy draw will continue as ChatGPT’s latest iteration, GPT-4, is accessed to answer more questions, generate code or create art. As the researchers noted, “Large language models (LLMs) are among the biggest ML [machine-learning] models, spanning up to hundreds of billions of parameters, requiring millions of GPU hours [computational resources] to train, and emitting carbon in the process. As these models grow in size – which has been the trend in recent years – it is crucial to also track the scope and evolution of their carbon footprint.”

[Chart – Training an AI model vs real-world emissions (2022)]

While Luccioni and her fellow researchers focused on power consumption and considerations like equipment manufacturing, they noted that additional environmental resources should also be considered to obtain a more accurate and ongoing tally of carbon emissions. These include the “large quantities of chemicals and minerals required … the significant quantities of ultra-pure water and energy needed to manufacture it … as well as the complex and carbon-intensive supply chains and transportation involved in shipping them around the world”.

Research is needed into each of these areas, and particularly around water – a critical resource that is already under pressure due to climate-related issues such as changing weather patterns, flooding and damage to wetlands.

Let’s talk about water

Towards the end of 2023, Fortune magazine reported that Microsoft’s global water consumption had surged by 34% between 2021 and 2022 due to the increased demand for its AI tools. That took the company’s consumption to the equivalent of more than 2 500 Olympic-sized swimming pools – much of it going towards cooling and maintaining the company’s suite of data centres.
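To put the swimming-pool comparison into absolute terms, here is a minimal conversion sketch; the pool volume of roughly 2 500 cubic metres is the standard minimum for an Olympic pool and is assumed here rather than taken from the Fortune report.

    # Converting the Olympic-pool comparison into litres (assumed standard pool volume)
    olympic_pool_m3 = 2_500            # assumed minimum volume of an Olympic pool, in cubic metres
    pools_reported = 2_500             # the comparison figure cited by Fortune

    total_water_m3 = olympic_pool_m3 * pools_reported
    total_water_litres = total_water_m3 * 1_000

    print(f"Roughly {total_water_m3:,.0f} cubic metres of water")   # about 6.25 million cubic metres
    print(f"Or roughly {total_water_litres:,.0f} litres")           # about 6.25 billion litres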

The World Economic Forum believes the global AI market could be worth as much as $1.6 trillion by 2030, and with this growing demand the need for data centres will also increase – causing water use to balloon and power needs to surge.

“This is something we need to be very careful about,” says Chiba, pointing out that there are already some innovative solutions emerging in countries such as Sweden and Finland, which are using their natural geographic advantages to address these resource challenges. Not only are the five Nordic countries well connected to existing fibre networks, but they also have large quantities of low-cost renewable energy and cold climates, which reduce the energy demands of cooling these data enclaves.

As the Nordic countries power forward with this advantage, Chiba says they are “actually even rebuilding previous ‘ghost towns’ and turning them into economic spaces. They are leveraging the environment to keep these computers cool. This is why quite a bit of Bitcoin at the moment has been developed and mined out in Sweden, because the energy requirements aren’t as high due to the natural coldness.”

However, not every country has these natural and infrastructure advantages, notes Chiba. This is why a sober discussion about the use of key resources needs to be had now and greater attention paid to tracking and disclosing the environmental impact of an emerging AI world.

The double-edged AI sword

While the chorus of concern about the environmental impact of developing artificial intelligences (AIs) and machine-learning models is growing, there is an equally vociferous argument to be made that AI might actually help the world to optimise and improve its wasteful energy consumption patterns.

According to experts at DeepMind and Google, AI could analyse and improve its own systems and processes in order to reduce its own environmental impact. A recent report from the Boston Consulting Group estimates that AI could help mitigate between 5% and 10% of greenhouse gas emissions by 2030.

Similarly, a China-focused study noted that harnessing AI could help countries achieve their Sustainable Development Goal (SDG) targets and establish resource-efficient cities and systems that are better able to juggle various sources of renewable energy and power-production types. The Chinese researchers wrote that AI could “help predict errors and mitigate turbulences that may hinder the attainment of sustainable growth”. It could also guide countries and companies to expand their markets, boost consumer demand and promote “new revenue streams resulting in economic progress”.

This reinforces the economic potential that has made the likes of Goldman Sachs predict that using generative AI tools “could drive a 7% (or almost $7 trillion) increase in global GDP and lift profitability growth by 1.5 percentage points over a 10-year period”.

On a continent like Africa, however, which is dealing with significant SDG challenges and infrastructure deficits, there are already concerns around a widening gap between high-demand workers and those at risk of losing out to technology. As Investment Monitor’s Jason Mitchell noted, “Industry 4.0 faces myriad obstacles in Africa – a lack of funding and technical expertise, a shortage of hardware, expensive data charges, low levels of technological awareness, an inadequate ecosystem of digital solution providers and power cuts.”

The environmental downside to AI uptake 

  • The World Economic Forum projects that the global AI market could be worth as much as $1.6 trillion by 2030.
  • To meet this growing demand, AI developers and companies will need more electricity, water, chemicals and minerals to run, maintain, cool and connect these models.
  • Following the training stage of generative AIs like ChatGPT and BLOOM, researchers are building up a picture of the resources necessary to run and sustain these technologies.
  • While AIs have the potential to help humans find new and innovative ways to improve their energy usage and to plan better, more efficient cities, these calculations will draw heavily on real-world resources.
  • Countries with natural geographic advantages and advanced infrastructures – like the Nordic countries – stand to gain.
  • Where does this leave continents like Africa as the race for AI impacts a climate-stressed world?
