Mitigating the ESG risk of AI

Phahlani Mkhombo
MD: Genesis Corporate Solutions

The Paris Agreement of 2015 resulted from landmark meetings and agreements by UN member states to work towards carbon neutrality and Net Zero economies.

Businesses have a major role to play in achieving this. Technology will be central to delivering the efficiencies that the move towards carbon neutrality requires. Technology such as generative artificial intelligence allows companies to work smarter and become more accurate, both of which improve profitability.

However, recent research, including a Harvard Business Review (HBR) article, points out an ESG risk associated with AI. How do companies mitigate this?

Quantifying the risk

The HBR article points out that, by 2026, computing power dedicated to training AI is expected to increase tenfold. As more power is expended, more resources are needed. As a result, we’ve seen exponential increases in energy and, perhaps more unexpectedly, water consumption.

Some estimates even suggest that running a large AI model generates more emissions over its lifetime than the average car. A recent Goldman Sachs report found that AI applications will drive a 160% increase in demand for power by 2030.

How do companies address this?

Make smart choices about AI models

The article points out that an AI model has three phases — training, tuning, and inferencing — and there are opportunities to be more sustainable at every phase. At the start of an AI journey, business leaders should consider choosing a foundation model rather than creating and training a model from scratch. Compared to creating a new model, foundation models can be custom-tuned for specific purposes in a fraction of the time, with a fraction of the data and energy costs. This effectively “amortises” upfront training costs over a long lifetime of different uses.

It’s also important to choose a foundation model of the right size. Most models come in different sizes, with 3 billion, 8 billion, 20 billion, or more parameters. Bigger is not always better. A small model trained on high-quality, curated data can be more energy efficient and achieve the same results or better, depending on your needs. IBM research has found that some models trained on specific and relevant data can perform on par with models three to five times their size, while running faster and consuming less energy. The good news for businesses is that this likely means lower costs and better outcomes.
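To make the size trade-off concrete, the sketch below does back-of-envelope arithmetic on inference energy for models of different parameter counts. The energy-per-parameter figure is a made-up placeholder chosen only to show the relative scaling, not a measured value.

```python
# Back-of-envelope comparison of inference energy by model size.
# The joules-per-token-per-billion-parameters figure below is a
# hypothetical placeholder, used only to illustrate linear scaling.
JOULES_PER_TOKEN_PER_BILLION_PARAMS = 0.5  # assumed, not measured

def inference_energy_joules(params_billion: float, tokens: int) -> float:
    """Rough energy estimate for generating `tokens` with a model of the given size."""
    return params_billion * JOULES_PER_TOKEN_PER_BILLION_PARAMS * tokens

small = inference_energy_joules(3, tokens=1_000_000)
large = inference_energy_joules(20, tokens=1_000_000)
print(f"3B model:  {small / 3.6e6:.2f} kWh")   # joules -> kWh
print(f"20B model: {large / 3.6e6:.2f} kWh")
print(f"Ratio: {large / small:.1f}x")
```

Under this simple linear model, a 20-billion-parameter model costs roughly 6.7 times the energy of a 3-billion-parameter one for the same workload — which is why a small model that meets the quality bar is often the greener (and cheaper) choice.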

Technology has improved the efficiency of doing business

Locate your processing thoughtfully

The article adds that a hybrid cloud approach can often help companies lower energy use by giving them flexibility about where processing takes place. With a hybrid approach, some computing happens in the cloud, at the data centres nearest the need. At other times, for security, regulatory, or other reasons, computing happens “on prem” — in physical servers owned by the company.

A hybrid approach can support sustainability in two ways. First, it can help you co-locate your data next to your processing, minimising the distance the data must travel and producing real energy savings over time. Second, it can let you choose processing locations with access to renewable power. For example, two data centres may offer similar performance for your needs, but one may run largely on hydropower and the other on coal.
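One way to operationalise that second point is to rank candidate locations by grid carbon intensity and prefer the cleanest. The region names and gCO2/kWh figures below are illustrative placeholders, not real data.

```python
# Choose the candidate data centre with the lowest grid carbon intensity.
# Region names and gCO2-per-kWh values are hypothetical examples.
candidate_regions = {
    "region-hydro": 25,    # mostly hydropower
    "region-mixed": 230,   # mixed grid
    "region-coal": 820,    # coal-heavy grid
}

def greenest_region(regions: dict) -> str:
    """Return the region with the lowest carbon intensity (gCO2 per kWh)."""
    return min(regions, key=regions.get)

print(greenest_region(candidate_regions))  # -> region-hydro
```

In practice, the intensity figures would come from a grid-data provider rather than a hard-coded table, and the choice would be weighed against latency, cost, and regulatory constraints.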

The article also points out the importance of using only the processing you need. Many organisations overprovision how much compute power stands ready for their needs, but software already exists to do better. In one case involving its own AI workloads, IBM was able to reduce the excess standby “headroom” from the equivalent of 23 graphics processing units (GPUs) to 13, significantly lowering energy usage and freeing up high-demand GPUs for other purposes — with zero reduction in performance.
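The headroom example translates into simple arithmetic. The per-GPU power draw below is an assumed illustrative figure — the source gives only the GPU counts, not wattages.

```python
# Annual energy saved by trimming standby GPU headroom from 23 to 13 GPUs.
# The 300 W per-GPU standby draw is an assumed figure for illustration.
GPU_POWER_WATTS = 300
HOURS_PER_YEAR = 24 * 365

def standby_energy_kwh(gpus: int, watts: float = GPU_POWER_WATTS) -> float:
    """Annual energy (kWh) consumed keeping `gpus` powered on standby."""
    return gpus * watts * HOURS_PER_YEAR / 1000

saved = standby_energy_kwh(23) - standby_energy_kwh(13)
print(f"Saved per year: {saved:,.0f} kWh")  # -> Saved per year: 26,280 kWh
```

Even at this assumed wattage, eliminating ten idle GPUs saves tens of thousands of kilowatt-hours a year — before counting the cooling those GPUs would have required.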

Use the right infrastructure

The article points out that once you’ve chosen an AI model, about 90% of its life will be spent in inferencing mode, where data is run through it to make a prediction or solve a task. Naturally, most of a model’s carbon footprint occurs here also, so organisations must invest time and capital in making data processing as sustainable as possible.

AI runs most efficiently on processors that support very specific types of math. It is well known that AI runs better on GPUs than central processing units (CPUs), but neither was originally designed for AI. We are increasingly seeing new processor prototypes designed from scratch to run and train deep learning models faster and more efficiently. In some cases, these chips are 14 times more energy efficient.

Running multiple AI models is energy intensive

The article adds that energy-efficient processing is the single most important step because it reduces the need for water-based cooling and even for additional renewable power, which often carries its own environmental costs.

Go open source

The article points out that being open means more eyes on the code, more minds on the problems, and more hands on solutions. That level of transparent collaboration can have a huge impact. For example, the open-source Kepler project — free and available to all — helps developers estimate the energy consumption of their code as they build it, allowing them to achieve their goals without ignoring the energy trade-offs that will affect long-term costs and emissions.

Open source also means tapping the “wisdom of crowds” to improve existing AI models instead of using our energy grids to forever build new ones. These models will let resource-limited organisations pursue cost-effective innovation and reassure sceptical organisations with flexibility, safety, and trustworthiness.

The article adds that the largest open-source project in history — the internet — was originally used to share academic papers. Now, it underpins much of our economy and society.

One step at a time

Technology will clearly play an important role in companies’ futures. Therefore, while there is a risk associated with AI, companies need to manage it rather than avoid the technology altogether.

The role of Chief Technology Officers will become more critical in the future.