Tech Focus: Artificial Intelligence Consumes Energy Too

Jo De Boeck notes the growing list of Artificial Intelligence (AI) applications as they consume ever more of the Earth's resources to deliver everything from Amazon recommendations to deciphering a virus's DNA.

In this guest blog, the EVP/CSO of imec shares his insights into how power-hungry AI can be and the need for an Energy label for AI.

Jo De Boeck is the Executive Vice President & Chief Strategy Officer of imec, an R&D and innovation hub in nanoelectronics and digital technologies, based in Belgium. Photo: imec.

AI has become more intelligent in recent years.

Fuelled by seemingly infinite computational power that is easily available on tap today, it now processes huge amounts of data to recognise patterns every day.

Yet with each new “trick” an AI learns, whether that is differentiating a cat from a dog or getting a robotic arm to solve a Rubik’s Cube, billions of calculations go into the task.

“We need to talk about an Energy Label for AI,” says Jo De Boeck, EVP & CSO, imec. Photo: imec.

All of that computation takes a lot of energy, adding to an already fraught global warming situation.

Compare this to the human brain.

It can effortlessly complete many of the cognitive feats that AI is celebrated for today, while requiring a small fraction of the energy.

So, while AI can help save energy by optimising how it is used, training an AI model consumes energy as well, and that energy is far from trivial.

Can this inspire us to develop more energy-efficient AI systems?

Will they bring a net positive in efforts to prevent or reverse permanent environmental damage?

These are questions that creators and users of AI should be asking today.

Singapore is one of the most forward-looking countries when it comes to AI. In its ambition to be a global hub for the emerging technology, it has already set up an ethics foundation to guide how AI is used.

However, like many countries around the world, it still needs to look deeper into the impact that AI has on the environment.

After all, it does not exist in a vacuum.

Usage of electric power

When users get suggestions for shows they might like on their streaming service or e-commerce site, most don’t realise the energy consumption needed to make that happen.

Billions of operations are needed for processing the data at data centres.

All these computations together consume a tremendous amount of electric power.

Although data centres are now investing heavily in renewable energy, a significant part of their power will still come from fossil fuels.

The popularity of AI applications clearly has a downside – the ecological cost.

To get a better understanding of the total footprint, we should take two factors into account: training and inference.

First, an AI model needs to be trained on a labelled dataset.

The ever-growing trend towards the use of bigger datasets for this training phase causes an explosive growth in energy consumption.

Researchers from the University of Massachusetts calculated that training a single model for natural language processing emits 284 metric tons of carbon dioxide.

That is equivalent to the lifetime emissions of five cars, including their manufacture.
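As a sanity check on those figures, the back-of-the-envelope arithmetic is simple. The per-car lifetime figure of roughly 57 tonnes used below is an assumption chosen to be consistent with the five-car comparison, not a number taken from the study itself:

```python
# Back-of-the-envelope check of the figures above.
# ~284 t CO2 for one NLP training run (University of Massachusetts study);
# ~57 t CO2 per car over its lifetime, manufacture included (assumed value).
training_emissions_t = 284
car_lifetime_t = 57  # assumed per-car lifetime emissions

cars_equivalent = training_emissions_t / car_lifetime_t
print(f"{cars_equivalent:.1f} car lifetimes")  # ≈ 5.0
```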

Some AI models developed by the tech giants – which are not reported in scientific literature – might be even orders of magnitude bigger.

Optimising the hardware

Recently, imec demonstrated an Analog Inference Accelerator achieving 2,900 trillion operations per joule, already 10 to 100 times more energy efficient than today’s digital accelerators, trading some precision for that efficiency.

With these types of hardware innovations, it will become possible to directly process the data in battery-powered devices, including drones and vehicles, and to avoid transmission energy.

However, developing energy-efficient hardware is only one side of the solution.

Running inefficient algorithms on an energy-efficient accelerator will wipe out the hardware’s benefits.

Therefore, we should also develop energy-efficient algorithms.

This is not only necessary for on-device inference, but also to reduce the number of calculations during inference or training of AI algorithms in data centres.

For this, we can look at how we humans learn. If you are a good tennis player, learning how to play squash is only a small step.

Similarly, we can transfer an existing AI model trained in one domain to an adjacent one.

After training, we can further minimise the number of calculations by applying compression strategies.

The most appealing one is the technique of network pruning – we ‘prune out’ all the parameters that have little importance for the end result.

What remains is an algorithm that has the same functionality, but has become smaller, faster and more energy efficient.

With the help of this compression strategy, the number of calculations can already be reduced by 30 to 50 per cent.
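The pruning step described above can be sketched with NumPy on a toy weight matrix; all names and numbers here are illustrative, and the 40 per cent pruning fraction is simply a value inside the 30-to-50 per cent range mentioned:

```python
import numpy as np

rng = np.random.default_rng(42)

# A toy weight matrix standing in for one trained network layer.
weights = rng.standard_normal((64, 64))

def prune_by_magnitude(w, fraction):
    """Zero out the given fraction of weights with the smallest magnitude."""
    k = int(w.size * fraction)
    # The (k+1)-th smallest absolute value becomes the keep/drop threshold.
    threshold = np.partition(np.abs(w).ravel(), k)[k]
    mask = np.abs(w) >= threshold
    return w * mask, mask

pruned, mask = prune_by_magnitude(weights, 0.4)
sparsity = 1.0 - mask.mean()
print(f"weights removed: {sparsity:.0%}")
```

The remaining nonzero weights preserve the layer's dominant behaviour, while the zeroed entries can be skipped entirely at inference time, which is where the energy saving comes from.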

Thereafter, more application-based techniques will help us to further improve the efficiency.

As such, we can already save more than 90 per cent of the power just by optimising the AI model, independent of any hardware improvements.

To create truly energy-efficient AI systems, we thus need an integrated approach that fine-tunes innovations in data usage, hardware, and software.

That said, consumers still have no way to figure out how ‘green’ the AI systems are that they are using every day.

This is where awareness can make a big difference.

Can consumers get an estimate of the carbon emission related to the use of a recommendation or image recognition algorithm, for example?

With an estimate like that, they would have a better sense of the carbon impact of a service that uses AI.
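As a rough sketch of what such an estimate could look like, the arithmetic is simple once the energy per query and the grid's carbon intensity are known. Both numbers below are illustrative assumptions, not measured values:

```python
# Hedged sketch: estimating the CO2 footprint of serving AI queries.
energy_per_query_kwh = 0.0005    # assumed energy per recommendation query
grid_intensity_kg_per_kwh = 0.4  # assumed grid carbon intensity
queries = 1_000_000

co2_kg = queries * energy_per_query_kwh * grid_intensity_kg_per_kwh
print(f"{co2_kg:.0f} kg CO2 for {queries:,} queries")  # 200 kg
```

An energy label could surface exactly this kind of figure, letting consumers compare services the way they compare refrigerators today.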

Policymakers have introduced energy labels for household appliances, vehicles, and buildings, nudging investment towards greater energy efficiency.

Introducing energy labels for AI-driven applications and systems could be the next big step they need to take.
