This New Chip Promises Environmentally Friendly AI—Here's Why That's Important

Chatbots are energy guzzlers

  • A new kind of chip could make AI better for the environment.
  • AI applications like ChatGPT use vast amounts of energy. 
  • AI also uses large quantities of water for cooling.
(Image: a digital AI chip against a grassy green background. Credit: Eugene Mymrin / Getty Images)

Generative artificial intelligence (AI) is hurting the environment, but IBM researchers say they may have a solution. 

IBM has created an energy-efficient AI chip that emulates the way the human brain processes information. Ways to make AI more eco-friendly are sorely needed, experts say.

"Training models requires a tremendous amount of electricity, as models like ChatGPT are trained on clusters of tens of thousands of GPUs running nonstop for months at a time," Dmitry Shapiro, the CEO of YouAi, told Lifewire in an email interview. "Each time the model needs to be retrained, more electricity is required. Also, once the model is trained, as people use it to make predictions, it requires running the model 'inference.' This, too, consumes energy."

Energy-Sucking AI

According to IBM, the cutting-edge chip has the potential to significantly enhance the efficiency of artificial intelligence while reducing power consumption for computers and smartphones.

"The fully integrated chip features 64 AIMC cores interconnected via an on-chip communication network," the IBM scientists wrote in their paper. "It also implements the digital activation functions and additional processing involved in individual convolutional layers and long short-term memory units."

If the chip works as promised, it could help reduce the energy waste caused by AI. Researchers from the University of Massachusetts found that training a single large AI model can generate approximately 626,000 pounds of carbon dioxide emissions, comparable to the environmental impact of nearly 300 round-trip flights between New York and San Francisco and almost five times the lifetime emissions produced by an average car.

The water usage for cooling data centers during the training and execution of AI models is also substantial. A recent study revealed that for every "conversation" consisting of 20-50 prompts and responses, ChatGPT consumes water equivalent to a 500-milliliter bottle.
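
To put those figures in perspective, here's a rough back-of-the-envelope check of the numbers cited above. The roughly 2,000 pounds of CO2 per round-trip passenger is an assumed average for illustration, not a figure from the studies:

```python
# Back-of-the-envelope check of the footprint figures cited above.
# All numbers are rough approximations, not measurements.

TRAINING_CO2_LBS = 626_000  # UMass estimate for training one large model

# Assume roughly 2,000 lbs of CO2 per passenger for a NY-SF round trip.
flights_equivalent = TRAINING_CO2_LBS / 2_000
print(f"~{flights_equivalent:.0f} round-trip flights")  # ~313 flights

# "Almost five times the lifetime emissions produced by an average car"
# implies a car lifetime of about 125,000 lbs of CO2.
print(f"implied car lifetime: ~{TRAINING_CO2_LBS / 5:,.0f} lbs CO2")

# Water: one 500 ml bottle per conversation of 20-50 prompts.
print(f"~{500 / 50:.0f}-{500 / 20:.0f} ml of cooling water per prompt")
```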

"AI is flat-out computationally intensive," Crispin Cowan, a former computer science professor and current staff engineer at the tech company Tanium, said in an email interview. "The way it works is a whole lot of matrix computations, with millions of little coefficients to be computed. GPU cards are excellent at this, and when you look at a GPU card, the first thing you notice is the great big cooling fan mounted on top of the main chip. These devices run hard and hot, and that consumes a lot of power."

The Search to Save Energy

Companies are racing to find ways to reduce AI energy consumption. Manufacturers are building specialized, energy-saving chips designed just for AI tasks, Shapiro said. Developers are also improving the underlying methods with techniques like "pruning," which, like trimming unnecessary branches from a tree, removes the parts of an AI model that aren't needed.
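
Pruning comes in several variants, and the article doesn't specify which ones companies use; as a minimal sketch, here's one common form (magnitude-based, unstructured pruning) using PyTorch's built-in utilities. The layer size and the 30 percent figure are illustrative:

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

# Zero out the 30% of weights with the smallest magnitude (L1 norm),
# the "unnecessary branches" in Shapiro's tree analogy.
layer = nn.Linear(512, 512)
prune.l1_unstructured(layer, name="weight", amount=0.3)
prune.remove(layer, "weight")  # make the pruning permanent

sparsity = (layer.weight == 0).float().mean().item()
print(f"weights zeroed: {sparsity:.0%}")  # ~30%
```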

"Sometimes, instead of building a new AI from the ground up, they use a pre-existing model and just tweak the last bits, saving a lot of energy," he added. "They're also being smart about where they run these models, choosing places powered by wind or solar energy."

When training these models, AI companies are also trying to be careful about the data they use, ensuring that only the most useful information is processed.

"This is like studying for a test by focusing on the most important topics and skipping the irrelevant ones," he added. "To further lessen the impact, there's a push for sharing ready-to-use models, so not everyone has to 'train' their own from scratch, which is energy-intensive."

(Image: a gloved hand holding a microprocessor. Credit: mailsonpignata / 500px / Getty Images)

To reduce environmental burdens, the current trend is toward specialized large language models (LLMs) with deep knowledge of just one narrow area, Cowan said.

"It turns out that if you only care about one area, you can run a private LLM that works and is only about 1/10th the size of ChatGPT," he added. "Reducing compute costs directly reduces the carbon footprint of the computation."

Some AI researchers even publish the environmental footprint of their work, much like nutrition labels on food, to make everyone aware of the energy cost. 

AI might one day help reduce greenhouse gases. The nonprofit Open Climate Fix wants to develop AI that improves how solar electricity flows onto the energy grid. The technology uses satellite and weather data to forecast how much solar power will reach the grid, helping operators maximize the amount of renewable energy transmitted.
