Green Tech Will Have a Hard Time Keeping Up with AI’s Insatiable Power Appetite

As a key driver of the Fourth Industrial Revolution, Artificial Intelligence is often called the “new electricity,” and it truly is powering (sometimes literally) every industry imaginable. Behind that power lies an energy cost many people do not see, and a clash between AI and green energy. While society dreams of the clean energy future that green tech promises, AI is running in the opposite direction, burning enormous amounts of power.

A study from the University of Massachusetts Amherst found that training a single large transformer-based model could emit 626,155 lbs of CO₂, roughly equivalent to flying 300+ passengers from New York to San Francisco. ChatGPT alone is estimated to generate over 573,000 pounds of CO₂ every month. AI is a threat to green energy, and it does not help that the price of copper, used in power lines and throughout renewable energy infrastructure, has risen in the past few years, partly because of AI.

Regardless of these limitations, green tech’s promise of a clean energy future still stands. The question is: can it happen fast enough to counter the threat AI’s energy-hungry nature poses to the environment?

Why AI Consumes so Much Power


Unlike humans, AI does not tire or rest. A good place to start understanding why it is such a power guzzler is how large models like GPT-4, Llama 3, or Gemini Ultra are trained. Training an AI model is not a one-time computation.

On the contrary, it is a weeks- or months-long marathon of parallel processing across thousands of GPUs and TPUs, with clusters drawing megawatts of power in real time. A single training run can use as much electricity as a small country like Dominica, Tonga, or the Cook Islands. And that is just training. Once trained, these models go live and perform inference every day, from answering questions and generating responses to analyzing inputs.

Now imagine millions of people using AI tools. It is not just ChatGPT or Gemini for general tasks; it is also specialized tools for coding, image generation, or entertainment. Most people are unaware that AI companion services like Candy AI or Nomi also run large language models in the background, as does virtually every other AI service. Scale that across multiple apps and platforms, running all day, every day, and the trail of power consumption becomes massive.

Another reason AI consumes so much power lies in data centers, where AI models are trained and run. Since these models run continuously, data centers run continuously too, and they consume an enormous amount of electricity: they are entire campuses of servers that need switches, cooling systems, and backup units to function. Every part of this system draws heavy power.

We now live in the middle of an AI revolution, with more and more AI tools and models being built. That means rising demand for new data centers and data center expansions, and even more power. And when one considers that many of these facilities sit on fossil-heavy grids, such as those in parts of the U.S., China, or India, the pressure AI puts on the global power sector becomes undeniable.

Can Green Tech Keep Up?

Green technology is making admirable strides, and global renewable electricity generation is growing; in 2024, solar and wind supplied 15.1% of global electricity. However, that pace still cannot match AI’s rapid growth. Renewables are seasonal, weather-dependent, and geographically inconsistent: the sun does not shine at night, and the wind does not always blow. On their own, they cannot supply uninterrupted power.

That makes them naturally ill-suited to carry AI’s high, never-ending power demands. Then there is the storage limitation. Given the intermittent nature of renewable sources, storage would be the obvious fix, but no breakthrough storage solution exists yet; battery technology as it stands cannot hold power at the scale AI requires.

There are data centers that source a considerable amount of their power from wind, solar, and other low-carbon sources, but they do not operate on renewables alone and usually rely on traditional energy as backup. Then there is scaling. Scaling up green projects takes not only money but also time and space. Large solar farms or wind installations, for instance, can take years to get through approval, construction, and grid connection.

These installations require land and raw materials, some of which are in short supply. Many of the raw materials needed for a green energy transition, such as lithium, cobalt, copper, steel, and rare earth elements, face high costs or supply shortages. Copper is a telling example: it is essential both for AI data centers and for building and operating AI systems.

Beyond AI, copper is also vital for almost every part of the green energy ecosystem, from grid upgrades to solar inverters to wind turbines. But that supply chain is under strain: copper prices have spiked to record highs amid concerns, voiced by the International Energy Agency, that demand will outstrip supply within the next decade.

The Gap Between AI Growth and Green Tech’s Capacity

Many AI data centers are built where land and power are cheap and available, not necessarily where renewables are abundant. In Virginia, where AI infrastructure is booming, much of the electricity still comes from fossil fuels. It is difficult for these AI companies to commit to using clean power when the grid around them may not even support it.

Infrastructure is another obstacle. Renewable energy grids are not yet reliable or strong enough to handle the loads AI data centers bring, especially when those loads spike unexpectedly. Energy storage, with its current limits, is no help here.


As companies race to scale the capabilities of their AI models and tools, speed and cost often take priority over sustainability. Because burning fossil fuels is still cheaper in many regions than investing in full renewable coverage, many companies have little incentive to break from traditional energy sources.

Ironically, even green energy solutions can have unintended environmental side effects. Mining lithium for batteries or clearing land for solar farms, for instance, carries its own ecological cost, from biodiversity loss to water pollution and depletion. Without solid policy support, it is hard to shift the system and achieve the long-dreamed transition to renewable energy.

Carbon pricing, incentives, and regulations vary widely across countries and lag far behind both AI adoption and climate efforts. Stronger regulation and better financial incentives are needed to close the gap between green tech’s clean energy promises and what fossil fuels already bring to AI’s table.

Attempts to Make AI More Energy-Efficient

It is not all bad news; there is a growing push within the tech space to build more energy-efficient AI systems. Chips and hardware like NVIDIA’s H100 GPUs, NVIDIA’s Grace CPU, Google Coral, Graphcore’s IPUs, and Cerebras’ Wafer-Scale Engine were built to handle complex AI tasks with less energy. In addition, developers are finding ways to compress AI models without sacrificing much of their performance. Compression techniques like model pruning, quantization, and distillation could cut power needs by more than 53%.
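To make the compression idea concrete, here is a minimal sketch of post-training int8 quantization, assuming nothing beyond NumPy and a randomly generated weight matrix standing in for a real model layer. The same numbers are stored in a quarter of the memory, at the cost of a small, bounded rounding error.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights to int8 plus a single per-tensor scale factor."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 tensor."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256)).astype(np.float32)  # stand-in layer weights
q, scale = quantize_int8(w)

# int8 storage is 4x smaller than float32, and int8 arithmetic is
# cheaper per operation on hardware that supports it.
print(w.nbytes // q.nbytes)                                    # 4
print(float(np.abs(w - dequantize(q, scale)).max()) <= scale)  # True
```

Production systems (e.g. 8-bit inference in major serving stacks) add per-channel scales and calibration, but the energy argument is the same: fewer bits moved and multiplied per inference means less power drawn.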

On the green energy front, tech companies are trying to shift their data centers toward renewables. Google, for instance, has invested in “carbon-aware computing,” shifting flexible compute tasks to times and places where green power is more available. Microsoft’s Azure data centers are contributing too, with a commitment to run on renewable energy and reach carbon-negative operations by 2030.
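The carbon-aware idea reduces to a scheduling problem: given a forecast of grid carbon intensity, run deferrable jobs in the cleanest window. The sketch below uses a made-up hourly forecast (real schedulers pull live data from grid operators or third-party carbon-intensity services) to pick a start time for a batch job.

```python
from datetime import datetime, timedelta

# Hypothetical hourly grid carbon intensity forecast (gCO2 per kWh).
# Intensity dips midday as solar output peaks; values are illustrative.
forecast = {
    datetime(2025, 6, 1, h): g
    for h, g in enumerate([430, 410, 395, 380, 360, 300,
                           220, 150, 120, 110, 130, 180])
}

def greenest_slot(forecast, hours_needed: int) -> datetime:
    """Return the start hour whose window has the lowest average intensity."""
    times = sorted(forecast)
    return min(
        (times[i] for i in range(len(times) - hours_needed + 1)),
        key=lambda t: sum(forecast[t + timedelta(hours=h)]
                          for h in range(hours_needed)) / hours_needed,
    )

# A 3-hour batch job lands in the midday solar peak.
print(greenest_slot(forecast, 3))
```

The same logic extends across regions: with forecasts for several data centers, the job goes to whichever location-and-time window is cleanest, which is essentially what Google describes doing with its movable workloads.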

Amid all these efforts, the real answer to optimizing AI for energy efficiency may, in fact, be AI itself. Machine learning is already being used to optimize power grids, predict renewable output, and improve battery storage management. Deployed wisely, AI can help smooth out the very energy supply challenges it creates. Green tech has a part to play too: solid-state batteries, small modular nuclear reactors, and carbon capture all offer potential lifelines. Combined with mandates for AI energy-use reporting, R&D incentives, and public-private partnerships to speed up renewable development, stable green energy sources that could support AI at scale might not be far-fetched after all.
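As a toy version of the “predict renewable output” use case, the sketch below fits a least-squares line from solar irradiance to farm output. The numbers are invented for illustration; a real forecaster would use richer weather features and far more data, but even this baseline shows how a predicted output figure lets an operator schedule battery charging or flexible AI workloads in advance.

```python
import numpy as np

# Toy historical data: solar irradiance (W/m^2) vs. farm output (MW).
# Values are illustrative, not from a real site.
irradiance = np.array([200, 350, 500, 650, 800, 950], dtype=float)
output_mw  = np.array([ 18,  33,  47,  61,  76,  90], dtype=float)

# Fit output ~ a * irradiance + b by least squares.
A = np.vstack([irradiance, np.ones_like(irradiance)]).T
(a, b), *_ = np.linalg.lstsq(A, output_mw, rcond=None)

def predict_mw(irr: float) -> float:
    """Expected farm output for a forecasted irradiance level."""
    return a * irr + b

# Tomorrow's forecasted irradiance -> expected output in MW.
print(round(predict_mw(700.0), 1))
```

Real grid forecasting models are more sophisticated (gradient-boosted trees, neural nets, numerical weather inputs), but they serve the same role: turning weather forecasts into dispatch decisions.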

At the heart of this problem lies a deep irony. While humanity works toward a greener planet through sustainable efforts like renewable energy adoption, the planet itself throws up obstacles that undermine those efforts: weather variability and climate change degrade the very infrastructure and resource availability that green energy relies on.

Natural events like storms damaging solar farms or droughts limiting hydropower do not help green tech’s case. But, as mentioned earlier, there is a solution in parallel innovation. It is in the hands of AI developers, energy providers, policymakers, and researchers to act so that green energy can handle AI’s insatiable power appetite and our dependence on carbon energy can shrink further. Otherwise, AI might leave us with an environmental problem bigger than the many problems it has solved.

Ashwin S

A cybersecurity enthusiast at heart with a passion for all things tech. Yet his creativity extends beyond the world of cybersecurity. With an innate love for design, he's always on the lookout for unique design concepts.