Hot planet, cool servers
Picture this: Greta Thunberg and a Generative Artificial Intelligence (AI) model sitting in a room together.
I like to imagine that Greta would say something along these lines: “How dare you?”
To which the model would probably reply with: “Dare what, girl?”
It isn’t news that human-caused global warming is a serious problem driving climate change, with wide-ranging natural, social, economic and territorial consequences. Yet we seem remarkably good at overlooking this crisis, even though global temperatures are clearly rising, with 2024 the warmest year since records began in 1850.
One of the main causes of global warming is generating power from fossil fuels: burning them releases greenhouse gases, which trap heat in the Earth’s atmosphere. Carbon dioxide itself is not a bad thing (it is essential for life); the excess of it is what is worrying, because we are releasing these gases faster than the Earth can reabsorb them.
This gives us more insight as to why Greta is in the same room as a Generative AI model: energy.
It seems that generative AI models have introduced not only an existential threat, the race for Artificial General Intelligence (AGI), but also a more immediate one: survival. Generative AI models require enormous amounts of energy for training and, of course, more power every time they are run. Renewable sources like solar and wind exist, but most of the electricity these models consume still comes from fossil fuels. And the push for better performance, driven by the desire for profit, keeps producing bigger models, which require even more energy to train.
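To make “enormous amounts of energy” concrete, here is a minimal back-of-envelope sketch in Python. Every number in it (cluster size, per-GPU power draw, training duration, data-centre overhead, grid carbon intensity) is an assumed placeholder chosen only to illustrate the orders of magnitude involved, not a reported figure for any real model.

```python
# Rough, back-of-envelope estimate of training energy and emissions.
# Every number below is an illustrative assumption, not a figure
# reported for any real model or data centre.

num_gpus = 10_000          # assumed size of the training cluster
power_per_gpu_kw = 0.7     # assumed average draw per GPU, in kilowatts
training_days = 90         # assumed length of the training run
pue = 1.2                  # assumed overhead for cooling and networking
grid_gco2_per_kwh = 400    # assumed grid carbon intensity, in gCO2/kWh

energy_kwh = num_gpus * power_per_gpu_kw * training_days * 24 * pue
emissions_tonnes = energy_kwh * grid_gco2_per_kwh / 1_000_000  # grams -> tonnes

print(f"Energy used: {energy_kwh:,.0f} kWh")
print(f"Emissions:   {emissions_tonnes:,.0f} tonnes of CO2")
```

Even under these hypothetical assumptions, a single training run lands in the thousands of tonnes of CO2, before a single user query has been served.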
But energy is not the only piece in the puzzle of global warming. Other factors like raw material extraction, manufacturing, transportation, usage and end-of-life can prove equally relevant to the bigger picture.
GPUs (Graphics Processing Units) are the electronic circuits used for AI training because they can handle complex computations at high speed. Two of the main materials that go into a GPU are silicon and gold: the first is abundant, the second scarce. Silicon is easy to find in nature, but it only becomes useful for GPU production after being purified through energy-intensive methods, which carry a significant carbon footprint. Gold extraction, in turn, produces heavy-metal waste when the gold is separated from the ore, and these heavy metals are dangerous not only for wildlife but also for humans.
The environmental effects of producing chips extend beyond extraction and manufacturing. The parts of a chip travel long distances before reaching the final customer. On top of that, the industry is projected to add 1.2 - 5.0 million tons of e-waste (discarded electrical or electronic equipment) by 2030, a figure that covers not only GPUs but also CPUs (Central Processing Units), storage devices and memory modules. Why is this a problem? These parts are constantly being replaced with the newest, best version, and privacy concerns push companies to destroy equipment to prevent data leakage rather than reuse or recycle it. While 1.2 - 5.0 million tons out of a total of over 60 million tons may not seem like much, this figure is only a projection; as technology continues to advance, the impact could be far greater than expected, which is why we need to move forward with caution.
“But why should we care?” I ask, interrupting the heated discussion.
Well, we are incorporating generative AI more and more into our lives. When you look something up on Google, you sometimes don’t even have to look for the information yourself: an AI model will answer you instead. As we are starting to make AI indispensable to our lives, we need to understand what dangers it can pose to our survival as a species.
Greta adds: “Will AI’s ability to find solutions for humanity justify the cost we have to pay?”
As a rule of thumb, a bigger model is a better model, thanks to the scaling laws. Ideally, we would want a model with very good performance that does not use too many resources. More realistically, as models keep growing they will keep consuming more and more resources, threatening our existence by polluting the Earth, unless we take action. One might argue that along the way generative AI will help us make valuable discoveries, but even then the rate of pollution might outpace the rate of innovation. This lens becomes critical as more and more data centres come online: OpenAI is opening five new data centres, and countless others are under construction in America alone.
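For readers unfamiliar with scaling laws, here is a minimal sketch of the idea, assuming a Chinchilla-style power law in which loss falls as parameter count grows; the constants are illustrative assumptions, not fitted values from any particular paper.

```python
# Minimal sketch of a power-law scaling curve (Chinchilla-style):
# predicted loss = irreducible + scale / N**alpha, where N is the
# number of parameters. The constants are illustrative assumptions,
# not fitted values from any particular paper.

def predicted_loss(params_billions: float,
                   irreducible: float = 1.7,
                   scale: float = 400.0,
                   alpha: float = 0.34) -> float:
    """Loss predicted for a model with the given parameter count (in billions)."""
    n = params_billions * 1e9
    return irreducible + scale * n ** (-alpha)

for billions in (1, 10, 100, 1000):
    print(f"{billions:>5}B parameters -> predicted loss {predicted_loss(billions):.3f}")
```

The pattern it prints is the crux of the worry: each additional tenfold increase in model size buys a smaller and smaller improvement in loss, while the compute, energy and hardware bill keeps multiplying.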
The question right now is no longer whether to use AI, but how. How can we use AI to solve these very real problems, which are likely to keep growing as technology progresses? We need incentives to develop more efficient models rather than simply bigger ones, we need to rely on more renewable sources of energy, and we need transparency from companies. We might be the generation that experiences the very real effects of our society’s inaction. For this reason, we need to regulate AI’s development before the damage becomes irreversible.
Which cost are we prepared to pay? That of convenience or that of inaction?

