Being too nice to ChatGPT is ruining the planet: Why you need to stop saying ‘please’ and ‘thank you’ to chatbots
Being nice is free… unless you’re talking to an AI. Then it costs tens of millions in electricity and a bit of the planet
In a world where ChatGPT is practically a household name and AI is embedded in every crevice of life, we’ve all started treating these chatbots like friends, as entities that deserve a ‘please’ and ‘thank you’ for every query asked and answered. But here's something to muse on: what if being polite to your AI is actually doing more harm than good? Turns out, those little pleasantries cost far more electricity than you might think, according to Sam Altman, CEO of OpenAI, and all that extra power comes with a climate footprint.
AI and climate change
According to a survey conducted in late 2024, 67% of US respondents said they are consciously polite to their AI chatbots. More than half of them (55%) reported doing it because, well, it just feels like the right thing to do, while 12% confessed they do it to “appease the algorithm in case of an AI uprising.” In humanity's defence, politeness is basically hardwired into our DNA, but the OpenAI CEO has now pointed out that this simple act of being nice is actually costing tens of millions of dollars in electricity.
It all started with a post on X by user @tomieinlove, who asked, “I wonder how much money OpenAI has lost in electricity costs from people saying ‘please’ and ‘thank you’ to their models.” Altman responded with a tongue-in-cheek reply: “Tens of millions of dollars well spent — you never know.”
If you’re scratching your head and wondering how a couple of simple words could possibly have such a massive impact, let’s break it down. AI models like ChatGPT aren’t just some cute program running on a laptop; they’re hungry, electricity-guzzling beasts. Running these models requires significant computational power, and guess what? All that energy has to come from somewhere. Every extra word you type, and every extra word the model generates in reply, is more work for those servers.
Research by the University of California and The Washington Post found that generating a single 100-word email with an AI chatbot uses 0.14 kilowatt-hours of electricity, enough to power 14 LED light bulbs for a full hour. On top of that, data centres, which power AI services like ChatGPT, already consume about 2% of the world’s electricity, and that share is only expected to grow as AI becomes more embedded in every corner of life.
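If you want a feel for how a handful of extra words could plausibly add up to Altman’s “tens of millions”, here is a back-of-envelope sketch in Python. It borrows only the 0.14 kWh figure above and assumes energy scales roughly with the number of words processed; the extra word count, message volume and electricity price are invented purely for illustration, not figures from OpenAI.

# Rough back-of-envelope sketch: how a few polite words might scale up.
# Only the 0.14 kWh per 100-word email figure comes from the research
# cited above; every other number is an illustrative assumption.

KWH_PER_100_WORD_EMAIL = 0.14           # figure cited above
WORDS_PER_EMAIL = 100

EXTRA_POLITE_WORDS = 4                  # assumption: 'please' and 'thank you' add ~4 words
POLITE_MESSAGES_PER_DAY = 100_000_000   # assumption: 100 million polite messages a day
USD_PER_KWH = 0.10                      # assumption: rough electricity price

# Assume energy use scales roughly with the number of words processed.
extra_kwh_per_message = KWH_PER_100_WORD_EMAIL * EXTRA_POLITE_WORDS / WORDS_PER_EMAIL
extra_kwh_per_year = extra_kwh_per_message * POLITE_MESSAGES_PER_DAY * 365
extra_cost_per_year = extra_kwh_per_year * USD_PER_KWH

print(f"Extra energy per polite message: {extra_kwh_per_message:.4f} kWh")
print(f"Extra energy per year: {extra_kwh_per_year:,.0f} kWh")
print(f"Extra cost per year: ${extra_cost_per_year:,.0f}")

With those made-up inputs, the sketch lands at roughly $20 million a year, which is in the same ballpark Altman joked about, though the real figure depends entirely on numbers OpenAI hasn’t published.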
So, what’s the takeaway here? Well, the next time you're asking your AI for help, consider how often you're interacting with it and whether it's really necessary. In a world where every click and query adds to energy consumption, maybe it's time to think about what this dependency actually costs. Maybe next time, just write that email, article or WhatsApp birthday message yourself.
