The etiquette of ChatGPT: Should you say please and thank you to AI?

It used to cost nothing to be polite - but that's certainly no longer the case in the age of GenAI.

We politely asked ChatGPT to illustrate this article and here's what it came up with...

Sam Altman’s recent statement on the true cost of saying “please” and “thank you” to OpenAI (millions, apparently), feeds comfortably into a narrative where AI is consuming resources faster than it is driving innovation and growth in the global economy. But is this really true? Is AI an energy-guzzling drain on resources and should users be using it sparingly, limiting themselves to short, clipped sentences?

In April a curious X user (@tomieinlove) wrote: "I wonder how much money OpenAI has lost in electricity costs from people saying ‘please’ and ‘thank you’ to their models?"

Sam Altman, chief executive officer of OpenAI, then replied: "Tens of millions of dollars well spent – you never know."

This short response embeds both the suggestion that politeness is an expense, and that this might not be such a bad thing after all.

"I'm feeling that the AI agenda is beginning to shift very much towards the ethical position,” said Tony Boobier, author of AI and the Future of the Public Sector: The Creation of Public Sector 4.0. "Energy, of course, is one part of the ethical question. I think also that there's a general lack of understanding about the true cost of AI, and in fact the true cost of technology.

"For example, I've just got back from Washington, DC and it was cherry blossom time. For the festival there were half a million people there, all principally taking photographs of the same cherry blossoms. And I thought a little about the impact of that, just in terms of the cost of storing and sharing these images. I think it was symptomatic of a wider issue: a lack of awareness of the cost of this technology which everybody seems to be using."

The environmental costs of niceness

It is increasingly acknowledged that AI has the potential both to sap the world dry of energy through ever-larger, ever-more-powerful data centres, and to vastly improve resource management, accelerating innovation in energy distribution and harvesting. Recognising this dual role, the International Energy Agency (IEA) has launched the “Energy for AI and AI for Energy” initiative, exploring how AI can drive innovation and even manage its own resource requirements.

Training Large Language Models (LLMs) is undoubtedly resource-intensive. Training GPT-3 is estimated to have consumed 1,287 MWh of electricity and emitted 552 metric tons of CO2. Inference also requires energy: asking GPT-4o a question is estimated to consume roughly 0.3 watt-hours, about the energy needed to run a 10W lightbulb for just under two minutes.
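The lightbulb comparison is simple unit conversion. A quick sketch, using only the per-query and bulb-wattage figures cited above:

```python
# Back-of-the-envelope check of the per-query figures cited above.
# 0.3 Wh per GPT-4o query and the 10 W bulb are the article's numbers;
# this simply converts one into the other.

QUERY_WH = 0.3   # estimated energy per GPT-4o query, in watt-hours
BULB_W = 10      # power draw of a small lightbulb, in watts

# Minutes the bulb could run on one query's worth of energy:
bulb_minutes = QUERY_WH / BULB_W * 60
print(f"One query ≈ a {BULB_W} W bulb running for {bulb_minutes:.1f} minutes")
```

Running the numbers gives roughly 1.8 minutes per query, which is why "just under two minutes" is the fairer comparison.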

As more and more users query these systems and businesses build their own ad hoc AI systems, it’s easy to see how this apparently inconsequential figure can snowball. While data centres accounted for only 2% of global electricity demand in 2023, they are on a sharp trajectory to reach 9% by 2026.
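To see how the snowballing works, multiply the per-query figure by a large query volume. The 0.3 Wh figure is from the estimate above; the daily query count here is a hypothetical round number chosen purely for illustration, not a reported statistic:

```python
# How per-query energy snowballs at scale. The 0.3 Wh figure is the
# article's estimate; the daily query volume is HYPOTHETICAL, chosen
# only to illustrate the order of magnitude.

QUERY_WH = 0.3                  # watt-hours per query (cited estimate)
DAILY_QUERIES = 1_000_000_000   # hypothetical: one billion queries/day

daily_mwh = QUERY_WH * DAILY_QUERIES / 1_000_000  # Wh -> MWh per day
yearly_gwh = daily_mwh * 365 / 1_000              # MWh/day -> GWh per year
print(f"{daily_mwh:,.0f} MWh/day, ~{yearly_gwh:,.0f} GWh/year")
```

Under that assumption, "inconsequential" per-query energy adds up to hundreds of MWh a day and over a hundred GWh a year.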

Optimising AI: Reducing the impact of LLMs

Aside from the innovation and better resource management that AI itself can provide to reduce its energy consumption, a number of shifts already under way can drastically cut the energy consumption of the data centres powering AI: abandoning inefficient air cooling in favour of liquid cooling, reducing demand through data compression, infrastructure optimization, and edge computing.

At the same time, the cost of powering GPUs is falling and the efficiency per unit of computation is rising, thanks to strategies such as optimised hardware allocation, which ensures that low-energy CPUs, rather than energy-intensive GPUs, are used for lighter workloads. In parallel, some LLMs are being replaced by Small Language Models (SLMs), such as Microsoft’s Phi-4, Llama or Mistral, which are trained on smaller datasets and contain fewer parameters.

Chintan Mota, Director - Enterprise Technology at Wipro, explained: "We are seeing rapid developments in model efficiency (smaller, faster models like Llama 4 Scout), improved hardware (low-power ones like Nvidia's Grace Hopper chips), and optimized inference frameworks. Combine this with the shift towards clean energy (Google, Microsoft, AWS all making major datacenter investments), and we’ll very likely make this a non-issue in the very near future."

A greener future for GenAI

Meanwhile, the IEA confirms that even in high-growth scenarios, AI-related electricity demand will only “account for a relatively small share of total global electricity demand growth to 2030”.

In addition, power generation from green energy sources is rapidly increasing: the IEA expects 269 GW to be added to annual global renewable capacity by 2030. Solar PV and wind will account for 95% of all renewable capacity additions, while overall global renewable capacity will grow by more than 5,520 GW by 2030, significantly increasing the share of green energy in the mix. This represents an acceleration of more than 2.6 times the growth of the preceding six-year period.

Not only will the energy required to power increasingly efficient AIs be cleaner, but reports indicate it will be cheaper too. The latest report by the International Renewable Energy Agency highlights that solar PV, wind and hydropower saw the most significant price decreases in 2023, with the global average cost of electricity from solar PV falling by 12%, offshore wind and hydropower by 7%, and onshore wind by 3%.

So the cost of powering those “pleases” and “thank yous” seems set to drop sharply within a matter of years. This brings us to the question of the value of maintaining a basic level of politeness while interacting with AI, which, after all, learns by imitation.

"Is there any advantage from being rude to AI?" asks Boobier. 

Manners maketh the human

“Courtesy,” says Angèl Martinez, Senior Marketing Executive at risk management firm, Achilles Group Ltd, “is deeply ingrained in human communication; it shapes expectations, reduces friction, and fosters trust—even when speaking to a machine. In a world where AI becomes ubiquitous, preserving human-like interactions may help ease user adoption and acceptance.”

This is particularly relevant in the Public Sector where Boobier highlights the importance of empathy in dealing with the public: “I think empathy is programmable and will be based on the reaction or behaviour which we will teach machines.”

Nigel Russell, CEO of Xenergie Digital, raises concerns about the knock-on effect that poor-quality interactions with AI could have on the dynamics of mixed AI-human teams: “We have seen how social media has affected the way we interact socially, so let’s say that we choose to speak transactionally to an AI team member – would that spill over into how we interact with the humans on the team too? Would we become more machine-like and less human?

"In a seamless human-AI environment, the AI element needs to be fully integrated, and this means communicating with it as we would with a team member. The future of high-performing organizations will be driven by the power of human communication. AI really highlights the need for better communication."

With systems that learn dynamically from user interactions, there is a real risk that AIs trained without politeness will adopt communication patterns that strip away warmth, transforming what could have been a virtuous cycle of pointless but pleasant politeness into cold, computational exchanges.

As Alberto Stecca, Founder and CEO of Silla Industries, an Italian e-mobility pioneer, puts it: "If expressing gratitude to an AI becomes a matter of cost, perhaps the true issue lies not in energy consumption, but in our conception of progress.

"Clean energy will inevitably arrive, servers will grow more efficient, yet the greater risk is that we ourselves may grow more barren. If courteous speech is perceived as wasteful, it may be humanity — rather than artificial intelligence — that is most in need of a software update."

As software providers continue to pursue more user-friendly and accessible tools, we must ask: is reducing human politeness to heartless information transfer truly a desirable outcome?

Have you got a story or insights to share? Get in touch and let us know. 

Follow Machine on X, BlueSky and LinkedIn