Why Saying “Please” to ChatGPT Might Be Costing OpenAI Millions

Uni24.co.za



OpenAI’s CEO reveals how human politeness is driving unexpected expenses — and sparking deeper conversations about our relationship with AI.


A Price on Politeness: ChatGPT’s Courteous Users Are Costing Big

In the age of artificial intelligence, every word matters — even the kind ones. According to OpenAI CEO Sam Altman, the seemingly harmless act of saying “please” and “thank you” to ChatGPT is costing the company tens of millions of dollars.

Altman responded to a light-hearted query on X (formerly Twitter) on April 16, noting:

“Tens of millions of dollars well spent — you never know.”

While his remark carried a touch of humor, it also opened the door to a much larger conversation about how humans engage emotionally and ethically with machines.


Why Are People So Polite to AI?

The viral discussion quickly revealed an intriguing divide: many users aren’t just being polite out of habit — they’re doing it with intent.


Some believe in playing it safe in case AI ever becomes sentient.

“It might remember how we treated it,” one commenter wrote.

Others, like software engineer Carl Youngblood, view AI courtesy as a form of personal growth:

“Treating AIs with courtesy is a moral imperative for me. I do it out of self-interest. Callousness in our daily interactions causes our interpersonal skills to atrophy.”

This sentiment aligns with findings from a December 2024 survey by Future, which found that 67% of Americans are polite to AI assistants. Of those, 55% said it’s simply the right thing to do, while 12% admitted they’re afraid mistreating AI could backfire someday.


The Hidden Costs of AI Interaction

While human kindness is welcome, it comes with a bill. Every ChatGPT query consumes energy, and every extra word, polite or not, adds tokens for the model to process.


A 2023 study by Digiconomist founder Alex de Vries estimated that a single ChatGPT query uses around 3 watt-hours of electricity. However, Josh You, a data analyst at Epoch AI, disputed that number, suggesting a much lower average of 0.3 watt-hours, citing improved AI efficiency.
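To put the two competing estimates in perspective, here is a back-of-the-envelope sketch in Python. Only the per-query energy figures (3 Wh and 0.3 Wh) come from the sources quoted above; the daily query volume and the share of a query spent on courtesies are illustrative assumptions, not reported numbers.

```python
# Back-of-the-envelope sketch: daily energy attributable to polite filler words.
# The per-query figures are the estimates quoted in the article; the query
# volume and polite-word fraction are hypothetical, chosen for illustration.

WH_PER_QUERY_HIGH = 3.0    # Alex de Vries (2023) estimate, Wh per ChatGPT query
WH_PER_QUERY_LOW = 0.3     # Josh You / Epoch AI estimate, Wh per query
QUERIES_PER_DAY = 1_000_000_000  # assumed daily query volume (hypothetical)
POLITE_FRACTION = 0.05     # assumed share of a query's tokens that are courtesies

def polite_kwh_per_day(wh_per_query: float) -> float:
    """Daily kWh attributable to polite filler, under the assumptions above."""
    return QUERIES_PER_DAY * wh_per_query * POLITE_FRACTION / 1000

print(f"High estimate: {polite_kwh_per_day(WH_PER_QUERY_HIGH):,.0f} kWh/day")
print(f"Low estimate:  {polite_kwh_per_day(WH_PER_QUERY_LOW):,.0f} kWh/day")
```

The tenfold gap between the two published estimates carries straight through the arithmetic, which is why the true cost of politeness is so hard to pin down.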


Still, questions remain. One user on X wondered:

“Why can’t OpenAI optimize the model to ignore or compress polite words like please and thank you?”

While the company hasn’t provided a technical solution for this yet, Altman recently stated that the cost of AI output has been decreasing tenfold annually as models and infrastructure improve.


OpenAI’s Road to Profitability Still Long Despite Explosive Growth

Even with tens of millions lost to politeness, OpenAI is on a fast track toward massive growth. The company expects to triple revenue this year to $12.7 billion, driven by ChatGPT’s popularity and widespread enterprise adoption.

However, OpenAI does not expect to be cash-flow positive until 2029, when it aims to hit $125 billion in annual revenue. In the meantime, it faces growing competition from companies like China’s DeepSeek, which are rapidly advancing their own large language models.
