Saying Please to ChatGPT Cost OpenAI Millions

Introduction

OpenAI CEO Sam Altman recently revealed an unexpected source of operational costs: politeness. Simple phrases like “please” and “thank you,” though courteous, contribute to increased energy usage and higher computational expenses when users interact with ChatGPT. While each instance may seem trivial, collectively, these extras add up—costing OpenAI tens of millions of dollars each year in electricity and processing power.

So how can users interact with ChatGPT more sustainably without sacrificing clarity or effectiveness? In this article, we break down:

  • How much these polite phrases actually cost
  • Why they increase energy consumption
  • How to structure your prompts to reduce computational load
  • Examples of concise, efficient prompt writing

By understanding the real-world energy impact of our digital conversations, users can make small adjustments that contribute to more sustainable AI usage—without losing the power and productivity of tools like ChatGPT.

How Much Does Saying “Please” and “Thank You” Really Cost OpenAI?

Every interaction with ChatGPT consumes a measurable amount of energy—specifically, in watt-hours (Wh). Under typical conditions, a standard ChatGPT query uses approximately 2.9 Wh of energy. When scaled across OpenAI’s estimated 1 billion daily queries, this amounts to roughly 2.9 million kilowatt-hours of electricity consumed each day.

However, small additions to prompts, such as the polite phrases “please” and “thank you,” may seem harmless but introduce more tokens into the query. In AI terms, tokens are the smallest units of text a language model processes, often corresponding to a word or part of a word. More tokens mean more data to compute, which results in higher energy use.
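
To make this concrete, the snippet below uses the open-source tiktoken tokenizer to count the tokens in a direct versus a polite version of the same request. Exact counts vary by model and tokenizer, so the numbers are illustrative rather than OpenAI’s internal figures.

```python
# A minimal sketch comparing token counts for a direct and a polite prompt,
# using the open-source tiktoken tokenizer. Counts depend on the tokenizer,
# so treat these as illustrative figures.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

direct = "What's the weather forecast for tomorrow?"
polite = "Can you please tell me the weather forecast for tomorrow? Thank you for your help!"

for label, prompt in [("direct", direct), ("polite", polite)]:
    print(f"{label}: {len(enc.encode(prompt))} tokens")

# The polite version carries several extra tokens, and every extra token is
# additional data the model has to process.
```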

Energy Cost Comparison

Standard Query: ~2.9 Wh

Polite Query (with “please” or “thank you”): ~3.5 Wh

Additional Energy per Polite Query: +0.6 Wh

While 0.6 Wh might appear insignificant on its own, when extrapolated over billions of interactions, the impact is substantial—adding tens of millions of dollars in annual energy costs for OpenAI.
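
As a rough sense-check on that claim, the arithmetic can be sketched in a few lines. The electricity price below is an assumed average, and applying the 0.6 Wh overhead to every one of the estimated one billion daily queries is a simplification, so the result is only an order-of-magnitude figure.

```python
# Back-of-the-envelope estimate using the figures cited in this article.
# PRICE_PER_KWH_USD is an assumed average electricity price, not an
# OpenAI-reported number.
QUERIES_PER_DAY = 1_000_000_000       # estimated daily ChatGPT queries
BASELINE_WH_PER_QUERY = 2.9           # standard query
EXTRA_WH_PER_POLITE_QUERY = 0.6       # ~3.5 Wh polite minus ~2.9 Wh standard
PRICE_PER_KWH_USD = 0.12              # assumed average price per kWh

baseline_kwh_per_day = QUERIES_PER_DAY * BASELINE_WH_PER_QUERY / 1000
extra_kwh_per_day = QUERIES_PER_DAY * EXTRA_WH_PER_POLITE_QUERY / 1000
extra_cost_per_year_usd = extra_kwh_per_day * 365 * PRICE_PER_KWH_USD

print(f"Baseline energy:     {baseline_kwh_per_day:,.0f} kWh/day")    # ~2.9 million kWh/day
print(f"Politeness overhead: {extra_kwh_per_day:,.0f} kWh/day")       # ~600,000 kWh/day
print(f"Politeness cost:     ${extra_cost_per_year_usd:,.0f}/year")   # ~$26 million/year under these assumptions
```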

OpenAI CEO Sam Altman has highlighted that this increased computational effort isn't just a technical concern—it’s a growing operational and environmental issue. For context, a single Google search uses only about 0.3 Wh, making ChatGPT queries up to ten times more energy-intensive. That makes efficiency not just a convenience, but a sustainability concern.

Why Do People Say “Please” and “Thank You” to ChatGPT?

Despite interacting with a machine, many users still choose to use polite language when talking to ChatGPT. A breakdown of user behavior shows that this courtesy stems from a mix of habit, humor, and awareness:

  1. Naturally Polite Users (82%)

These users apply the same manners in digital interactions as they do with people. Their motivation is habitual courtesy—they’re simply wired to be respectful, whether addressing humans or AI. This politeness often happens subconsciously and reflects social conditioning.

  2. Humorous or Cautious Users (18%)

Some users joke about the idea of an AI uprising and choose to be polite “just in case.” While humorous in nature, this behavior is influenced by pop culture portrayals of powerful AI systems and reflects a blend of caution and entertainment.

  3. Energy-Conscious or Technical Users (<1%)

A small fraction of users actively avoid using unnecessary words to reduce computational and energy costs. They tend to keep prompts short, efficient, and focused—driven by a technical understanding of how token usage affects energy consumption.

Politeness in AI Interactions: Understanding User Behavior

Despite growing awareness of the energy impact associated with longer prompts, a significant number of users continue to interact with ChatGPT using polite language. A recent survey revealed that 67% of U.S. users commonly include phrases like “please” and “thank you” in their AI interactions.

So, why do users choose to be polite even when it’s not technically required?

18% of respondents admitted they use polite language partly as a humorous precaution—acknowledging fears of a future where AI becomes more dominant, and preferring to stay on its good side.

The remaining 82% said politeness is simply a matter of habit—an extension of their everyday manners, whether they're speaking to humans or machines.

While this social tendency is understandable, these seemingly minor additions have real-world consequences. The use of extra tokens increases energy consumption and processing time. The good news is that efficiency and effectiveness don’t have to come at the cost of politeness. With thoughtful prompt design, users can communicate clearly and sustainably.

ChatGPT’s Energy Consumption Based on Prompt Style

The way users phrase their prompts has a direct impact on how much energy ChatGPT consumes. Here's how different prompt styles compare in terms of energy use, token count, and operational cost:

1. Direct & Efficient Prompts

Energy Consumption: ~2.9 Wh per query
Token Count: Around 10–15 tokens
Cost Impact: Baseline usage; most energy-efficient and cost-effective
Description: These are short, straightforward queries with no extra wording. They represent the most sustainable way to interact with ChatGPT.

2. Polite Prompts (e.g., “please”, “thank you”)

Energy Consumption: ~3.5 Wh per query
Token Count: Around 20–30 tokens
Cost Impact: Adds tens of millions in annual energy costs for OpenAI
Description: Including polite language increases the number of tokens processed, thereby raising both energy consumption and computational load.

3. Wordy or Repetitive Prompts

Energy Consumption: Between 3.8 and 4.2 Wh per query
Token Count: 30+ tokens
Cost Impact: Significantly more energy and time required for each query
Description: Long-winded or redundant phrasing leads to substantial increases in processing effort, making these the most resource-intensive queries.
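
Reading the three styles above as rough bands, one can sketch a toy estimator that maps a prompt’s token count onto an approximate per-query energy figure. The estimate_query_energy_wh helper and its thresholds are hypothetical and simply mirror this article’s illustrative numbers; they are not measured values.

```python
# A crude estimator mapping a prompt's token count onto the energy bands
# described above. The thresholds and Wh figures mirror this article's
# illustrative numbers; they are not measured values.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

def estimate_query_energy_wh(prompt: str) -> float:
    """Return an approximate per-query energy figure for a prompt."""
    n_tokens = len(enc.encode(prompt))
    if n_tokens <= 15:
        return 2.9   # direct & efficient
    if n_tokens <= 30:
        return 3.5   # polite
    return 4.0       # wordy or repetitive (midpoint of the 3.8–4.2 Wh band)

print(estimate_query_energy_wh("How do I make lasagna?"))
print(estimate_query_energy_wh(
    "Could you please tell me how to make lasagna, and thank you for "
    "sharing your recipe with me?"))
```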

Tens of millions of dollars well spent — you never know.

Sam Altman, CEO of OpenAI

How to Use ChatGPT More Sustainably: Efficient Prompt Examples

Reducing token count helps lower the energy demands of each ChatGPT query. Here are some practical examples of how to ask better, cleaner, and more efficient questions:

1. Be Clear and Direct

Avoid unnecessary polite phrases at the beginning or end of your prompts.

Less Efficient: “Can you please tell me the weather forecast for tomorrow? Thank you for your help!”

More Efficient: “What’s the weather forecast for tomorrow?”

2. Keep It Short and Focused

Use fewer words to get the same result—this reduces token use and processing time.

Less Efficient: “Could you please tell me how to make lasagna, and thank you for sharing your recipe with me?”

More Efficient: “How do I make lasagna?”

3. Avoid Repetition

Multiple questions that repeat context lead to higher processing loads.

Less Efficient: “Can you provide me with details on the new iPhone? Also, what are the specs of the camera? And thank you for this information!”

More Efficient: “Tell me the details and camera specs of the new iPhone.”

4. Skip the Small Talk

A friendly tone is fine, but casual chat adds unnecessary tokens.

Less Efficient: “Please tell me the best hiking trails in Colorado, and thank you for this information! I really appreciate it!”

More Efficient: “What are the best hiking trails in Colorado?”

By keeping prompts concise and purposeful, users can maintain meaningful interactions while minimizing environmental and operational impact. It’s a small change in habit that, when multiplied by millions of users, can make a significant difference.
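
Users who want to automate this habit could add a small pre-processing step that strips common courtesy filler before a prompt is sent. The trim_prompt helper and its phrase list below are a hypothetical sketch, not part of any OpenAI tooling, and a practical version would need a broader phrase list and more careful clean-up.

```python
# A minimal sketch of trimming courtesy filler before sending a prompt.
# The phrase list and trim_prompt helper are illustrative only and are
# not part of any OpenAI tooling.
import re

FILLER_PATTERNS = [
    r"\b(can|could) you please\b",
    r"\bplease\b",
    r"\bthank(s| you)\b[^.!?]*[.!?]?",   # "thank you for ...", "thanks!"
    r"\bi really appreciate it\b[.!?]?",
]

def trim_prompt(prompt: str) -> str:
    """Remove common polite filler and tidy leftover whitespace and punctuation."""
    text = prompt
    for pattern in FILLER_PATTERNS:
        text = re.sub(pattern, "", text, flags=re.IGNORECASE)
    text = re.sub(r"\s{2,}", " ", text)           # collapse doubled spaces
    text = re.sub(r"\s+([,.!?])", r"\1", text)    # no space before punctuation
    return text.strip(" ,")

print(trim_prompt("Can you please tell me the weather forecast for tomorrow? Thank you!"))
# -> "tell me the weather forecast for tomorrow?"
```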

Conclusion

While using polite language with ChatGPT reflects natural human behavior, it's important to understand the unseen costs behind these seemingly minor interactions. As OpenAI CEO Sam Altman noted, adding just a few extra words like “please” or “thank you” can collectively lead to millions of dollars in additional operational expenses due to increased energy use.

Fortunately, users can help mitigate these effects. By crafting concise and purposeful prompts—avoiding redundancy and excessive politeness—anyone can reduce their energy footprint while still benefiting fully from ChatGPT’s capabilities.

If you're interested in diving deeper into how AI systems work and learning how to interact with them more efficiently, consider enrolling in an AI Certification program. For those focused on data, a Data Science Certification can provide foundational knowledge essential for building and understanding AI. Additionally, the Marketing and Business Certification is a great way to leverage AI insights to elevate your business strategies.

Start your journey toward smarter, more sustainable AI usage today.

