Block News International


“Please” and “Thank You” Are Costing OpenAI Millions, Says Sam Altman

Staff Writer
Apr. 21, 2025
News
Politeness in AI prompts may seem harmless, but it’s adding millions to OpenAI’s compute bill and raising deeper questions about our relationship with artificial intelligence.
ChatGPT handles over 350 million weekly users and around a billion daily queries. With many users using polite language, even slight increases in prompt length significantly impact server load and energy use. (Emiliano Vittoriosi/Unsplash)
In a striking yet somewhat humorous revelation, OpenAI CEO Sam Altman recently confirmed that user politeness, such as saying “please” and “thank you” when interacting with ChatGPT, is costing the company tens of millions of dollars annually. The comment, made in response to a user on X, came with a caveat: "Tens of millions of dollars well spent--you never know," Altman wrote, underscoring the unique challenges at the intersection of human behavior and artificial intelligence.

This unexpected cost stems from the token-based system that powers ChatGPT’s responses. In large language models like GPT-4, each word or character used in a prompt is counted as a “token,” and these tokens directly influence the computing power required to generate a response. Simple niceties like “thank you” or “could you please” add several tokens to a query. Multiply this across hundreds of millions of interactions every day, and the results are no longer trivial.

ChatGPT currently serves over 350 million weekly active users, with an estimated one billion queries processed daily. Given the frequency with which users include polite language, even a small average increase in prompt length has huge implications for server load and energy consumption. These additional tokens result in increased processing times, higher GPU usage, and greater energy demand, ultimately leading to spiraling infrastructure costs.
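The scale effect described above can be sketched with a rough back-of-envelope calculation. Only the query volume and the 70% politeness figure come from this article; the extra-token count and the per-token compute cost below are illustrative assumptions, not OpenAI figures:

```python
# Back-of-envelope estimate of the annual cost of polite filler tokens.
# Query volume and politeness share are from the article; the extra-token
# count and per-token cost are hypothetical, for illustration only.

DAILY_QUERIES = 1_000_000_000   # ~1 billion queries per day
POLITE_SHARE = 0.70             # ~70% of users include polite language
EXTRA_TOKENS = 3                # e.g. "could you please" adds a few tokens
COST_PER_1K_TOKENS = 0.03       # assumed blended compute cost in USD

def annual_politeness_cost(daily_queries=DAILY_QUERIES,
                           polite_share=POLITE_SHARE,
                           extra_tokens=EXTRA_TOKENS,
                           cost_per_1k=COST_PER_1K_TOKENS):
    # Extra tokens generated per day by courteous phrasing
    extra_per_day = daily_queries * polite_share * extra_tokens
    # Convert to cost: tokens / 1000 * price per 1k tokens, then annualize
    daily_cost = extra_per_day / 1000 * cost_per_1k
    return daily_cost * 365

# With these assumed numbers, the total lands in the tens of millions,
# consistent with the figure Altman cited.
print(f"${annual_politeness_cost():,.0f} per year")
```

The point of the sketch is that each individual term is tiny; only the billion-query multiplier makes the sum material.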

These costs are not just monetary. The environmental toll of operating large-scale AI models is increasingly under scrutiny. Data centers require substantial energy not only to process queries but also to cool servers, often through water-intensive cooling systems.

Surprisingly, the reason many users address ChatGPT with such courtesy isn’t just habit. A 2025 study found that 70% of users include polite language in their interactions with AI, and 12% do so “just in case AI becomes sentient one day.” Others do it to model positive behavior for children or to maintain a sense of etiquette, even when speaking to a machine.

Behavioral AI expert Dr. Lance B. Eliot has suggested that politeness may offer practical advantages when interacting with generative AI. According to his analysis, using courteous language in prompts can influence the model to produce more cooperative and nuanced responses. In essence, being polite to ChatGPT may yield more helpful answers, especially in customer service-style queries or emotionally sensitive contexts.

Altman’s comment highlights a growing tension in the AI world: balancing efficient system performance with human-like interaction. On one hand, eliminating fluff words like “please” could help reduce compute loads and lower costs. On the other hand, stripping down language to bare commands might degrade the user experience and undermine the sense of conversational engagement that makes ChatGPT popular in the first place.

For OpenAI and other AI developers, the question becomes whether to discourage polite language or to accept the cost as part of maintaining emotionally intelligent interaction design. For now, Altman seems to be leaning toward the latter. His lighthearted tone in acknowledging the "cost of courtesy" suggests that OpenAI sees value in preserving natural, human-style communication, even at a price.

The revelation has sparked debate across the tech world about the hidden costs of AI and the unexpected consequences of human-computer interaction. Some critics argue that users should be encouraged to use more concise language, especially given the environmental impact. Others worry this could set a precedent for designing systems that favor machine efficiency over user experience.

Moreover, the fact that small, subconscious acts of politeness can cost millions raises ethical and design questions. Should AI platforms educate users on how to minimize resource usage? Or should AI adapt to human behavior, regardless of its inefficiencies?

For now, OpenAI appears to be absorbing the costs as part of its broader mission to create more useful and empathetic AI tools. But as user engagement grows and global demand for AI scales, the “cost of being nice” may soon become a more urgent issue for developers, investors, and policy makers alike.