Why saying please and thank you to ChatGPT costs tens of millions of dollars

OpenAI CEO Sam Altman recently revealed that users being polite to ChatGPT is leading to a surprising increase in operational costs for the company. It started with a post on X, where a user wondered aloud how much money OpenAI has lost in electricity costs from people saying "please" and "thank you" to its models.
Altman responded with a touch of humour and clarity. "Tens of millions of dollars well spent," he wrote. "You never know."
While it may seem humorous on the surface, there's a serious explanation behind the figures. Each time a user interacts with ChatGPT, even with a short or polite message, it triggers a full response from a powerful AI model, which requires significant computational power to generate language in real time.
According to Goldman Sachs Research, the increased power consumption of AI models is driven by the complexity of the computations involved. "The primary driver of this increased power demand is the growing computational intensity of AI workloads, particularly those related to generative AI and large language models," the research notes.
"These models require massive amounts of data processing, storage and computation, leading to a significant increase in energy consumption." As a result, power demand from data centers is expected to grow substantially, with AI representing around 19% of data center power demand by 2028.
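To see how short, polite messages can add up to a meaningful electricity bill, here is a rough back-of-envelope sketch. Every number in it is an illustrative assumption (per-query energy, message volume, and electricity price are not figures from OpenAI or Goldman Sachs), and electricity is only one component of the operational cost Altman alluded to:

```python
# Back-of-envelope estimate of the electricity cost of "polite" messages.
# All constants below are illustrative assumptions, not OpenAI data.

ENERGY_PER_QUERY_WH = 0.3        # assumed watt-hours consumed per ChatGPT response
POLITE_QUERIES_PER_DAY = 100e6   # assumed daily "please"/"thank you" messages worldwide
PRICE_PER_KWH_USD = 0.10         # assumed average electricity price per kilowatt-hour

def annual_cost_usd(queries_per_day: float,
                    wh_per_query: float,
                    usd_per_kwh: float) -> float:
    """Estimated yearly electricity cost for the given query volume."""
    kwh_per_day = queries_per_day * wh_per_query / 1000  # Wh -> kWh
    return kwh_per_day * 365 * usd_per_kwh

cost = annual_cost_usd(POLITE_QUERIES_PER_DAY, ENERGY_PER_QUERY_WH, PRICE_PER_KWH_USD)
print(f"~${cost:,.0f} per year in electricity alone")
```

Under these assumptions the electricity alone comes to roughly a million dollars a year; factoring in hardware depreciation, cooling, and far higher real-world message volumes makes a "tens of millions" total plausible.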
The light-hearted yet revealing exchange quickly went viral, sparking a broader conversation about the real-world cost of interacting with AI models.