OpenAI unveils ChatGPT API at very low prices

The company says it has reduced costs for the AI-powered chatbot by 90% since December.


OpenAI has released APIs for ChatGPT and Whisper (its speech-to-text model) at a price the company says is one-tenth that of its existing models.

“ChatGPT and Whisper models are now available on our API, giving developers access to cutting-edge language (not just chat!) and speech-to-text capabilities,” the company said in a blog post. “Through a series of system-wide optimizations, we’ve achieved 90% cost reduction for ChatGPT since December; we’re now passing through those savings to API users. Developers can now use our open-source Whisper large-v2 model in the API with much faster and cost-effective results.” 
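
For developers who want to try it, both models are reachable through OpenAI's standard API and client libraries. Below is a minimal sketch using the openai Python package as it worked at the time of this announcement; the model identifiers (gpt-3.5-turbo for ChatGPT, whisper-1 for the hosted large-v2 Whisper model), the placeholder API key and the sample file name come from OpenAI's own documentation and examples, not from this article.

```python
# Minimal sketch: one ChatGPT completion and one Whisper transcription,
# using the openai Python SDK interface available at launch (v0.27.x).
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder; use your own key

# ChatGPT: send a list of role-tagged messages, get a completion back.
chat = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Write a one-line product tagline."}],
)
print(chat["choices"][0]["message"]["content"])

# Whisper: upload an audio file and get the transcript text back.
with open("meeting.mp3", "rb") as audio_file:  # hypothetical file
    transcript = openai.Audio.transcribe("whisper-1", audio_file)
print(transcript["text"])
```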

Why we care. Chatbots for all! That’s good news for marketers, right? If OpenAI really did cut the cost of running its product by 90% in less than three months, that is an engineering feat nearly as impressive as the model itself. It is also possible the company is pricing the API as a loss leader to ward off potential competitors. Either way, keep an eye on usage: even at these prices, heavy traffic can quickly add up to a substantial bill.

Dig deeper: FTC warns tech companies about over-hyping AI claims

The cost. The ChatGPT API costs $0.002 per 1,000 tokens, the chunks of text (roughly three-quarters of a word each) that the model reads and generates. The Whisper large-v2 model is priced at $0.006 per minute of audio. That is a steep reduction, given that OpenAI CEO Sam Altman once estimated computing costs at a few cents per chat.
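
To put those numbers in perspective, here is a back-of-the-envelope estimate; the traffic figures are hypothetical, and only the per-unit rates come from the announcement.

```python
# Rough cost estimate at the announced rates (USD).
CHATGPT_PER_1K_TOKENS = 0.002   # ChatGPT API: $0.002 per 1,000 tokens
WHISPER_PER_MINUTE = 0.006      # Whisper large-v2: $0.006 per minute of audio

def chatgpt_cost(total_tokens: int) -> float:
    """Cost of ChatGPT usage given total prompt + completion tokens."""
    return total_tokens / 1000 * CHATGPT_PER_1K_TOKENS

def whisper_cost(audio_minutes: float) -> float:
    """Cost of transcribing the given number of audio minutes."""
    return audio_minutes * WHISPER_PER_MINUTE

# Hypothetical month: 1 million chats averaging 500 tokens each,
# plus 10,000 minutes of transcribed audio.
print(f"ChatGPT: ${chatgpt_cost(1_000_000 * 500):,.2f}")  # ChatGPT: $1,000.00
print(f"Whisper: ${whisper_cost(10_000):,.2f}")           # Whisper: $60.00
```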

Already in use. Snapchat is using the ChatGPT API to power My AI, a feature available to Snapchat+ subscribers. Study aid Quizlet is using it for Q-Chat, a virtual tutor. Instacart will soon roll out Ask Instacart, which will answer customer questions about recipes and food purchases with “shoppable” answers informed by product data from the company’s retail partners.

Also announced. OpenAI also clarified some earlier policies for its service. First, enterprise data submitted through the API will no longer be used for model training or other service improvements unless organizations opt in. The company also now requires that apps and services built on ChatGPT make clear to customers that they are interacting with a chatbot, including identifying ChatGPT-created content — news stories, blog posts and the like — as written by a bot.

About the author

Constantine von Hoffman
Staff
Constantine von Hoffman is managing editor of MarTech. A veteran journalist, Con has covered business, finance, marketing and tech for CBSNews.com, Brandweek, CMO, and Inc. He has been city editor of the Boston Herald, news producer at NPR, and has written for Harvard Business Review, Boston Magazine, Sierra, and many other publications. He has also been a professional stand-up comedian, given talks at anime and gaming conventions on everything from My Neighbor Totoro to the history of dice and boardgames, and is author of the magical realist novel John Henry the Revelator. He lives in Boston with his wife, Jennifer, and either too many or too few dogs.
