Remember Sam Altman’s idea of AI top-ups, like an electricity bill? Anthropic is testing it in reality
Anthropic is moving to a usage-based pricing model instead of fixed plans.
It’s cheaper for small teams to start, but heavy users may end up paying more.
This shows AI tools are now being priced based on actual usage, like a utility.
Anthropic has made major changes to how it charges enterprise customers using its Claude AI service. The company is shifting from older fixed-price plans to a system where businesses pay a smaller fee per user seat and then commit separately to expected monthly usage. Earlier discounts have been removed, which could leave some companies spending more overall. However, the base price per user is now lower than before, making it easier for small teams to get started. The change signals that AI companies are moving toward charging for actual use rather than flat subscription fees, as more businesses around the world adopt AI tools.
The company says the new pricing model aims to make access to Claude more flexible for business customers while tying cost more closely to how much the service is used each month. Instead of paying only a fixed monthly amount, firms will now pay a lower fee per user and also commit to expected usage in advance. This approach lets the company match costs more fairly with how much each team uses AI tools for tasks like coding, writing, and customer support.
At the same time, the company has removed earlier discounts on its API access, which could make things more expensive for large organisations. Some customers who expected low monthly use now have to agree to higher spending estimates, even if they never consume that much. This has left some companies worried that their overall costs might rise, especially when their usage is inconsistent.
However, the starting price per user has been reduced compared to older plans, which makes it cheaper for small teams and new users to get started. For example, in some plans, the price per user has been cut from $30 to $15.
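As a rough illustration of how the seat-plus-usage model compares with a flat per-seat plan, here is a minimal sketch. Only the $30 and $15 seat prices come from the article; the committed-usage amounts are hypothetical numbers chosen for illustration, not Anthropic's actual rates:

```python
def old_plan_cost(seats, seat_price=30.0):
    # Older fixed-price plan: a flat per-seat fee with usage included.
    return seats * seat_price

def new_plan_cost(seats, committed_usage, seat_price=15.0):
    # New model: a lower per-seat fee plus a committed monthly usage spend.
    return seats * seat_price + committed_usage

seats = 10
# A small team committing to modest usage pays less under the new model...
print(new_plan_cost(seats, committed_usage=100.0))  # 250.0 vs 300.0 before
print(old_plan_cost(seats))                         # 300.0
# ...but a team forced to commit to a higher usage estimate can pay more.
print(new_plan_cost(seats, committed_usage=200.0))  # 350.0
```

The break-even point is simply where the committed usage exceeds the per-seat saving, which is why small, light-usage teams benefit while heavy users may end up paying more.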
The company is also expanding its computing capacity by working with major cloud partners such as Amazon and Google, which supply custom AI chips, to handle growing demand from business customers.
Recent reports also say the company’s revenue has grown strongly as more industries adopt AI tools. The company noted that heavy usage by a small number of customers can consume computing resources disproportionately fast.
This is why it is adjusting pricing rules to balance demand and supply more carefully. In future updates, the company plans to improve efficiency so customers can get more stable performance at a lower cost.
Bhaskar is a senior copy editor at Digit India, where he simplifies complex tech topics across iOS, Android, macOS, Windows, and emerging consumer tech. His work has appeared in iGeeksBlog, GuidingTech, and other publications, and he previously served as an assistant editor at TechBloat and TechReloaded. A B.Tech graduate and full-time tech writer, he is known for clear, practical guides and explainers.