Anthropic’s revenue growth has accelerated sharply, with the company reporting an annualized run rate of roughly $30 billion, up from about $9 billion at the end of 2025. The increase, more than a tripling of revenue in just a few months, points to sustained demand for large-scale AI services, particularly among enterprise customers. According to the company, more than 1,000 clients are now spending over $1 million annually, indicating that adoption is being driven less by experimentation and more by operational use cases.
This puts Anthropic ahead of OpenAI in terms of reported run-rate revenue, at least for now. OpenAI recently disclosed monthly revenue of around $2 billion, or approximately $24 billion on an annualized basis. While direct comparisons are complicated by differences in accounting methods and revenue definitions, the broader takeaway is that both companies are scaling rapidly, with Anthropic currently expanding at a faster pace.
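Run-rate figures like these are straightforward extrapolations: the most recent monthly revenue multiplied by twelve. A minimal sketch of that arithmetic, using the roughly $2 billion monthly figure cited above (the helper function name is illustrative, not a standard):

```python
def annualized_run_rate(monthly_revenue: float) -> float:
    """Extrapolate one month's revenue to a twelve-month run rate."""
    return monthly_revenue * 12

# OpenAI's reported ~$2B/month implies roughly $24B annualized.
openai_annualized = annualized_run_rate(2e9)
print(openai_annualized)  # → 24000000000.0
```

Because the extrapolation assumes the latest month repeats unchanged, run-rate comparisons flatter whichever company grew most recently, which is one reason such figures are best read as momentum indicators rather than forecasts.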
The competitive landscape is also shifting in terms of strategy. OpenAI has been placing increased emphasis on enterprise deployments, particularly around coding-focused AI systems and agent-based workflows. Anthropic, meanwhile, appears to be leaning heavily into infrastructure expansion to support continued growth in demand for its Claude models.
To that end, the company has entered into a new agreement with Google and Broadcom to secure multiple gigawatts of next-generation TPU capacity, expected to come online starting in 2027. This level of compute investment reflects the broader trend across the AI sector, where access to hardware is becoming as critical as model development itself. Training and deploying advanced models now requires long-term infrastructure commitments that extend years into the future.
Despite the new partnership, Anthropic continues to rely on a mix of providers. Its models currently run on several hardware platforms, including AWS Trainium chips, Google TPUs, and NVIDIA GPUs. Amazon remains its primary cloud partner, suggesting that diversification rather than exclusivity is the current approach. One notable aspect of Anthropic’s positioning is that its Claude models are available across all three major cloud ecosystems—Amazon Web Services, Google Cloud, and Microsoft Azure—which could appeal to enterprises looking to avoid vendor lock-in.
Taken together, these developments highlight how quickly the AI infrastructure race is evolving. Revenue growth alone does not fully capture the competitive dynamics; access to compute, distribution across cloud platforms, and enterprise integration are increasingly shaping how these companies differentiate themselves. Anthropic’s recent numbers suggest momentum, but the gap between leading AI providers remains narrow and subject to rapid change.
