Three major AI stories landed on May 4, 2026. Anthropic's annualised revenue crossed the $30 billion mark, IBM opened its Think 2026 conference in Boston, and DeepSeek shipped V4 — the largest open-weight model ever released.
Anthropic Overtakes OpenAI in Annual Revenue
For the first time, Anthropic's annualised recurring revenue has surpassed OpenAI's. Anthropic reached a $30 billion annualised run rate in early May 2026, while OpenAI trailed at $24 billion. The milestone was reported alongside news that Anthropic had finalised a $1.5 billion joint venture with Blackstone, Hellman & Friedman, and Goldman Sachs — with Apollo Global Management, General Atlantic, Leonard Green, GIC, and Sequoia Capital also participating. The vehicle is structured as a forward-deployed enterprise services firm that embeds Claude directly into the operations of private equity-backed companies. Anthropic CFO Krishna Rao described it as a response to enterprise demand "significantly outpacing any single delivery model." The joint venture gives Anthropic a distribution channel into PE portfolios at a scale no software vendor has had before.
IBM Think 2026 Opens in Boston
IBM's annual Think conference opened May 4–7 in Boston, drawing more than 5,000 senior business and technology leaders from over 80 countries. CEO Arvind Krishna used the opening keynote to preview IBM's most comprehensive set of enterprise AI announcements to date. Products unveiled include the next generation of IBM watsonx Orchestrate for multi-agent orchestration, IBM Confluent for streaming real-time data into AI systems, IBM Concert for intelligent operations, and IBM Sovereign Core — a platform that embeds governance policy at the infrastructure runtime level for regulated industries. IBM also announced IBM Bob, a new AI-powered software development solution covering the full SDLC from code generation through testing and deployment. IBM Granite 4.1, an 8 billion parameter model delivering performance comparable to 32 billion parameter MoE models, shipped alongside the conference announcements.
DeepSeek V4: 1.6 Trillion Parameters, Frontier Pricing Undercut
DeepSeek shipped V4 Flash and V4 Pro as preview models on April 24, 2026, with broad coverage landing May 4. The Pro model carries 1.6 trillion total parameters (49 billion active), making it the largest open-weight model available — surpassing Kimi K2.6 (1.1 trillion) and more than doubling DeepSeek V3.2 (671 billion). Both models offer 1 million token context windows. V4 Flash is priced at $0.14 per million input tokens; V4 Pro at $0.145 per million input tokens and $3.48 per million output tokens. Both undercut GPT-5.4, GPT-5.5, Claude Opus 4.7, and Gemini 3.1 Pro on input pricing. DeepSeek claims V4 Pro's performance on coding benchmarks is "comparable to GPT-5.4," though both models trail on knowledge and multimodal tasks. V4 is text-only; neither version supports audio, video, or image input.
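To put the V4 Pro list prices in concrete terms, the per-request cost is simple arithmetic over the published per-million-token rates. The sketch below uses only the rates quoted above; the token counts in the example are illustrative assumptions, not measured workloads (the Flash output rate was not listed, so only Pro is modelled).

```python
# DeepSeek V4 Pro list prices as quoted: USD per 1 million tokens.
V4_PRO_INPUT_PER_M = 0.145
V4_PRO_OUTPUT_PER_M = 3.48

def request_cost_usd(input_tokens: int, output_tokens: int) -> float:
    """Cost of a single request at V4 Pro list prices."""
    return (input_tokens / 1_000_000) * V4_PRO_INPUT_PER_M + \
           (output_tokens / 1_000_000) * V4_PRO_OUTPUT_PER_M

# Hypothetical request: a 200k-token prompt (well inside the 1M context
# window) with a 4k-token completion.
cost = request_cost_usd(200_000, 4_000)
print(f"${cost:.4f}")  # roughly 4 cents — output tokens dominate
```

Note how the asymmetry between input ($0.145/M) and output ($3.48/M) pricing means long-context, short-answer workloads are where V4 Pro's discount is steepest.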
For a full AI model comparison across pricing and benchmarks, see the May 2026 AI News Hub.