This Is How Much Anthropic and Cursor Spend On Amazon Web Services

AI at scale isn't scaling.

[Ed Zitron]

Lots of detail here on what Ed calls the “sub-prime AI crisis”:

“Based on discussions with sources with direct knowledge of their AWS billing, I am able to disclose the amounts that AI firms are spending, specifically Anthropic and AI coding company Cursor, its largest customer.”

On the face of it, these numbers don’t look good. Ed’s analysis shows that Anthropic’s AWS compute spend is well over 100% of revenue.

The hallmark of a scalable startup is that growth decouples from cost. Early on, costs and revenue may track closely, but once scale kicks in, revenue grows exponentially while costs stay relatively flat. Or at least, that’s what investors hope for.

That’s not what’s happening here. Instead, costs appear to be rising more or less linearly with use.

At the same time, we’re seeing more and more (legitimate) worries about data security on AI services. Enterprise customers are being asked to integrate sensitive workflows into systems they can’t audit, hosted in jurisdictions they often can’t control. Many can’t, or won’t: it’s not just about preventing private data from being used for training; some organizations simply can’t share that level of information with an opaque cloud service. And that’s before you get to the environmental impact of these huge models and data centers.

If I were running one of these companies, I’d be thinking a lot about building amazing local models — and selling them. In this world, you’re not powering the queries, and you don’t need to worry about data security overhead. Large companies would run these models on their own infra — for example, to provide a common model for engineering teams while maintaining privacy — and pay a steep licensing fee for the privilege. Models (and maybe chips) could also be licensed to software companies and hardware manufacturers who want to build amazing new devices.

What’s particularly concerning is that so much value is tied up in these businesses that, if they topple, the shockwaves could ripple far beyond the tech sector. An unprofitable technology — no matter how transformative — should never be “too big to fail.” It’s in everyone’s interest for these models to become sustainable, and everyone in tech should reflect on how this was allowed to happen in the first place.

[Link]