Publishers facing existential threat from AI, Cloudflare CEO says
Ten years ago, Google crawled two pages for every visitor it sent to a publisher. Today, Anthropic crawls 60,000.
Cloudflare CEO Matthew Prince has a vested interest in playing up the problem (his company is releasing a tool to prevent AI bot scraping), but the numbers he shared are damning all the same.
Ten years ago, Google crawled two pages for every visitor it sent to a publisher.
Today, according to Cloudflare's stats, Anthropic crawls 60,000; OpenAI crawls 1,500; Google crawls 18. In other words, people still get the benefit of the information publishers provide, but the publishers themselves are cut off from much-needed ad, subscription, or donation revenue.
As Prince says:
"People trust the AI more over the last six months, which means they're not reading original content [...] The future of the web is going to be more and more like AI, and that means that people are going to be reading the summaries of your content, not the original content."
Make no mistake: this is not limited to chatbots like ChatGPT. These AI features are making their way into email inboxes, where they insert themselves into the relationship between newsletters and their readers, and into web browsers themselves.
This compounds existing trends. Publisher revenue has already been squeezed for decades, and AI interfaces that strip-mine publishers will make the problem much worse. In a world where people read the reporting but never form even a pageview-deep relationship with the publisher that funded it, how does journalism continue to function?
[Link]