Read the UN report on the Attention Economy from 2 years ago - https://www.un.org/sites/un2.un.org/files/attention_economy_...
It says a simple thing - Content has vastly exceeded Eyeballs and Time available to consume it all.
So what happens when Supply exceeds Demand by a huge margin?
Attention Economy CEOs (basically the monopoly platforms) have handled this question by doing a fantastic job of convincing content creators that if your content is not getting eyeballs, either something is wrong with you, or you have to pay them for reach and visibility: buy more ads, or produce more engaging garbage. If it's engaging, they take a cut. If it's not, you pay them to get the algorithm to prop you up.
This is a parasitic model which is eating itself.
toomuchtodo•3h ago
Do I want to rely on search engines? Or do I want to live within the Anthropic or ChatGPT client with everything at my fingertips it has trained on (as well as tooling access via the MCP ecosystem)? Desktop->Browser->AI terminal is the rough story arc. Do I want an open web? We haven't had that for a long time; we've had Big Tech building moats and monopolies to siphon up all the value (most recently evident in the Google DOJ antitrust suit, their ad monopoly, potentially being forced to divest Chrome, their agreement for default search with Apple, and so on). Generative AI is a watershed moment where users can get some control back over how they consume and ETL the data they are interested in, and this is not great for incumbents.
lelanthran•2h ago
> Or do I want to live within the Anthropic or ChatGPT client with everything at my fingertips it has trained on (as well as tooling access via the MCP ecosystem)?
If advancements in hardware continue at a fairly rapid pace, we'll all eventually be using large models locally. We won't be talking about MCP and requests to OpenAI, Anthropic or Gemini, we'll be talking about which of the latest models to download.
Price-wise, right now, it's probably not that expensive to set up a free LLM on a server in your house that everyone in the household can use.
The only "moat" that there is, is the model weights. And the only way to get a trained model is by slurping content.
So, sure, while right now everyone is going to Anthropic and ChatGPT for answers, pretty soon everyone will be going to whoever has the most current model. And that's where Cloudflare can make a killing, because they are literally serving so much content, they can train their own model on the content that is passing through without any need to run a bot.
aurareturn•29m ago
I think where Cloudflare can make a killing is policing which AI agent can access which service, i.e., content providers use Cloudflare to block AI agents/bots, and AI agents/bots pay Cloudflare for easier access.
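The gating model described above can be sketched in a few lines. This is purely illustrative, assuming a known-bot list and a paid allowlist; the names (`KNOWN_AI_BOTS`, `PAID_BOTS`, `gate`) are hypothetical and not any real Cloudflare API:

```python
# Hypothetical edge rule: known AI crawlers on a paid allowlist get
# through, unpaid known crawlers are blocked, and ordinary browser
# traffic passes untouched.

KNOWN_AI_BOTS = {"GPTBot", "ClaudeBot", "PerplexityBot"}
PAID_BOTS = {"ClaudeBot"}  # bots whose operators paid for access

def gate(user_agent: str) -> str:
    """Return 'allow', 'block', or 'pass' for a request's User-Agent."""
    bot = next((b for b in KNOWN_AI_BOTS if b in user_agent), None)
    if bot is None:
        return "pass"  # normal visitor traffic, no bot match
    return "allow" if bot in PAID_BOTS else "block"

print(gate("Mozilla/5.0 (compatible; GPTBot/1.0)"))    # block
print(gate("Mozilla/5.0 (compatible; ClaudeBot/1.0)")) # allow
print(gate("Mozilla/5.0 (Windows NT 10.0)"))           # pass
```

Real bot detection is of course far more than User-Agent matching (fingerprinting, behavior, signed tokens), but the pay-for-access decision itself reduces to a lookup like this.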
znpy•3m ago
I think you're missing an important aspect: Cloudflare is in the unique position to be able to:
1. train a model on content that's actually getting visited by people (they are fairly good at cutting out bots)
2. collect the content while they serve requests, with no need to scrape the website at all. It would be completely transparent (zero cost, actually a benefit) to website owners.
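Both points amount to a pass-through handler that tees human-served responses into a corpus. A minimal sketch, assuming a bot flag is already available per request (the function and variable names here are made up for illustration):

```python
# Sketch of "collect while serving": return the origin response
# unchanged, and append a copy to a training corpus only for
# non-bot traffic, so the corpus reflects pages people actually visit.

corpus = []

def serve(path: str, origin_response: str, is_bot: bool) -> str:
    if not is_bot:
        # Tee a copy for training; the visitor sees no difference.
        corpus.append((path, origin_response))
    return origin_response  # pass the response through untouched

html = "<html><body>hello</body></html>"
assert serve("/post/1", html, is_bot=False) == html  # human: collected
assert serve("/post/1", html, is_bot=True) == html   # bot: served, not collected
print(len(corpus))  # 1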
@jgrahamc if you're reading this and you end up doing what I wrote, I want a cut of the profits (or a job offer) :P