Unless of course you want to expose some functionality only to AIs, not humans. Then sure. But why would you want to do that?
What the F is going on? Has the world gone mad or something?
Yes, it's madness but it doesn't matter that it's mad because you can't stop it. It's a technological gold rush, with all of the mixed connotations that "gold rush" should imply.
What the F is going on? Has the world gone mad or something?
E-something
I-something
Cyber-something
Crypto-something
AI-something
This, too, will pass. Like Blackberries and car bras.

Short answer: Yes.
Although it's not the world proper, but a very loud and well-paid cohort of shills, astroturfers and spin doctors. Plus the occasional useful idiot and me-too hitchhikers, no doubt.
We are, after all, talking about some metadata here that you are more than welcome to leave off your site.
I’m not really interested in my website being AI-ready, but it’s particularly fascinating to me that they are suggesting an interface for AI agents to make payments to secure access to an API.

Generally, when I want to pay for an API, it would be really wonderful to be able to just direct an AI to set up the account and get me some credentials.
(Hint: no)
I have reduced my online presence to much less than it once was partly because I don't want to feed this machine training data that I've worked hard to make for a human audience.
Like... yeah, no shit; I didn't build it for your regex. It's not the target audience.
Plus, isn't the appeal of LLMs broadly that they can do somewhat-useful things with mostly-arbitrary input (if you ignore the risk of prompt injection)?
Seems like this belongs squarely in the fun and ever-growing collection of "Cloudflare throws vibe-slop into the world and sees what sticks".
It will hit exactly the same walls too, namely that the technical details are completely irrelevant: if adopting a standard is a net negative for a website, because it separates the site from its users, sites will obviously not adopt it.
You can lead the horse to water but you cannot make it drink, especially if the water is obvious poison.
Not that I believe this will be how the future turns out, but what if the main users of websites end up being agents? Then adopting the standard ends up being a requirement for survival instead of something negative.
Hopefully and ideally we don't end up there, because then the internet will surely suck for us humans. But I'm not so sure the whole "make platforms/websites open up for the machines" push will necessarily fail yet again because of the same issues; it could very well be different this time.
We couldn't scan this site: isitagentready.com returned 522
The site appears to be experiencing server errors. This is not an agent-readiness issue. Try scanning again later.
I've redesigned my site to have enough content so that AI knows what I have, but they have to send the user to my site to use an interactive JavaScript widget to get the final answer they need. So far so good, but I'm not sure how long that will keep working.
So:
- are you certain this "revenue" doesn't come from ads promoting scams, or do you simply not care?
- what do you think about LLMs "licensing" the content so you get royalties instead of putting these artificial obstacles?
How much CPU time an average request takes is probably the most important factor in the real world. No one running a frontier AI lab is going to honor any of the metadata described here.
403 Forbidden
error code: 1106
The site is blocking our scanner. This may be due to WAF rules, bot detection, or IP-based restrictions.
Perfect :)
This has always struck me as an example of the pinnacle of collective investment delusion that seems to exist in certain circles: the idea that you can shape the world to fit your product instead of improving the world with your product. You just have to try hard enough.
"Now, make sure your websites are rigorously structured in such a way that allows the technology to work..."
Fix: Implement the WebMCP API by calling navigator.modelContext.provideContext()
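The scanner's fix presumably refers to the proposed WebMCP browser API. A minimal, feature-detected sketch of that call; note that `navigator.modelContext` is only a proposal, and the payload shape here is my assumption, not the spec:

```javascript
// Hedged sketch of the scanner's suggested fix. `navigator.modelContext`
// is the proposed WebMCP entry point; the `context` payload shape is an
// assumption, so feature-detect before calling to stay a harmless no-op
// in runtimes that don't implement the proposal.
function registerModelContext(nav, context) {
  if (nav && nav.modelContext &&
      typeof nav.modelContext.provideContext === "function") {
    nav.modelContext.provideContext(context);
    return true; // context was handed to the agent-facing API
  }
  return false; // API absent: nothing registered
}

// In a browser you'd call it against the real `navigator`:
// registerModelContext(navigator, { description: "What this page offers to agents" });
```

The guard matters because, as the thread notes, most visitors' browsers won't implement this, so the call must degrade silently.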
But I already do that. The extension detects them: https://chromewebstore.google.com/detail/webmcp-model-contex...
Why do you have a website in the first place?
Good.