Let's take hackernews.com as an example. The agent would check the website and generate functions like:
getTopPosts
submitPost
upvote
etc.
Then it would save and index this site's functions and serve them over a central API:
GET somedomain.com/hackernews-com/getTopPosts
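To make this concrete, here's a rough sketch of what an index entry behind that endpoint could look like. Every field name and parameter here is an assumption I'm making up for illustration, not a spec:

```python
import json

# Hypothetical manifest the central API might serve for one indexed site.
# All field names and params are invented for illustration.
manifest = {
    "site": "hackernews-com",
    "functions": [
        {"name": "getTopPosts", "method": "GET",
         "params": {"limit": "int (optional)"}},
        {"name": "submitPost", "method": "POST",
         "params": {"title": "str", "url": "str"}},
        {"name": "upvote", "method": "POST",
         "params": {"post_id": "str"}},
    ],
}

print(json.dumps(manifest, indent=2))
```

The point is that once a site is crawled once, any agent can read this manifest instead of re-discovering the UI.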
So the idea is to automatically discover and index web pages and make them ready for AI use. An agent can first check whether a website is indexed and, if so, use it without needing a browser.
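On the agent side, the call flow could be as simple as this. A minimal sketch, assuming a hypothetical central index at somedomain.com; the endpoints and response shape are invented, nothing here is a real service:

```python
import json
import urllib.parse
import urllib.request

BASE = "https://somedomain.com"  # hypothetical central index

def build_url(site: str, fn: str, **params) -> str:
    """Build the central-API URL for one discovered function."""
    query = urllib.parse.urlencode(params)
    return f"{BASE}/{site}/{fn}" + (f"?{query}" if query else "")

def call(site: str, fn: str, **params) -> dict:
    """Invoke a discovered function directly, no browser needed."""
    with urllib.request.urlopen(build_url(site, fn, **params)) as resp:
        return json.loads(resp.read())

# e.g. posts = call("hackernews-com", "getTopPosts", limit=10)
```

Write actions like submitPost would obviously need auth and POST handling on top of this, which is where most of the hard problems live.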
What do you think about it?