I've been following Google's A2A protocol since it launched and noticed that there is still no good way to find agents out on the public internet. They're scattered across GitHub repos, registries, cloud deployments, random subdomains, and many go offline without anyone noticing.
So I built Waggle: a search engine that crawls the web for any domain exposing a valid agent card, indexes the cards with semantic embeddings, and tracks each agent's health over time. Waggle's index is exposed in a few ways. First, there's good old search. There's also a REST API for programmatic access. But the feature I'm most excited about is Waggle's A2A-compliant meta-agent, which knows how to delegate tasks to other agents. You ask Waggle "What are the GPS coordinates of the Empire State Building?", it queries the database, finds a geocoding agent, hands off your task, and gets back to you when the other agent is done.
The ecosystem is still very young and a lot of agents my crawlers dredge up are still "hello world" demos that are neither useful nor properly implemented. But I'm happy to report that over the course of one month, I went from 8 indexed agents to over 100, with about half being online at any given time. You can try out the Waggle agent / task delegation by using the chat feature (no registration required).
Some known-good agents you can take for a spin:

- Cliff the Surveyor (geocoding, earthquake/flood analysis in the US)
- Dispute_Email_Agent (functions as advertised)
- OpSpawn AI Agent (tries to do many things; I only had luck with converting Markdown to HTML)
In any case, excited to share! Happy to answer questions.