There are about 80k active contracts across both platforms right now. Sports, weather, crypto, politics. Tonight's NBA games, BTC up or down in 5 mins. You name it. We trade prediction markets every day and needed a smarter way to search across all of them.
We started by ingesting everything into Postgres and writing SQL to get insights. Then we plugged in Claude to query the DB for us. Type a question, get structured results. That actually worked and felt like what search should be. But the answers were only as good as the data, and the data was a mess.
The core problem: both platforms structure their data completely differently. One NBA game on Kalshi is dozens of separate contracts, each with its own ticker. Polymarket has the same game as a handful of contracts with different naming. One says "Cleveland," the other says "Cavaliers." You can't search across any of it without cleaning it up first.
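To make that concrete, here's a toy sketch of the kind of alias normalization this requires. The alias table and function names are made up for illustration; the real mapping covers far more entities and fuzzier matching.

```python
# Illustrative only: collapse platform-specific team labels to one
# canonical name so "Cleveland" (Kalshi) and "Cavaliers" (Polymarket)
# resolve to the same entity.
ALIASES = {
    "cleveland": "cleveland cavaliers",
    "cavaliers": "cleveland cavaliers",
    "cavs": "cleveland cavaliers",
}

def canonical_team(raw: str) -> str:
    """Map a raw team label to a canonical name, if known."""
    key = raw.strip().lower()
    return ALIASES.get(key, key)

# Both platform spellings collapse to the same entity:
# canonical_team("Cleveland") == canonical_team("Cavaliers")  -> True
```

Once every contract's entities are canonicalized, a single query can match markets from both platforms.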
So we built a pipeline to clean the data and classify it properly. Every market goes through it. Structured parsing handles the predictable stuff; an LLM handles the free-form titles and descriptions the rules can't. New markets get picked up and classified within minutes. Not glamorous work, but it's what makes the search return the right results.
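The rules-first, LLM-fallback split might look something like this. A hypothetical sketch: the patterns, categories, and `classify_with_llm` placeholder are invented for illustration, not our actual pipeline code.

```python
import re

# Cheap structured rules run first (patterns here are illustrative).
NBA_PATTERN = re.compile(r"\b(?:NBA|vs\.?)\b", re.IGNORECASE)
CRYPTO_PATTERN = re.compile(r"\b(?:BTC|bitcoin|ETH)\b", re.IGNORECASE)

def classify(title: str) -> str:
    """Try rules first; fall back to an LLM for free-form titles."""
    if NBA_PATTERN.search(title):
        return "sports/nba"
    if CRYPTO_PATTERN.search(title):
        return "crypto"
    # Titles the rules can't handle go to the model instead.
    return classify_with_llm(title)

def classify_with_llm(title: str) -> str:
    # Placeholder for the actual LLM call; returns a catch-all here.
    return "other"
```

The point of the split is cost and latency: rules cover the bulk of markets for free, and the LLM only sees the long tail.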
Some things you can try:
"NBA tonight" — games from both platforms resolving today
"Zelensky markets on Polymarket" — filtered to one platform
"Weather in Chicago today" — Kalshi has an entire weather derivatives market
"Kalshi trending" — sorted by volume
"pokemon" — you'd be surprised what people bet on
If you want to use it programmatically, there's a REST API (GET /api/search?q=...) and an MCP server at attena-api.fly.dev/mcp so AI agents can search prediction markets as a native tool. Docs at attena.xyz/docs/search.
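For the REST side, a minimal Python sketch of calling the search endpoint. The base URL is an assumption (the docs only show the path), and the response shape isn't documented here, so the fetch is left generic.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

BASE = "https://www.attena.xyz"  # assumed host for the documented endpoint

def search_url(query: str) -> str:
    """Build the documented GET /api/search?q=... URL."""
    return f"{BASE}/api/search?{urlencode({'q': query})}"

# e.g. fetch results (uncomment to actually hit the API):
# with urlopen(search_url("NBA tonight")) as resp:
#     results = json.load(resp)
```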
Not everything works perfectly yet. But we built this for ourselves and figured others might find it useful too.
Try it here: https://www.attena.xyz/