Making your traffic indistinguishable from normal internet traffic has long been a foundational part of effective censorship circumvention, a technique sometimes referred to as “collateral freedom.” If censors can’t differentiate your traffic from other traffic they don’t want to block, then they can’t block you without incurring collateral damage, much of it economic.
It turns out this is fundamentally the same challenge AI agents face when trying to crawl webpages. The tools they use are identifiable by various means, whether it’s the distinctive TLS handshakes of Python, Go, or Node HTTP clients, or details in the page processing that make it clear the requester is not a real browser driven by a human.
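TLS fingerprinting happens below the HTTP layer, but the simplest tell is even more basic: stock HTTP clients announce themselves in their default headers. As a minimal illustration (using Python's stdlib, not anything Wick-specific), here's what Python's urllib puts on the wire by default:

```python
import urllib.request

# Every opener built from the stdlib carries a default User-Agent
# of the form "Python-urllib/3.x" -- an instant giveaway to any
# server that cares to look, before TLS fingerprinting even enters.
opener = urllib.request.build_opener()
print(opener.addheaders)  # [('User-agent', 'Python-urllib/3.x')]
```

Overriding the User-Agent is trivial, which is exactly why detection has moved down to TLS handshake fingerprints that are much harder to fake without a real browser network stack.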
To get around this, Wick uses Cronet (Chrome's network stack as a standalone lib) as the default for all requests. It’s not a headless browser – just the network layer. Wick works as an MCP server so Claude Code and Cursor pick it up automatically and also supports local API access. I also just got it on Apify's store if you want to try it without installing anything — it works from their datacenter IPs, which surprised me honestly. Importantly, it’s local by default. That way, if you’re running from your house, it can automatically take advantage of your residential IP address, which is less likely to get flagged.
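For the MCP side, clients like Claude Code typically pick up servers from a JSON config. A hypothetical entry might look like the following (the `command` and `args` here are assumptions for illustration, not Wick's documented invocation — check the repo for the actual setup):

```json
{
  "mcpServers": {
    "wick": {
      "command": "wick",
      "args": ["mcp"]
    }
  }
}
```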
The Cronet version is lightning fast and effective, but it doesn’t do JS rendering. If you need that (a requirement for accessing some sites), Wick Pro handles it, with an approach on that side that’s very different from other tools out there.
Open source, MIT. Pro for $20/mo if you need JS rendering.
Code: https://github.com/wickproject/wick
Site: https://getwick.dev
adamfisk•10m ago
Wick is also local-first. It's designed for local agents crawling successfully rather than larger-scale crawls, simply because we haven't built out the infra for the latter.