Give it a topic and it:

- generates diverse search queries,
- searches the web (Brave Search API, free tier),
- fetches and reads relevant sources,
- analyzes each source for key findings, and
- synthesizes a structured markdown report with citations.
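To make the flow concrete, here is a minimal sketch of that loop in C#. Every name in it (ResearchAgent, GenerateQueriesAsync, and so on) is an illustrative placeholder, not the project's actual API.

```csharp
// Minimal sketch of the research loop. All names here are illustrative
// placeholders, not the project's real types.
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

public sealed record SearchResult(string Title, string Url);

public sealed class ResearchAgent
{
    public async Task<string> RunAsync(string topic)
    {
        // 1. Ask the local model for a handful of diverse search queries.
        List<string> queries = await GenerateQueriesAsync(topic);

        var findings = new List<string>();
        foreach (string query in queries)
        {
            // 2. Search the web via the Brave Search API.
            foreach (SearchResult result in await SearchAsync(query))
            {
                // 3. Fetch the page and extract readable text.
                string text = await FetchAndExtractAsync(result.Url);

                // 4. Have the model pull key findings out of this source.
                findings.Add(await AnalyzeSourceAsync(topic, result.Url, text));
            }
        }

        // 5. Synthesize a structured markdown report with citations.
        return await SynthesizeReportAsync(topic, findings);
    }

    // Each stage is either a prompt to llama3.1:8b or an HTTP call;
    // bodies omitted to keep the sketch short.
    private Task<List<string>> GenerateQueriesAsync(string topic) => throw new NotImplementedException();
    private Task<List<SearchResult>> SearchAsync(string query) => throw new NotImplementedException();
    private Task<string> FetchAndExtractAsync(string url) => throw new NotImplementedException();
    private Task<string> AnalyzeSourceAsync(string topic, string url, string text) => throw new NotImplementedException();
    private Task<string> SynthesizeReportAsync(string topic, List<string> findings) => throw new NotImplementedException();
}
```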
Everything runs locally — no OpenAI/Anthropic API needed. Just Ollama + llama3.1:8b.
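If you haven't used Ollama programmatically: it exposes a small HTTP API on localhost. Here's roughly what one non-streaming call looks like from C#, assuming the default port 11434 and the /api/generate endpoint; the prompt is just an example.

```csharp
// One non-streaming request to the local Ollama server (default port 11434).
// Error handling and streaming are left out to keep the example short.
using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Text.Json;

var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };

var response = await http.PostAsJsonAsync("/api/generate", new
{
    model = "llama3.1:8b",
    prompt = "List five diverse web search queries about solid-state batteries.",
    stream = false
});
response.EnsureSuccessStatusCode();

// With stream = false the reply is a single JSON object; the generated
// text is in its "response" field.
using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
Console.WriteLine(doc.RootElement.GetProperty("response").GetString());
```

Ollama also has a /api/chat endpoint for multi-turn message lists; the shape of the call is the same idea.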
It takes about 15 minutes per research run on a mid-range CPU (Ryzen 5 5500, no GPU needed). Not fast, but it does the research while you do other things.
Tech stack: C#/.NET 8, Ollama, SQLite for semantic memory, Brave Search API.
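A note on the "SQLite for semantic memory" part: you don't need a dedicated vector database at this scale. One plausible version, which is my own sketch and not the project's actual schema (table and column names are made up), stores embedding vectors as BLOBs and does brute-force cosine similarity in C#, using the Microsoft.Data.Sqlite package:

```csharp
// Hypothetical sketch of "SQLite for semantic memory": embeddings stored as
// float BLOBs, nearest match found by brute-force cosine similarity.
// Assumes a table like: CREATE TABLE memory (text TEXT, embedding BLOB).
// Requires the Microsoft.Data.Sqlite NuGet package.
using System;
using Microsoft.Data.Sqlite;

public static class SemanticMemory
{
    public static void Save(SqliteConnection db, string text, float[] embedding)
    {
        // Pack the float vector into a byte array for the BLOB column.
        var bytes = new byte[embedding.Length * sizeof(float)];
        Buffer.BlockCopy(embedding, 0, bytes, 0, bytes.Length);

        var cmd = db.CreateCommand();
        cmd.CommandText = "INSERT INTO memory (text, embedding) VALUES ($t, $e)";
        cmd.Parameters.AddWithValue("$t", text);
        cmd.Parameters.AddWithValue("$e", bytes);
        cmd.ExecuteNonQuery();
    }

    public static string? MostSimilar(SqliteConnection db, float[] query)
    {
        string? best = null;
        double bestScore = double.MinValue;

        var cmd = db.CreateCommand();
        cmd.CommandText = "SELECT text, embedding FROM memory";
        using var reader = cmd.ExecuteReader();
        while (reader.Read())
        {
            // Unpack the stored BLOB back into a float vector.
            var bytes = (byte[])reader["embedding"];
            var stored = new float[bytes.Length / sizeof(float)];
            Buffer.BlockCopy(bytes, 0, stored, 0, bytes.Length);

            double score = Cosine(query, stored);
            if (score > bestScore) { bestScore = score; best = reader.GetString(0); }
        }
        return best;
    }

    // Plain cosine similarity over the two vectors.
    private static double Cosine(float[] a, float[] b)
    {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.Length; i++) { dot += a[i] * b[i]; na += a[i] * a[i]; nb += b[i] * b[i]; }
        return dot / (Math.Sqrt(na) * Math.Sqrt(nb) + 1e-10);
    }
}
```

At the few-thousand-row scale a research run produces, scanning every row per lookup is still cheap, so skipping a vector index is a reasonable trade-off.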
Why C# instead of Python? Because .NET developers want AI tools too, and the .NET ecosystem is underserved compared to what Python has in LangChain/LlamaIndex.
Known limitations: CPU inference is slow (~15min/run), 8B models occasionally produce malformed tool calls (handled with retries), and research quality depends on Brave Search results for your topic.
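On the malformed tool calls: the general pattern is simple enough to show. This is a generic sketch rather than the project's code; askModelAsync is a stand-in for whatever wraps the Ollama request.

```csharp
// Generic retry-on-malformed-JSON pattern for small-model tool calls.
// askModelAsync is a stand-in for the wrapper around the Ollama request.
using System;
using System.Text.Json;
using System.Threading.Tasks;

public static class ToolCalls
{
    public static async Task<JsonDocument> GetToolCallAsync(
        Func<string, Task<string>> askModelAsync, string prompt, int maxAttempts = 3)
    {
        string currentPrompt = prompt;
        for (int attempt = 1; attempt <= maxAttempts; attempt++)
        {
            string raw = await askModelAsync(currentPrompt);
            try
            {
                // Well-formed JSON: hand it back to the caller.
                return JsonDocument.Parse(raw);
            }
            catch (JsonException)
            {
                // The 8B model produced something that isn't valid JSON:
                // restate the format requirement and try again.
                currentPrompt = prompt +
                    "\n\nYour previous reply was not valid JSON. " +
                    "Respond with a single JSON object and nothing else.";
            }
        }
        throw new InvalidOperationException($"No valid tool call after {maxAttempts} attempts.");
    }
}
```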
I also made a starter kit if you want to build your own agent from scratch: https://github.com/DynamicCSharp/agentkit