I built an MCP server that lets Claude (or any MCP client) orchestrate structured debates between multiple LLMs.
You give it a topic; the server sends it to all configured models in parallel, each model sees the others' responses and refines its position over multiple rounds, and a synthesizer produces a final consolidated output.
Works with OpenAI, DeepSeek, Groq, Ollama, Mistral, Together — anything with an OpenAI-compatible API.
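The round loop can be sketched roughly like this (a hypothetical illustration, not the server's actual code — `runDebate` and the `Model` shape are made up here, and each model is reduced to a prompt-to-text function standing in for an OpenAI-compatible API call):

```typescript
// Each "model" is an async prompt -> text function; in the real server
// these would be chat-completion calls to OpenAI-compatible endpoints.
type Model = { name: string; ask: (prompt: string) => Promise<string> };

async function runDebate(
  topic: string,
  models: Model[],
  synthesizer: Model,
  rounds: number,
): Promise<string> {
  // Round 1: every model answers the topic independently, in parallel.
  let positions = await Promise.all(
    models.map((m) => m.ask(`Topic: ${topic}\nGive your position.`)),
  );

  // Later rounds: each model sees the others' latest positions and refines.
  for (let r = 1; r < rounds; r++) {
    positions = await Promise.all(
      models.map((m, i) => {
        const others = positions
          .filter((_, j) => j !== i)
          .map((p, j) => `Peer ${j + 1}: ${p}`)
          .join("\n");
        return m.ask(
          `Topic: ${topic}\nOther positions:\n${others}\nRefine your position.`,
        );
      }),
    );
  }

  // Final pass: the synthesizer consolidates all positions into one answer.
  return synthesizer.ask(
    `Topic: ${topic}\nPositions:\n${positions.join("\n---\n")}\nSynthesize a final answer.`,
  );
}
```

Running all models per round via `Promise.all` keeps each round's latency close to the slowest single model rather than the sum of all of them.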
npx brainstorm-mcp