I’ve been building with LLMs a lot lately and recently came across a report on the growing emissions from AI workloads: not just training, but inference too. It got me thinking about how much compute we use every day, even in small apps.
So I started working on EmitMind, a platform that tracks and offsets the carbon footprint of AI tools like GPT, Claude, or any other API-based model. The idea is to integrate once, let it estimate usage and emissions, and then automatically buy verifiable offsets.
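To make the idea concrete, here's a minimal sketch of what per-request estimation could look like. This is not EmitMind's actual method or API; the energy-per-token and grid-intensity numbers are illustrative assumptions, and the function names are hypothetical.

```python
from dataclasses import dataclass

# Assumed constants -- real values vary by model, hardware, and data-center region.
WH_PER_1K_TOKENS = 0.3        # assumed inference energy per 1,000 tokens (Wh)
GRID_G_CO2E_PER_KWH = 400.0   # assumed grid carbon intensity (g CO2e per kWh)

@dataclass
class UsageRecord:
    model: str
    prompt_tokens: int
    completion_tokens: int

def estimate_g_co2e(usage: UsageRecord) -> float:
    """Rough CO2e estimate for one API call, in grams."""
    total_tokens = usage.prompt_tokens + usage.completion_tokens
    energy_kwh = (total_tokens / 1000) * WH_PER_1K_TOKENS / 1000
    return energy_kwh * GRID_G_CO2E_PER_KWH

if __name__ == "__main__":
    call = UsageRecord(model="gpt-4o", prompt_tokens=850, completion_tokens=420)
    print(f"~{estimate_g_co2e(call):.4f} g CO2e for this call")
```

In practice the per-token energy figure is the hard part (it depends heavily on model size, batching, and hardware), which is exactly where I'd love input from people who have looked at this.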
Still very early, just a waitlist up for now, but I’d love feedback on the concept, implementation ideas, and whether anyone here is already thinking about sustainability in their AI stack.
Site: https://emitmind.com