They were not wrong to be skeptical. The hiring space is littered with failed companies.
But we built it anyway, out of sheer necessity.
Our small startup was getting 2,200 applicants a week for programming roles, and we needed a way to evaluate them fairly and at scale.
The core issue we saw is that most ATSs are just passive tables with a UI. A candidate applies, and then they sit there. We needed an active, event-driven system that evaluates candidates on their skills. A workflow engine that pushes the vetting process forward and recommends engineers based on their actual skill set.
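To make the "active, event-driven" idea concrete, here is a minimal, hypothetical sketch (not Hivemind's actual code; every stage and event name below is invented for illustration) of a workflow engine that advances candidates in response to events instead of leaving them sitting in a passive table:

```python
from dataclasses import dataclass, field

# Hypothetical stage transitions: (current_stage, event) -> next_stage.
# Each event (assessment sent, assessment graded, interview done) pushes
# the candidate forward; a plain ATS row would just sit unchanged.
TRANSITIONS = {
    ("applied", "assessment_sent"): "assessing",
    ("assessing", "assessment_passed"): "interviewing",
    ("assessing", "assessment_failed"): "rejected",
    ("interviewing", "interview_passed"): "recommended",
    ("interviewing", "interview_failed"): "rejected",
}

@dataclass
class Candidate:
    name: str
    stage: str = "applied"
    history: list = field(default_factory=list)

def handle_event(candidate: Candidate, event: str) -> Candidate:
    """Advance the candidate's stage in response to an event."""
    key = (candidate.stage, event)
    if key not in TRANSITIONS:
        raise ValueError(f"event {event!r} not valid in stage {candidate.stage!r}")
    candidate.history.append(key)
    candidate.stage = TRANSITIONS[key]
    return candidate

c = Candidate("Ada")
for ev in ("assessment_sent", "assessment_passed", "interview_passed"):
    handle_event(c, ev)
print(c.stage)  # recommended
```

A real engine would persist events, fan out notifications, and handle timeouts, but the core loop is the same: events drive transitions, and the current stage is always derivable from the event history.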
So we built Hivemind, a workflow-first ATS.
To make it useful, we built the assessment tools we always wanted: sandboxed project environments using Firecracker microVMs + Kata Containers (with a Monaco editor) to test real-world coding skills (not just LeetCode), and asynchronous video responses for conceptual questions, gradable by you or by AI.
For any step that requires human input, all tasks are forwarded to a centralized inbox so you’re not digging through workflows. And for live interviews, we built a co-pilot that records, transcribes, and summarizes the conversation, adding notes automatically to a central report card.
We built all this to solve our own hiring pain, and today it powers the system that vets ~9,000 candidates a month at our staffing company. We’re also launching it today on Product Hunt.
I know HN has strong opinions on hiring for engineering roles, and I’d be grateful for your feedback, criticism, and ideas.
Thanks for reading.