I'm exploring a concept to fix the disconnect between "Job Descriptions" and "Actual Work."
The Problem: Companies usually hire with a specific intent (e.g., "We need someone to fix our legacy payment sync issues"), but by the time that intent reaches the JD, it has become a generic soup of keywords: "Must have 5+ years Java, Microservices..."
The Hypothesis: Hiring is about solving problems. So why don't we use the problem itself as the JD?
The Proposed Workflow:
Input = The Problem Context (Public or Private): Instead of writing a JD, the hiring manager selects a set of actual tasks/issues from their issue tracker (GitHub, Jira, etc.).
Example (Open Source): "Here is Issue #123 (Memory Leak). Find me someone who has solved problems of similar complexity."
Example (Enterprise): "Here is a resolved task regarding high-concurrency database locking. We need another person who can handle this level of work."
Vision: This could be done via an API or an MCP (Model Context Protocol) server that securely fetches the task context: code diffs, discussions, and complexity metrics.
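To make the Vision concrete, here is a rough sketch of what that context fetch could look like for the public GitHub case. The endpoints are GitHub's public REST API; the ProblemContext shape is just an assumption for my prototype, not a settled schema.

```python
# Minimal sketch: pull the "problem context" for a public GitHub issue.
# The REST endpoints are real; ProblemContext is a made-up prototype shape.
from dataclasses import dataclass, field

import requests

API = "https://api.github.com"


@dataclass
class ProblemContext:
    title: str
    body: str
    labels: list[str]
    discussion: list[str] = field(default_factory=list)


def fetch_issue_context(owner: str, repo: str, number: int,
                        token: str | None = None) -> ProblemContext:
    headers = {"Accept": "application/vnd.github+json"}
    if token:
        headers["Authorization"] = f"Bearer {token}"

    issue = requests.get(f"{API}/repos/{owner}/{repo}/issues/{number}",
                         headers=headers).json()
    comments = requests.get(f"{API}/repos/{owner}/{repo}/issues/{number}/comments",
                            headers=headers).json()

    return ProblemContext(
        title=issue["title"],
        body=issue.get("body") or "",
        labels=[label["name"] for label in issue.get("labels", [])],
        discussion=[c.get("body") or "" for c in comments],
    )
```

An MCP server would essentially wrap the same fetch (plus diffs and complexity metrics) and hand the result to the matching step.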
Result: "Candidate X is a 90% match because they recently refactored a similar async queue system in Rust, which aligns with the complexity of your input issue." The Privacy Challenge (The Elephant in the Room): For this to work in Enterprise (Private Repos), privacy is paramount.
The Privacy Challenge (The Elephant in the Room): For this to work in Enterprise (Private Repos), privacy is paramount. I'm thinking about a localized extraction layer where the "Problem Context" is sanitized (removing PII/secrets) before it's ever sent to an LLM for analysis. The goal is to extract the pattern of the problem, not the proprietary code itself.
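As a first pass, that sanitization could be a purely local, pattern-based scrub that runs before anything leaves the machine. The patterns below are illustrative only; a real version would lean on a proper secret scanner and probably a human review step.

```python
# Crude sketch of the local sanitization pass: strip obvious secrets and PII
# from the problem context so only the *pattern* of the problem reaches the LLM.
# These regexes are illustrative, not an exhaustive or production-grade list.
import re

REDACTIONS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "<EMAIL>"),            # email addresses
    (re.compile(r"gh[pousr]_[A-Za-z0-9]{20,}"), "<GITHUB_TOKEN>"),  # GitHub tokens
    (re.compile(r"AKIA[0-9A-Z]{16}"), "<AWS_KEY>"),                 # AWS access key IDs
    (re.compile(r"(?i)(password|secret|api[_-]?key)\s*[:=]\s*\S+"), r"\1=<REDACTED>"),
]


def sanitize(text: str) -> str:
    """Replace obvious secrets/PII with placeholders before LLM analysis."""
    for pattern, replacement in REDACTIONS:
        text = pattern.sub(replacement, text)
    return text
```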
My Question: I'm building a prototype for the public GitHub use case first. But for those of you hiring in enterprise: Would you trust a tool that parses your internal issue context (sanitized) to generate a precise candidate requirement profile?
I believe this "Task-based Hiring" is fairer than "Resume Parsing," but I'm curious where you think this model breaks down.