Most job boards scrape, paste, and forget. I built Oitii to actually analyze the data before showing it to users.
I developed a scoring system (0-100) to filter out low-effort and ghost listings.
1. Hiring Freeze Cross-Check. Before indexing a job, we check the company's current hiring health against real-time layoff data and hiring-freeze trackers. If they just laid off 20% of engineering, we warn you. (Sketch 1 below.)
2. Smart Salary Synthesis. We never show "Undisclosed." If the DB has gaps, we parse job.title for seniority keywords (e.g., "Staff" vs. "Mid-Level") and synthesize high-fidelity estimates based on current market rates. (Sketch 2 below.)
3. The "Trap" Detector. Our engine flags internal contradictions in the JD. For example, if the title says "Entry Level" but the description demands "3+ years of experience," the listing takes a massive quality penalty. (Sketch 3 below.)
4. Active Ping & Honeypots. We don't just trust the post: we submit proxy applications and pixel-track the resumes to see whether they are actually being opened. If the view rate stays at 0% over two weeks, the job is marked dead. (Sketch 4 below.)
5. The "Growth Signal" Audit (Cross-Platform Fingerprinting). We cross-reference the listing against the company's direct career page and historical aggregator data to catch "investor fluff."
The Logic: We identify jobs that are reposted on aggregators (to make the company look like it's growing, for VCs' benefit) but have been removed from, or never existed on, the company's main ATS.
Zombie Detection: If a role has a high repost velocity (e.g., refreshed every 10 days) but no interview movement, it's flagged as a marketing asset, not a real opening. (Sketch 5 below.)
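To make the signals concrete, here are rough Python sketches of each one. Everything below is illustrative: the thresholds, bands, and data sources are placeholders I picked for the example, not what runs in production. Sketch 1, the hiring-freeze cross-check, assuming a hypothetical LAYOFF_EVENTS feed standing in for whatever layoff tracker feeds the system:

```python
from datetime import date, timedelta

# Hypothetical feed of recent layoff events: company -> (date, fraction cut).
# Stand-in data; a real pipeline would pull this from a layoff tracker.
LAYOFF_EVENTS = {
    "exampleco": (date(2025, 1, 15), 0.20),
}

def hiring_freeze_warning(company: str, lookback_days: int = 180) -> str | None:
    """Return a warning string if the company had a large recent layoff."""
    event = LAYOFF_EVENTS.get(company.lower())
    if event is None:
        return None
    event_date, fraction_cut = event
    recent = (date.today() - event_date) <= timedelta(days=lookback_days)
    if recent and fraction_cut >= 0.10:  # placeholder threshold
        return f"{company} cut {fraction_cut:.0%} of staff on {event_date:%Y-%m-%d}"
    return None
```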
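Sketch 2, salary synthesis: a minimal keyword pass over the title. The bands are invented numbers, not real market data:

```python
import re

# Placeholder market-rate bands (annual USD), checked most-senior first
# so a title like "Senior Staff Engineer" lands in the Staff band.
SENIORITY_BANDS = [
    (r"\bstaff\b", (180_000, 240_000)),
    (r"\bsenior\b|\bsr\.?\b", (140_000, 190_000)),
    (r"\bmid\b", (100_000, 140_000)),
    (r"\bjunior\b|\bjr\.?\b", (70_000, 100_000)),
]

def estimate_salary(title: str) -> tuple[int, int] | None:
    """Synthesize a salary band from seniority keywords in job.title."""
    for pattern, band in SENIORITY_BANDS:
        if re.search(pattern, title, re.IGNORECASE):
            return band
    return None  # no keyword hit: show nothing rather than a bad guess

# estimate_salary("Staff Backend Engineer") -> (180000, 240000)
```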
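Sketch 3, the trap detector, as a regex contradiction check; the 40-point penalty is a placeholder:

```python
import re

def trap_penalty(title: str, description: str) -> int:
    """Penalize listings whose title contradicts their own requirements."""
    entry_level = re.search(r"\bentry[\s-]*level\b", title, re.IGNORECASE)
    years = re.search(r"(\d+)\s*\+?\s*years?", description, re.IGNORECASE)
    if entry_level and years and int(years.group(1)) >= 3:
        return 40  # placeholder: a massive hit on the 0-100 scale
    return 0

# trap_penalty("Entry Level Engineer", "3+ years of experience") -> 40
```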
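Sketch 4, dead-listing detection from the proxy-application view rate:

```python
from datetime import datetime, timedelta

def is_dead_listing(sends: int, opens: int, first_send: datetime,
                    now: datetime, window_days: int = 14) -> bool:
    """True if no pixel-tracked proxy resume was opened in the window."""
    window_elapsed = (now - first_send) >= timedelta(days=window_days)
    return window_elapsed and sends > 0 and opens == 0

# Two weeks of sends, zero opens -> dead:
# is_dead_listing(5, 0, datetime(2025, 1, 1), datetime(2025, 1, 16)) -> True
```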
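Sketch 5, zombie detection from repost cadence; the three-repost minimum is my addition to avoid flagging a single refresh:

```python
from datetime import date

def is_zombie(repost_dates: list[date], interviews_observed: int,
              max_avg_gap_days: float = 10.0) -> bool:
    """High repost velocity plus zero interview movement -> zombie."""
    if interviews_observed > 0 or len(repost_dates) < 3:
        return False
    gaps = [(b - a).days for a, b in zip(repost_dates, repost_dates[1:])]
    return sum(gaps) / len(gaps) <= max_avg_gap_days

# Refreshed roughly every 10 days, no interviews -> flagged:
# is_zombie([date(2025, 1, 1), date(2025, 1, 10), date(2025, 1, 20)], 0) -> True
```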
It’s built with Python, Next.js, and Supabase.
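To anchor the feedback request, here's a minimal sketch of how the signal flags could roll up into the 0-100 score. The weights are placeholders, and they're exactly the part I'm least sure about:

```python
# Placeholder penalty weights per signal; not the production values.
WEIGHTS = {
    "layoff_warning": 25,
    "trap_detected": 20,
    "dead_listing": 35,
    "zombie_listing": 20,
}

def listing_score(flags: dict[str, bool]) -> int:
    """Start at 100 and subtract a penalty for each tripped signal."""
    score = 100 - sum(w for name, w in WEIGHTS.items() if flags.get(name))
    return max(score, 0)

# listing_score({"trap_detected": True, "zombie_listing": True}) -> 60
```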
I’d love feedback on the scoring weights.