I built EditorWatch to help CS instructors detect AI-generated code in programming assignments.
Existing plagiarism detectors only examine the final submission, so students who copy from ChatGPT slip through easily. EditorWatch takes a different approach: it monitors HOW code is written, not just what's written.
A VS Code extension tracks coding patterns (a minimal detection sketch follows the list):

- Sudden code appearance (paste bursts)
- Lack of natural trial-and-error
- Robotic typing patterns
- Perfect-first-time code
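To give a sense of how lightweight this can be, here's a sketch of paste-burst detection using the standard VS Code extension API. The threshold value and the logging are illustrative assumptions, not EditorWatch's actual implementation:

```typescript
import * as vscode from 'vscode';

// Assumed threshold: a single edit inserting this many characters at
// once is treated as a paste burst. Not EditorWatch's real value.
const PASTE_BURST_THRESHOLD = 200;

export function activate(context: vscode.ExtensionContext) {
  const watcher = vscode.workspace.onDidChangeTextDocument((event) => {
    for (const change of event.contentChanges) {
      if (change.text.length >= PASTE_BURST_THRESHOLD) {
        // Record metadata only: when it happened and how much text appeared.
        console.log(
          `paste burst: ${change.text.length} chars at ${new Date().toISOString()}`
        );
      }
    }
  });
  context.subscriptions.push(watcher);
}

export function deactivate() {}
```

The same `onDidChangeTextDocument` stream also carries the timing data needed for the typing-rhythm and trial-and-error signals.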
For each submission, it generates an authenticity score (0-10) with supporting visualizations.
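As a rough illustration of how several signals could be folded into one 0-10 score, here's a sketch; the signal names and weights are my assumptions, not EditorWatch's actual model:

```typescript
// Each signal is normalized to 0..1, where 1 looks most "human".
interface Signals {
  incrementalEditing: number; // share of edits that are small keystrokes
  trialAndError: number;      // undo/redo and re-edit activity
  typingRhythm: number;       // variance in inter-keystroke timing
  firstTryErrors: number;     // failures seen before the final version
}

// Assumed weights for illustration only.
const WEIGHTS: Record<keyof Signals, number> = {
  incrementalEditing: 0.35,
  trialAndError: 0.25,
  typingRhythm: 0.2,
  firstTryErrors: 0.2,
};

function authenticityScore(s: Signals): number {
  const raw = (Object.keys(WEIGHTS) as (keyof Signals)[])
    .reduce((sum, k) => sum + WEIGHTS[k] * s[k], 0);
  return Math.round(raw * 100) / 10; // scale 0..1 -> 0..10, one decimal
}

// Example: a fairly organic-looking session scores around 8.3.
authenticityScore({
  incrementalEditing: 0.9,
  trialAndError: 0.8,
  typingRhythm: 0.85,
  firstTryErrors: 0.7,
});
```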
Privacy-conscious design:

- No video or screenshots, only metadata such as timestamps and character counts (sketched after this list)
- Only tracks specified file types (.py, .js, etc.)
- Students must explicitly opt in
- Data is deleted after grading
- 100% open source
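Here's what a metadata-only event record could look like under those constraints; the field names and the opt-in gate are illustrative, not EditorWatch's actual schema:

```typescript
// Illustrative metadata-only record: no code content is ever stored,
// only timestamps and character counts (plus the file path for grouping).
interface EditEvent {
  timestamp: number;     // ms since epoch
  file: string;          // path only, never file contents
  charsInserted: number;
  charsDeleted: number;
}

// Assumed allowlist; the post says file types are instructor-specified.
const TRACKED_EXTENSIONS = new Set(['.py', '.js']);

function shouldTrack(fileName: string, studentOptedIn: boolean): boolean {
  // Both conditions must hold: explicit opt-in AND a tracked file type.
  const dot = fileName.lastIndexOf('.');
  if (dot === -1) return false;
  return studentOptedIn && TRACKED_EXTENSIONS.has(fileName.slice(dot).toLowerCase());
}
```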
Free for education, paid for commercial use. Deploys easily on Railway or your own server.
I know it's not foolproof; determined students can bypass it. But it raises the bar significantly, and it's meant as one tool alongside code reviews and oral exams.
Would love feedback from educators and developers on the approach!