Hi HN,

I spent years at Toyota Group trying to digitize the "intuition" of master craftsmen (shokunin). We often hit a wall where a master could detect a defect but couldn't explain why.

In one case (detailed in the post), a master inspected parts by striking them with a hammer. Frequency analysis showed no difference between "Good" and "Bad" parts, and the engineers thought he was guessing. But by applying a framework to extract his "Thinking Process" ($T$) rather than just his "Action" ($A$), we found out he wasn't listening to the pitch; he was listening to the reverberation time (the decay rate). Once we changed the feature extraction to focus on the "tail" of the sound, the AI matched his accuracy.

I believe we are too focused on training AI on "Action Data" (what experts did) and are missing the "Thinking Data" (what risks they simulated).

Has anyone else successfully digitized a "gut feeling" in a physical domain? I'd love to hear about it.
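To make the hammer example concrete, here is a minimal synthetic sketch of why a spectrum comparison can fail while a decay-rate feature succeeds. Everything below is illustrative and hypothetical (the signals are generated, and the simple log-envelope fit is my own assumption, not the author's actual pipeline): two "taps" share the same dominant frequency, so their spectra peak in the same place, but their reverberation times differ.

```python
# Illustrative sketch (NOT the original pipeline): two synthetic hammer
# taps with identical pitch but different decay rates, showing that a
# spectral-peak feature cannot separate them while a decay-rate feature can.
import numpy as np

FS = 44_100                      # sample rate in Hz (assumed)
t = np.arange(0, 0.5, 1 / FS)    # 0.5 s of audio

def tap(decay: float, freq: float = 1000.0) -> np.ndarray:
    """Synthetic tap: a sinusoid with exponential amplitude decay."""
    return np.exp(-decay * t) * np.sin(2 * np.pi * freq * t)

def dominant_freq(x: np.ndarray) -> float:
    """The 'pitch' feature: location of the magnitude-spectrum peak."""
    spec = np.abs(np.fft.rfft(x))
    return float(np.fft.rfftfreq(len(x), 1 / FS)[np.argmax(spec)])

def decay_rate(x: np.ndarray, win: int = 441) -> float:
    """The 'tail' feature: slope of log-envelope over time.

    Envelope is approximated by block-wise peak amplitude (win samples
    per block, ~10 ms here), then fit with a straight line in log space.
    """
    n = len(x) // win
    env = np.abs(x[: n * win]).reshape(n, win).max(axis=1)
    tb = (np.arange(n) + 0.5) * win / FS   # block center times
    slope, _ = np.polyfit(tb, np.log(env + 1e-12), 1)
    return -slope

good = tap(decay=8.0)    # long ring  (hypothetical healthy part)
bad = tap(decay=40.0)    # quick damp (hypothetical internal defect)

print(dominant_freq(good), dominant_freq(bad))  # near-identical pitch
print(decay_rate(good), decay_rate(bad))        # clearly different tails
```

Running this, both taps peak at the same spectral bin, so "listen to the frequency" gives no signal at all; the log-envelope slope, by contrast, separates the two cleanly, which is the "tail of the sound" feature described above.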
youki_t•1h ago
This is interesting, but I'm curious about the 'hammer' example. You mentioned frequency analysis failed. Did the 'Thinking' interview directly point you to the 'decay rate' (reverberation), or was that still something the engineers had to figure out by trial and error? It sounds like simple feature engineering, but finding which feature to engineer is the hard part.
yusukekaizen•1h ago