Thanks for this, Daniel! I've been working on a related piece [1] that arrives at Naur from a different angle.
Your "knowledge debt" framing is valuable; it gives practitioners something concrete. And your emphasis on team practices (reviews assessing understanding, pair programming around AI suggestions) addresses a social dimension I neglected.
Where we diverge: I argue Naur's pessimism is narrower than commonly supposed. His critique targets technical documentation that describes artifacts while presupposing the theory needed to understand them. But theory-focused documentation (the WHY and WHEN, not just the WHAT and HOW) isn't subject to the same critique. The Renaissance transmitted Aristotelian theory across a thousand-year gap via preserved texts.
The key distinction: Naur's argument is epistemological, not economic. He's not saying we lack the time to document; he's saying certain knowledge cannot in principle be articulated. If the problem were labor, AI would solve it. If it's epistemological, AI can't.
But AI does change something: forced articulation. The interlocutor effect means reasoning that would otherwise have remained implicit gets spoken and preserved. Not a solution, but better odds.
I posted this recently [2] without much traction, so I'm glad your piece is reaching people. These ideas deserve discussion.
xrrocha•1mo ago
[1] https://xrrocha.github.io/solo-dev-musings/001-naur-document...
[2] https://news.ycombinator.com/item?id=46378885