> When you ask an AI to solve a problem top-down, it starts "easy": it generates code that "just works". As the code evolves, it becomes interleaved with the process by which you arrived at the final solution. It patches logic to satisfy immediate constraints and carries the scar tissue of every "no, not like that" instruction you gave it. The code ends up representing the history of your struggle to articulate what you wanted.
stackdiver•40m ago
Agentic planning is supposed to address that short-term bias. But yes, I feel coding AIs are missing something that experienced developers bring: maybe an intuition for change, and an anticipation of it?