I do not want vibe-coding the law, especially criminal law. I do not want vibe-coding the tax rules. I do not want vibe-coding traffic safety.
And, in fact, we won't really be governed by AI, even if we nominally are. If we're governed by AI, we're really governed by whoever trained the AI, and/or whoever curated the training data. Do we want to be governed by them? Again, no, with expletives.
If someone assumes AI will become significantly more capable than humans at reasoning through complexity, then I can empathize with their opinion. I was previously open to this possibility, but in recent years, the better AI gets, the clearer it is to me that it's going to take a lot longer, and the super-AGI outcome is a lot harder to see.
I'm sure that by the time it could possibly be a feasible and positive option, people will be plenty ready for it... so there's no need to prepare prematurely.
TLDR: I agree with you, but without the expletives.
jfengel•1h ago
Of course it will be every bit as bad as the people who implement it. But that just kinda highlights the core problem.