Problem: I wanted to reduce DX friction when using AI to verify that AI-implemented features do exactly what the developer had in mind. And now I'm stuck...
Context: First, I thought we could formalize prompts so that they become the code. The idea: a project spec where particular words define behavior, and changing the words changes the program. Each word is a function, so pressing "Go To... (F12)" drops you into a nested behavior (hence "Fractal"). It would be a meta-spec language interpreted by an AI compiler while still being written in your native language (hence "Native").
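To make the "each word is a function" part concrete, here's a minimal TypeScript sketch of what the tooling could resolve under the hood. It's purely illustrative: `Behavior`, `goToDefinition`, and the checkout example are hypothetical and not part of the actual spec.

```typescript
// Illustrative sketch only, not the actual Fractal syntax or implementation.
// Each named behavior is addressable, so "Go To... (F12)" on a word in the
// spec can jump into the nested behavior that word expands to.

type Behavior = {
  description: string;                  // the natural-language sentence shown in the spec
  expandsTo?: Record<string, Behavior>; // nested behaviors hiding behind key words
};

// Hypothetical project spec: "checkout" is a word in the spec, and the word
// "receipt" inside it expands into its own nested behavior.
const spec: Record<string, Behavior> = {
  checkout: {
    description: "The user completes checkout and receives a receipt",
    expandsTo: {
      receipt: {
        description: "A receipt is emailed within one minute of payment",
      },
    },
  },
};

// Pressing F12 on "receipt" inside "checkout" would resolve this path.
function goToDefinition(path: string[]): Behavior | undefined {
  return path.reduce<Behavior | undefined>(
    (current, word, index) =>
      index === 0 ? spec[word] : current?.expandsTo?.[word],
    undefined,
  );
}

console.log(goToDefinition(["checkout", "receipt"])?.description);
// -> "A receipt is emailed within one minute of payment"
```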
But that whole approach rests on a wrong assumption: that developers are open to learning yet another language (less formal than TS, Java, or Go, but still formalized). I don't think that will happen anymore. The next coding language is plain English (or any other natural language).
Second, I discussed the idea with a friend, and we realized that tests are a nice way to describe behavior (with quite a few exceptions, of course). Ideation continued, and, long story short, here's the Language Spec: https://github.com/slowestmonkey/fractal/blob/main/README.md
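To show what "tests as behavior descriptors" could look like in practice, here's a small hypothetical TypeScript example of my own (not taken from the spec): the test title carries the natural-language behavior, and the body is the executable check an AI-generated implementation has to satisfy. `applyDiscount` and the discount rules are made up for illustration.

```typescript
// Hypothetical example: the test titles read as behavior descriptions,
// and the assertions pin down what "correct" means for the AI-built code.
import { test } from "node:test";
import assert from "node:assert/strict";

// A stand-in for an AI-implemented feature (invented for this sketch).
function applyDiscount(total: number, code: string): number {
  return code === "WELCOME10" ? total * 0.9 : total;
}

test("a first-time customer using WELCOME10 pays 10% less", () => {
  assert.equal(applyDiscount(100, "WELCOME10"), 90);
});

test("an unknown discount code leaves the total unchanged", () => {
  assert.equal(applyDiscount(100, "BOGUS"), 100);
});
```

(Runnable with Node's built-in test runner, e.g. `node --test` plus a TypeScript loader.)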
So why am I stuck? I've come to the realization that prompting AI and receiving correct behavior isn't enough for an engineer. I can't confidently say "it makes sense" unless I understand how and why it was built that way. I wrote that realization up here: https://www.conjectly.com/thoughts/7
And so I'm asking for help or advice on:
- if and how this idea can go further
- what other wrong assumptions I've made
Please share your thoughts, and thank you.