LinkedIn and X have been exploding for the past couple of days, and on both I see predominantly two camps right now. One says AI can now build production-grade apps while you're in bed. The other says it's all hype, and that if you can't read the code you'll get nothing more than a shopping-list app or a fancy page whose input bar just passes your typing straight to ChatGPT.
Both are wrong.
I've spent the last 15 months banging on LLMs, pushing them past their coding limits, starting with GPT-4 and Sonnet 3.5 and continuing, every day and every night, as I watched them incrementally get better. And things got easier. I was able to drop parts of the processes and frameworks I had developed to keep them coherent across thousands of sessions and hours. I was able to leverage better research capabilities to teach them what they didn't know. They could remember longer as context windows grew, and they no longer required the translation dictionary we built together to map coding systems onto mechanical systems so my brain could troubleshoot for them when they hit an immovable wall.
What I built:
An automated content platform with a custom research engine. It runs a 15-phase pipeline built from primitives, not a research tooling API. It reads hundreds of pages and PDFs, follows citations and identifies primary sources, evaluates whether the research is good enough, iterates if it isn't, synthesizes what is true across the entire body of research even when no single source states it, then writes and refines like an editor. The output has inline hyperlinked citations to the source and a full expandable index in the completed article that shows every source it used: 42 sources, 130 findings, extracted quotes and who said them, clickable verification. Every claim is verifiable.
Built on journalism principles. It writes like journalism, not AI slop.
The workflow is frictionless: pick a topic, pick a style, generate ideas, click generate article, one-click publish to WordPress or Ghost.
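The core loop described above (gather research, evaluate whether it is sufficient, iterate if not, then move on to synthesis) can be sketched roughly like this. To be clear, this is a hypothetical sketch of the pattern, not the author's code: every name here (`Research`, `gather`, `good_enough`, `run_pipeline`) is illustrative, and the real sufficiency check would be an LLM evaluation rather than a simple count.

```python
from dataclasses import dataclass, field

@dataclass
class Research:
    """Accumulated state across pipeline rounds."""
    sources: list = field(default_factory=list)
    findings: list = field(default_factory=list)

def gather(research: Research, round_num: int) -> Research:
    # Stand-in for reading pages/PDFs and following citations
    # to primary sources.
    research.sources.append(f"source-{round_num}")
    research.findings.append(f"finding-{round_num}")
    return research

def good_enough(research: Research, min_sources: int = 3) -> bool:
    # Stand-in for the "is the research sufficient?" phase;
    # the real check would be a model-driven evaluation.
    return len(research.sources) >= min_sources

def run_pipeline(max_rounds: int = 10) -> Research:
    research = Research()
    for round_num in range(1, max_rounds + 1):
        research = gather(research, round_num)
        if good_enough(research):
            break  # hand off to synthesis, drafting, and editing
    return research

result = run_pipeline()
print(len(result.sources))  # 3 rounds before sufficiency is met
```

The interesting design point is the loop itself: the pipeline doesn't assume one pass of research is enough, it keeps iterating until an explicit quality gate passes.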
The friction continues to drop with each incremental GPT, Claude and Gemini release, but one thing persists without improvement over that time.
It DOES NOT THINK IN SYSTEMS. It thinks in tasks.
It is an executor. A refinement tool of your own cognition. It is a backhoe for the mind.
I can't read code. I don't know what a function is. I still can't read a line of what I shipped. But I can tell you I couldn't walk away for five minutes. Every response required evaluation and collaboration. It needed me to hold the system as a whole in my mind at all times. It required learning how to speak with a non-human intelligence, learning how it functions by feel, and then learning again each time a new model was released. It required me to think in first principles, to ask what's possible and how to find out, and to be willing to drop what didn't serve me even when I had spent hundreds of hours on it by that point.
I don't know if I felt embarrassed because I didn't see it as mine, or didn't see myself as the one who built it, or what exactly... But I realized I was thinking from my own training data. I was thinking from a frame the world will soon leave behind. No different from when we once credited the machine, not the person orchestrating it, with the electronic music.
AI will at some point build what I have on its own, while I sleep. And when it does, there will be me and others orchestrating new things, things that we can't quite imagine yet.
Those who can invent systems and think in them will not only have a place in the future, they will excel as the task-bearing load of execution continues to fall by the wayside.
I am not a developer who used AI to accelerate their work. I am a systems architect who found a way to build.
By the old definition, you're right. I can't be a systems architect.
But this product exists. I built it. 15-phase pipeline. Multi-service architecture. Production.
That's not a contradiction to resolve. That's evidence the definition is changing.
You're using a frame that assumes understanding code is required to hold a system together. I'm the proof it isn't anymore.
techblueberry•1h ago
You understand code.
adambuildstuff•1h ago
I promise you with everything I have, I do not. I had to ask AI to quiz me to find the simplest coding term I did not know. I failed at question 1: what is a function.
adambuildstuff•1h ago
https://articlefoundry.com
techblueberry•1h ago
https://en.wikipedia.org/wiki/Function_(computer_programming...
Similar in concept to:
https://en.wikipedia.org/wiki/Function_(mathematics)
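For anyone following the exchange, a function in the programming sense is just a named, reusable block of code that takes inputs and returns an output. A minimal example:

```python
def add(a, b):
    # A function: named inputs in, one value out.
    return a + b

print(add(2, 3))  # prints 5
```

Like its mathematical counterpart, it maps inputs to an output; unlike the mathematical kind, it can also perform actions along the way.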