Things I’m curious about:
- Swift? Hearing a lot about GenAI's lack of effectiveness in the language and in the Xcode platform.
- Do you understand the codebase well enough to provide support to paying customers, or are you confident enough in the models not to care?
- How are you handling security controls, and are you delegating this to the models? Given you can (likely) capture keypresses, this is important.
- What does your process look like? Toolchain, etc.?
Also, we sorta like these types of shills, if you do it right; most of us are in the same type of boat.
1. Codex set up the skeleton Xcode project.
2. I prompted it to add a sound for every keystroke and leave placeholder names for the audio.
3. I recorded the audio of keystrokes and added it to the folder Codex told me to add it to.
4. I then asked it to review the Xcode project from the perspective of Apple's App Store policies, and then asked it to rectify all the issues it found.
5. I ensured there are no API calls or logging of keystrokes (this is arguably the main value I added during the build).
6. Submitted to the App Store. Passed review.
7. Codex for the landing page; asked for placeholder images.
8. Submitted to HN and had a beer, giving a toast to the career I once had.
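The core mechanism in steps 2 and 5 — play a local sound per keystroke while never storing or transmitting the key itself — can be sketched roughly like this. This is a hedged illustration, not the author's actual code; the class name, the `"Thock"` sound asset, and the use of `NSEvent`/`NSSound` are assumptions about how such an app might be built.

```swift
import AppKit

// Hypothetical sketch of a keystroke-sound monitor.
// Requires the Input Monitoring permission on macOS for global events.
final class KeySoundMonitor {
    private var monitor: Any?
    // "Thock" is a placeholder asset name, mirroring step 2's placeholders.
    private let sound = NSSound(named: "Thock")

    func start() {
        // Global key-down monitor; the handler receives the event read-only.
        monitor = NSEvent.addGlobalMonitorForEvents(matching: .keyDown) { [weak self] _ in
            // Deliberately ignore the event's contents: no keycode is
            // stored, logged, or sent over the network (step 5's check).
            self?.sound?.play()
        }
    }

    func stop() {
        if let monitor { NSEvent.removeMonitor(monitor) }
        monitor = nil
    }
}
```

The design point worth noting is that the handler discards the `NSEvent` entirely, so an audit for keystroke logging (step 5) reduces to confirming the event payload never leaves the closure.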
krunkworx•1h ago
With Codex (GPT 5.2), I basically automated the entire pipeline for creating a Mac app: from ideation to build, to test, to deploying, and even responding to Apple support.
Let me repeat. I didn't do anything.
Yet I created: www.thock.pro
Ridiculous.