But I feel emotionally conflicted because of how different it is from how I write.
And I have a suspicion that I need to get over it.
This is faster, produces something of value, and has a small chance of catching errors/bias in thinking. It's the same flow I use while writing code.
Writing from scratch now feels like those cooking videos where people make all the ingredients from scratch: rewarding but also maybe a tiny bit self-indulgent.
This feels like the 21st century: You bring the skills, the model brings the words.
roxolotl•8mo ago
That’s not quite what I get from this piece. I read it more as saying that the author uses the LLM to clarify their thoughts, maybe even to suggest topics, and then wordsmiths the model's output. It even ends by suggesting you write your own conclusion, which personally I think is much more sustainable than letting the model bring the words.
nilirl•8mo ago
My last comment was not specific to this post; it was a comment in general.
Composition skills are no longer a barrier to useful writing. If you have a sense of an idea and iterate enough with an LLM, you can arrive at something useful and readable.
Which is a positive.
roxolotl•8mo ago
Yea, I don’t think I entirely agree. Yes, the barrier to entry is lowered, but you still need to know what good writing looks like to produce good writing, even when aided.
jackstraw14•8mo ago
No, passing an inverse-Turing test while writing can't be the goal.
nilirl•8mo ago
The onus is on the reader to accept an argument, irrespective of source.
Writing, of any sort, can no longer be used as proof of 'written entirely by human'.
cnunciato•8mo ago
> It’s a linguistic uncanny valley, close enough to human to be recognizable, but different enough to be repulsive.
I love this. It's exactly how I feel when I read AI-generated content. A little bit sick, not really sure why.
MD87•8mo ago
I totally get how LLMs can help you write, especially in the collaborative way described. But as a reader do I actually want to read that? Maybe for documentation or something it's fine, but if you're trying to convey an opinion or make a human connection it feels a bit... cheap?
delichon•8mo ago
You can convey human emotions with the aid of a guitar to improve your singing or an LLM to improve your writing. One is a musical instrument, the other is a prose instrument. Is it cheapening the human connection to sing with the help of a guitar? That depends on your skill with the instrument.
yodon•8mo ago
Prior to reading this post, my biggest concern about LLM-assisted activities was that I never felt like I was hitting a flow state when using an LLM, regardless of how productive I was. That lack of flow left me feeling like I wasn't bringing 100% of what I could bring to the effort.
This article feels like it offers useful insights into how to get myself into a proper flow state when working with an LLM.