Algorithms tend to optimize us toward well-being as “well-done”: predictable, consistent, uniformly cooked. Safe, measurable, repeatable.
But human experience is closer to “rare”: uneven, risky, asymmetric, and still alive. The parts that matter most are often the ones that don’t fit cleanly into metrics.
If everything becomes optimized, nothing remains interesting. And more importantly, we risk replacing well-being with the monitoring of well-being.
When a life is constantly optimized, scored, nudged, and corrected, it gradually stops being a life that is actually experienced.
Nah. We need to move back to the real world being the destination instead of the screen. If the technology isn't augmenting your life in meatspace, it's slowly robbing you of your somatic experience and turning you into something more machine-like. It doesn't matter whether the technology is the open web or something proprietary; the effect is the same.
Algorithms to Live By: The Computer Science of Human Decisions
Book by Brian Christian and Tom Griffiths
shrewdcomputer•1h ago
This is a nice thought, but I think it's wrong. If TikTok, Instagram Reels, or YouTube Shorts have proven anything, it's that people don't want to decide; they want to consume. It's cynical, but it's what the data has shown works for these platforms time and again. Passive consumption is easier for the user, and companies know it keeps us online longer.
When you ask people, they will say they want to see the people they follow, but their behaviour, incentivised by the companies, says otherwise.
raincole•27m ago
I want the algorithm to analyze spammers' behavior and filter them out for everyone, not to analyze my behavior to filter content for me.
A4ET8a8uTh0_v2•49m ago
But is it cynical if it's accurate?