Anecdote, but it does seem like a lot of younger folks I speak with are exhausted by the dark patterns and dopamine extraction that top-k social media platforms create.
If agents/AI/bots inadvertently destroy the current incarnation of social media through noise, I think we'll be better for it.
Do you have a mechanism for this in mind, incentives-wise? I can't see this making money.
The incentives would be those which have motivated people throughout history: to create something which benefits humanity.
Next, text-only platforms are nice, but niche on the modern internet. People seem to love multimedia, which takes tons of bandwidth/CPU.
Paid services don't mean spam-free either. If it's worth it for people to pay for, it's worth it for spammers to pay to get in and spam.
Then you have all the questions about what happens if you grow: how do you deal with all the laws around the world, and how do you handle other legal issues?
Having a site/service of any size can quickly become an expensive mess.
(If we hit the stretch goal, we can upgrade to a raspberry pi!)
Said little sites may run for a bit and die, and the massive monolith remains, at least until another monolith replaces them.
(EDIT: to clarify, I don't mean to build an alternative monopoly. I mean to build alternatives that are big enough to survive as businesses and big enough to be useful; a few million users as opposed to the few billion Facebook and YouTube (allegedly) have.)
The reason it's hard to imagine such a thing today is that the tech giants have illegally suppressed competition for so long. If Google or Meta were ordered to break up, and Facebook/YouTube forced to try to survive as standalone businesses, all the weaknesses in their products would manifest as actual market consequences, creating opportunity for competitors to win market share. Anybody with basic coding skills or money to invest would be tripping over themselves to build competing products that actually focus on the things people want or need, because consumers would be able to choose the ones they like.
We've tied our incentives to a structure which is not in alignment with continued survival. The real question is how can we incentivize ourselves to continue to exist?
The "the incentive structure says we should all destroy our brains" thing is just a small aspect of that.
Getting back to community is key.
This sounds like the original internet.
Before adtech took over.
They are going to be (and AI slop already is) so much worse. Once they get ads to work well / seem natural, the dark patterns will pop right back up and the money spigot will keep flowing upward.
Given that this is a case about addiction, that feels like a shockingly bad thing to say in defense of your product. Can you imagine saying the same thing about oxycodone or cigarettes?
[0] https://www.npr.org/2026/03/25/nx-s1-5746125/meta-youtube-so...
I also hope the reasons are obvious.
Personally, I am leery of any technical definition of “addictive” that operates outside the traditional chemical influences on physiology. So I would not describe gambling in that sense.
One might have a malady that causes gambling to take on the same physiological vibe for you, but that’s not what it means for gambling itself to be addictive.
If that is the (heavily simplified) case, is there a distinction for you between a chemically-induced dopamine release from smoking and, say, a button you can press that magically releases dopamine in your brain?
Not careful enough, apparently: nicotine isn't that addictive on its own; tobacco is.
That is a very strong claim to make when the current scientific consensus strongly disagrees.
For example see the glossary in https://en.wikipedia.org/wiki/Substance_dependence
What about the "infinite" broadcasts found on all television channels?
This is ridiculous and pathetic.
Maybe the social media companies could do more to combat all these. They certainly have a level of profit compared to what they provide to the average person that makes people squirm.
But does anyone believe for a second that YouTube is responsible for a person's internet / video watching addiction? It's like saying cable television is responsible for people who binge watch TV.
It's hard to square this circle while sports gambling apps and Polymarket / Kalshi are tearing through the landscape right now with no real pushback.
Yes? Is there an algorithm or not?
Broadly speaking, Section 230 differentiates between publishers and platforms. A platform is like Geocities (back in the day), where the platform provider isn't liable for the content as long as they satisfy certain requirements about having processes for taking down content when required. A bit like the Cox decision today, you're broadly not responsible for the actions of people using your service unless your service is explicitly designed for such things.
A publisher (in the Section 230 sense) is like any media outlet. The publisher is liable for their content but they can say what they want, basically. It's why publishers tend to have strict processes around not making defamatory or false statements, etc.
I believe that any site that uses an algorithmic news feed is, legally speaking, a publisher acting like a platform.
Example: let's just say that you, as Twitter, FB, IG, or YouTube, were suddenly pro-Russian in the Ukraine conflict. You change your algorithm to surface and distribute pro-Russian content and suppress pro-Ukrainian content. Or you're pro-Ukrainian and you do the reverse.
How is this different from being a publisher? IMHO it isn't. You've designed your algorithm knowingly to produce a certain result.
I believe that all these platforms will end up being treated like publishers for this reason.
So, with today's ruling about platforms creating addiction, (IMHO) it's no different to surfacing content. You are choosing content to produce a certain outcome. Intentionally getting someone addicted is functionally no different to changing their views on something.
I actually blame Google for all this because they very successfully sold the idea that "the algorithm" ranks search results like it's some neutral black box but every behavior by an algorithm represents a choice made by humans who created that algorithm.
Edit to include: I mean, this is coming the same day as the Supreme Court throwing out the piracy case against Cox Communications 9-0. Remember that this case originated with a $1 billion jury verdict against them! It was reversed by an appeals court 5 years later and completely invalidated today. Juries should not handle complex civil litigation, I'm sorry.
ChrisArchitect•1h ago
Jury finds Meta liable in case over child sexual exploitation on its platforms
https://news.ycombinator.com/item?id=47509984
aprilthird2021•48m ago
Even if they do what you're saying, lots of people who've used any Meta property in the last 15 years have a potentially viable case, and no future work can swat those away.