There are many ways AIs differ from real people, and any conclusions you can draw from them are limited at best -- we've had enough bad experiments done with real people:
https://en.wikipedia.org/wiki/Stanford_prison_experiment#Int...
For us laymen, the flaw of using AI trained on people for surveys is the human element. Humans have a unique tendency to be spontaneous, wouldn't you say?
How would a focus group research team approach this when they’re bombarded by AI solutions that want their research funds?
>unless they involve Donald Trump
A sense of shame perhaps. If you ask someone "how often do you brush your teeth" and compare it to more pragmatic testing, you see people have some sense of wanting to give the "right" answer. Even in a zero risk anonymous survey.
It's so weird to live in a time when what you just said needs to be said.
But I have a footnote of "yes" because, as you said, decision makers are just not interested in having this discussion about "focus on making fun games". So it will unfortunately affect my job in the short and even medium term. Because so much of the big money in games these days is in fact not focused on making a game, but on trying to generate either a gambling simulator, an engagement trap, or (you guessed it) AI hype. Both to try and claim you can just poof up assets, and to try and replace labor.
Knowing this, I do have long term plans to break out into my own indie route.
I would assume there is so much in the corpus based on behavior optimized for the actual existing social media we have that the behavior of the bots is not going to change. The bot isn't responding to incentives like a person would; it's mimicking the behavior it's been trained on. And if there isn't enough training data of behavior under the different inputs you're trying to test, you're not actually applying the "treatment" you think you are.
A YC company just launched doing exactly that.
There are apps like Meetup, but a lot of people just find it too awkward. Introverts especially do not want to meet just for the sake of meeting people, so they fall back on social media.
Maybe this situation is fundamentally not helped by software. All of my best friendships organically formed in real-world settings like school, work, neighborhood, etc.
This is at its core a third-places issue. I haven't had the capital to restart mine post-COVID.
We intentionally set out to create a social club/co-working space. A lot goes into it. I'm a non-theist who comes from a multi-generational group of theist church planters (like hundreds of churches, just over and over). It's a multi-factorial process with distinct transitions in space-time and community size, where each transition has to be handled so you don't alienate your community's founding members (who will be very different from later members) and still are able to grow.
People don't do it because they can't see the value while they are in the early mess of it. You have to like people to pull it off, and you have to NOT be a high-control person while still being able to operate with high control at certain developmental stages. You have to have a moral compass everyone understands and that you are consistent with; tech people like zero trust. You have to create a maximum-trust environment, which means NOT extracting value from the community but understanding that the value is intrinsic in the community.
You have to design a space to facilitate work and play. It's not hard, but you have to get everything right: the community can't have a monoculture, it must be enjoyable/uncomfortable, and you must design things so people can choose their level of engagement and grow into the community. It's easier once it has enough inertia that they understand they are building a thing with real benefits.
Even things like the flow of foot traffic within the space, narrowing chokepoints, these kinds of things all affect how people interact.
Because these third spaces are open to anyone and probably bringing people in from internet communities. What do you do when someone comes along and they're not breaking any rules, but it's clear that no one likes them? I've seen it drive entire groups away, but because the person has done nothing wrong I can't/don't want to just say "fuck off kid, no one likes your weird ass".
It isn't a single conversation or room. It isn't a single relationship cluster.
And someone is paying the insurance bill. Ultimately that person gets final say. A successful third place can have tension, but not fools (in the economic sense: a person whose decision making and choices actively destroy the group's financial opportunities and stability. Theoretically a very clumsy person might be banned.)
There's a lot to unpack, I mean social dynamics, group polity, these really start with understanding what you really want to see brought into the world and why.
These things take fluidity, nuance and effort to get off the ground. Sometimes they just get lucky too. It's hard to tell ahead of time, proximity is important, population density is important, parties are important.
Getting initial traction is the hardest part, most groups die before they get a chance to have problems, that's why tying the groups space to a stable economic engine like an anchor company or a mission agency is so critical for repeatable success. Same with a small friend group as initial anchor, they just can't all be clique people.
The one thing that's very important is to always frame the space as a place for creating and experimenting, celebrate the amateur and the pursuit of mastery, getting started with a group of taste makers instead of doers will 100% kill your group.
Getting started with hypernetworkers who don't do things will kill your group. They'll show up and they are important later on.
All of these people have a place eventually but they can't form the dynamic core. The whole goal is to eventually have a place for every type of person so they can contribute to the whole and find the space where they can be celebrated, but a group has to be larger before certain types of people can bring their core skills to bear.
Functioning, welcoming, and well-run communities are the only thing that solves this. Unfortunately, technology often makes this worse, because it creates such a convenient alternative and also creates a paradox of choice. I.e. people think "when there are 1000 meetups to check out, and this one isn't perfect, I'll just move on to the next one" when actually it's the act of commitment that makes a community good.
A third place would fix this, especially for men who need "things". You go to a bar for "thing" and if you meet some others to yell at sports with, bonus. We have fewer "things" for Gen Z, and those things happen rather infrequently in my experience. I'm not sure if a monthly Meetup is quite enough to form strong bonds.
> "The [structural] mechanism producing these problematic outcomes is really robust and hard to resolve."
I see illegal war, killing without due process, and kleptocracy. It's partly the media's fault. It's partly the people's fault for depending on advertising to subsidize free services, for gawking, for sharing without consideration, for voting in ignorance.
Social media reflects the people, who can't be "fixed" either.
If you're annoyed with all of these people on here who are lesser and more annoying than you, then stop spending so much time at the bar.
Can the bar be fixed?
“No smoking, gambling, or loose women.”
TaDAaaah!
- Widespread adoption before understanding risks - embraced globally before fully grasping the mental health, social, and political consequences, especially for young people.
- Delayed but significant harm - can lead to gradual impacts like reduced attention span, increased anxiety, depression, loneliness, and polarization.
- Corporate incentives misaligned with public health - media companies design platforms for maximum engagement, leveraging psychological triggers while downplaying or disputing the extent of harm.
- Smoking feels good but doesn't provide any useful function.
- Some social media use feels good and doesn't provide any useful function, but social media is extremely useful to cheaply keep in touch with friends and family and extremely useful for discovering and coordinating events.
Fortunately the "keep in touch" part can be done with apps that don't have so much of the "social media" part, like Telegram, Discord, and even Facebook Messenger versus the main app.
Now that smoking is gone in the US, a lot of that is dead, and for the worse. A lot of places now don't even have balconies or other outdoor sections. Every piece of area is dedicated to productive activity, whether that be shopping, working, or eating. There are very few "just sit around" places in modern businesses and workplaces. The joy of random encounter has been significantly decreased. Everything is now in passing.
It definitely explains the different types of thinking that are making up our current society, including social media. I haven't gotten to the part yet where he suggests what to do about it, but it's fascinating insight into our human behavior in this day and age.
I think this is expected. Think back to newsgroups, email lists, web forums. They were pretty much all chronological or maybe had a simple scoring or upvoting mechanism. You still had outrage, flamewars, and the guy who always had to have the last word. Social media engagement algorithms probably do amplify that but the dysfunction was always part of it.
The only thing I've seen that works to reduce this is active moderation.
To make money, social media companies need people to stay on as long as possible. That means showing people sex, violence, rage and huge amounts of copyright infringements.
There is little advantage in creating real-world consequences for bad actors. Why? Because it hurts growth.
There was a reason why the old TV networks didn't let any old twat with a camera broadcast stuff on their network: they would get huge fines if they broke decency "laws" (yes, America had/has censorship, hence why the Simpsons say "whoopee" and "snuggle").
There are few things that can cause company-ending fines for social media companies. Which means we get almost no moderation.
Until that changes, social media will be "broken"
So social media can't be fixed. Incentives are what matter.
But think about how closely network TV was regulated by a government regulator, despite the power those networks wielded (and against the incumbent radio and newspapers). We know it has happened.
The issue is that government in the USA has been dysfunctional for >30 years, not that regulation is ineffective.
Seriously though, I disagree. Social media in a profit-seeking system can work if the users are the ones who pay. The easiest way for this to work -- now that net neutrality is no longer a thing -- is bundling through users' phone bills. If Facebook et al. were bundled similarly to how Netflix, Hulu and other streaming apps are now packaged with phone plan deals, then the users would be the focus, not the advertisers. This might require that social media be legislatively required to offer true ad-free options, though.
I mean, it will never happen, but I think it's a path that resolves a lot of problems, and therefore a fun thought experiment.
Investing more in education is great, but you seem to think it should be everyone for themselves. I'd prefer systemic solutions to systemic problems.
None of these approaches offer what I want, and what I think a lot of people want, which is a social network primarily of people you know and give at least one shit about. But in reality, most of us don't have extended social networks that can provide enough content to consistently entertain us. So, even if we don't want 'outside' content (as if that was an option), we'll gravitate to it out of boredom and our feeds will gradually morph back into some version of the clusterfucks we all deal with today.
Social media has turned out to basically be this.
The behavioral options are restricted to posting news headlines, reposting news headlines, or being passive. There’s no option to create original content, and no interventions centered on discouraging reposting. Facebook has experimented[0] with limits to reposting and found such limits discouraged the spread of divisive content and misinformation.
I mostly use social media to share pictures of birds[1]. This contributes to some of the problems the source article[2] discusses. It causes fragmentation; people who don’t like bird photos won’t follow me. It leads to disparity of influence; I think I have more followers than the average Mastodon account. I sometimes even amplify conflict[3].
[0] https://www.socialmediatoday.com/news/internal-research-from...
[1] https://social.goodanser.com/@zaktakespictures/
[2] https://arxiv.org/html/2508.03385v1#S3
[3] https://social.goodanser.com/@zaktakespictures/1139481946021...
This was my initial reaction as well, before reading the interview in full. They admit that there are problems with the approach, but they seem to have designed the simulation in a very thoughtful way. There really doesn't seem to be a better approach, apart from enlisting vast numbers of people instead of using LLMs/agent systems. That has its own problems as well of course, even leaving cost and difficulty aside.
There’s no option to create original content...
While this is true, I'd say the vast majority of users don't create original content either, but still end up shaping the social media environment through the actions that they did model. Again, it's not perfect but I'm more convinced that it might be useful after reading the interview.
> the vast majority of users don't create original content
That's true now at least most of the time, but I think it's as much because of design and algorithmic decisions by the platforms to emphasize other types of content. Early Facebook in particular was mostly original content shared between people who knew each other. The biggest problem with that was it wasn't very profitable.
https://news.gallup.com/poll/467792/social-media-users-incli...
U.S. adults commonly engage with popular social media platforms but are more inclined to browse content on those websites and apps than to post their own content to them. The vast majority of those who say they use these platforms have accounts with them, but less than half who have accounts -- and even smaller proportions of all U.S. adults -- post their own content.
https://www.pewresearch.org/internet/2019/04/24/sizing-up-tw...
Most users rarely tweet, but the most prolific 10% create 80% of tweets from adult U.S. users
https://www.pewresearch.org/internet/2021/11/15/the-behavior...
The analysis also reveals another familiar pattern on social media: that a relatively small share of highly active users produce the vast majority of content.
> While this is true, I'd say the vast majority of users don't create original content either, but still end up shaping the social media environment through the actions that they did model. Again, it's not perfect but I'm more convinced that it might be useful after reading the interview.
Ok, but, this is by design. Other forms of social media, places like Mastodon etc have a far, far higher rate of people creating original content.
The fundamental problem with social media (and many other things) is humans, specifically our biological makeup and (lack of) overriding mechanisms. One could argue that pretty much everything we call 'civilised behavior' is an instance of applying a cultural override for a biological drive. Without it, we are very close to shit-flinging murderous apes.
For so many of our problems what goes wrong is that we fail to stop our biological drive from taking the wheel to the point where we consciously observe ourselves doing things we rationally / culturally know we should not be doing.
Now the production side of media/content/goods evolves very fast and does not have a similarly strong legacy biological drive holding it back, so it is very, very good (and ever improving) at exploiting the sitting duck that is our biological makeup (food engineering, game engineering etc. are very similar to social media engineering in this regard).
The only reliable defense against that is training ourselves to not give in to our biological drives when they are counterproductive. For some that might be 'disconnect completely' (i.e. take away the temptations altogether), but having a healthy approach to encountering the temptations is far more robust. I am of the opinion that labeling the social media purveyors and producers in general as evil abusers is not necessarily inaccurate, but counterproductive in that it tends to absolve individuals of their responsibility in the matter. Imagine telling a heroin addict: "you can't help it, it's those evil dealers that are keeping you hooked to the heroin".
No specific dynamics are named in the remainder of the article, so how are we supposed to know if they're "structurally embedded" in anything, let alone if we're doomed?
Think of it this way: you’re hosting a party, and an uninvited stranger kicks the door open, then starts criticizing how you make your bed. That’s about what it feels like to try to “fix” social media.
The era of sites with a phpbb forum and an irc channel was really fun for me and I miss it a lot
I made lots of friends that way in the past, close friends, and it's unlike anything I've encountered since then with social media
The main reason you might think that way is less the format and probably more the moderation, plus the lower number of people in those forums. I mean, take this forum here: it's far better than Twitter or any other social media slop. Smaller subreddits, especially with good moderation, are also far better than big subreddits.
Also on the topic of moderation, I feel like less is more. Unless it's spam, child pornography, or threats of violence, it should probably just be left alone. Reddit in particular is extremely over moderated these days. Outside of a handful of subreddits, it is impossible to post conservative views because moderators will ban you for them. Just basic stuff like "I support <Republican candidate> and here's why..." results in a ban. Even just being subscribed to /r/conservative is enough to be automatically banned from several other subreddits. While there are exceptions, most of the moderators are in actuality just a bunch of petty, censorious tyrants.
But it's not chronological, it's posting order. It's pretty common to have sub-threads with different focus and timeflow emerging in the middle of linear threads
> Another big thing for me is that it doesn't have comment karma.
Implementation detail. And linear boards can also have karma systems.
But on the other side, it supports community-driven moderation.
> It becomes an "I disagree with your opinion" button.
Yes, that is a problem. There should be more options available, maybe a limited set of emojis to communicate the impression of something, instead of bland down/upvotes. It kinda works well on Github and Discord, but has less usage by the systems there.
> Also, threaded discussions encourage people to just highjack one of the top voted threads instead of replying to a more relevant person later down the line.
Which is not bad? At least you can ignore them. Linear has no option for this; it's all the same mess.
> Also on the topic of moderation, I feel like less is more.
Depends on the topic, community, and size, and also the country. Illegal content always has to be moderated, by law. And in general, off-topic content has to be moderated.
> "I support <Republican candidate> and here's why..." results in a ban
Again, on-topic or off-topic? Each subreddit has a focus, and if a post falls outside that focus, it's justified to remove it. For example, I would not like to see discussions about PHP in a C subreddit; that's simply a waste of my time and attention.
I only want to engage with people on Twitter if I specifically add that person to a whitelist. I don’t want to be subject to an algorithm that is trying to increase “engagement”. I don’t want more engagement with Twitter, and I don’t want to see random posts which an AI thinks are going to enrage me so I stay on the site longer.
This extension? https://github.com/rxliuli/mass-block-twitter
Worth reading Jaron Lanier's Ten Arguments for Deleting Your Social Media Accounts Right Now book:
https://www.amazon.com/Arguments-Deleting-Social-Media-Accou...
I don't mind Mastodon, but I'm pretty selective in who I follow, and diversity of opinions isn't one of my criteria.
I mean, that's fine, if you think that you can consider all conceivable angles thoroughly, by yourself. I for one welcome opposing views, but I suppose if my idea of that meant "religion or conspiracy theories" I'd probably be avoiding it too.
A little off topic, but where do you get your news? I am having the hardest time finding credible news sources that aren't full of misinformation or bias to the point of being Soviet Union Pravda levels of propaganda.
The best I've been able to do is pick a few sources that are left leaning and a few that are right leaning and try to glean the truth myself using critical thinking and for particularly important topics, more in depth research independently. The problem is that this is very time consuming and exhausting.
Where are good discussions between really different viewpoints anywhere?
They were not like group chats or subreddits, the circles were just for you, it was just an easy way to determine which of your followers would see one of your posts.
This kind of interaction was common in early Facebook and Twitter too, where only your friends or followers saw what you posted, often just whitelisted ones. It was not all public all the time. Google+ just made that a bit more granular.
I suppose that these dynamics have been overtaken by messaging apps, but it's not really the same thing. It's too direct, too real-time and all messages mixed-in, I like the more async and distributed nature of posts with comments.
Granted, if you really want a diverse discussion and to talk with everyone in the world at once, indeed that's a different problem and probably fundamentally impossible to make non-toxic, people are people.
You can't accuse them of hiding their bias and contradictions.
How can a single paper using an unproven (for this type of research) technology disprove such (alleged) skepticism?
People bending over backwards to do propaganda to harvest clicks.
Site culture is what prevents mods from having to step in and sort out every little disagreement. Modern social media actively discourages site culture, and post quality becomes a race to the bottom. Sure, it's harder to onboard new users when there are social rules that need to be learnt and followed, but you retain users and have a more enjoyable experience when everyone follows a basic etiquette.
As a species we are greedy, self serving, and short sighted.
Social Media amplifies that, and we are well on our way to destroying ourselves.
Banning CFCs, making seatbelts a legal requirement, making drink driving illegal, gun control (in countries outside the USA), regulations on school canteens. These are all examples of coordination where we've solved problems further upstream so that individuals don't have to fight against their own greedy, self-serving, short-sighted nature.
We do have the ability to fix this stuff, it's just messy.
We have raped this planet into a coma and our children will have to scrape together whatever remains when we are done.
Things can change.
A big problem I see is users in good faith are unable to hold back from replying to bad faith posts, a failure to follow the old "don't feed the trolls rule".
There is also research and promotion of values going on and the thing as a whole is entertaining and can be rigged or filtered on various levels by all participants.
It’s kind of social. The general point system of karma or followers applies, and people can have a career and a feeling of accomplishment to look back on when they retire. The cosmic rule of "anything, too much, no good" applies.
It’s not really broken but this is the age of idiots and monsters, so all bets are off.
Not that you won't have problems, even here, from time to time. But it is hard to argue that things aren't kept much more civil than in other spots?
And, in general, avoiding direct capital incentives to drive any questionable behavior seems a pretty safe route?
I would think this would be a lot like public parks and such. Disallow some commercial behaviors and actually enforce rules, and you can keep some pretty nice places?
But just like a public park, if 2 million people rock up it's going to be next to impossible to police effectively.
Not really. If 5 people can moderate 1000, surely 5000 can moderate 1 million. Divide et impera, it's not a new idea.
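To make the scaling claim concrete, here's a toy sketch. The span-of-control number 200 is a hypothetical assumption of mine, just the ratio implied by "5 per 1000"; the point is only that moderator count grows linearly with community size:

```python
def moderators_needed(users: int, span: int = 200) -> int:
    """Moderators required if one moderator can cover `span` users
    (span=200 is a hypothetical figure matching the 5-per-1000 ratio)."""
    return -(-users // span)  # ceiling division

# The ratio stays linear as the community grows:
print(moderators_needed(1_000))      # 5
print(moderators_needed(1_000_000))  # 5000
```

Whether real moderation load actually scales linearly (rather than with the number of interactions) is exactly what's in dispute in this thread.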
Just keep in mind that in a free market there is supposed to be no profit. If there is, then something is wrong. In this case the companies just don't feel like moderating and following laws.
For parks, this is somewhat mitigated by the fact that people have to physically be there. That alone is a bit of a moderating factor, I would presume. With online, even 5 people can't moderate 1000 bad faith collaborators?
I don't know that we truly have a way to ensure "person is on other side of this account." And in places that are made to be interfaced from corporations, that isn't even strictly the desire.
What I was arguing was that employing enough people to moderate should just be a cost of doing business. If that would cost too much, there should simply be no business. However, the big social media businesses are right now far from going bankrupt.
When you have hundreds, thousands, or even millions of people in the same “park,” what kind of “ground rules” can we all truly agree on? It’s not like we’re gathered around the same dinner table, where a single moderator can keep things civil enough to avoid a brawl. Even then, heated arguments aren’t uncommon.
In an environment where one person’s truth can be another’s misinformation, I’m not sure moderation can ever be applied in a way that satisfies everyone involved.
I'll go further and say that I doubt there is a single set of static rules that will work for anyone.
And what would be the downside of that? :D
This is also how Mastodon works today, and it is fine.
Strip the platforms of any and all initiative. Make them dumb pipes. Stop pretending that people want to use social media for entertainment and news and celebrities. Stop trying to turn it into interactive TV. Stop forcing content from outside of my network upon me. Make the chronological feed the only option.
Social media is meant to be a place where you get updates about the lives of people you follow. You would visit several times a day, read all new updates, maybe post your own, and that's it. The rest of the time, you would do something else.
While we're at it, shall we stop storing data?
Congrats, you now have platforms no one will care about, as attention span gets sniped by competitors who want to maximize engagement and don't care about your arbitrary rules (ie, literally what happened 15 years ago).
There might be demand, but this "platform A" will be in competition with a dopamine-focused engagement "platform B" which also supports to host updates from "the lives of people you follow".
The majority of people will then have both installed but spend more time on "platform B" as it is actively engaging them.
Platform A will again end up in competition for user-attention with Platform B, as it needs money to operate their business etc.
Now if Platform A asks for a subscription fee to fund their non-engagement social media platform, how many of these users described above will pay and not simply prefer the free "platform B"?
How will such a churn affect users willing to PAY for "Platform A", if people whose "life they want to follow" have completely moved to "Platform B"?
Funny enough, as a European I could use WhatsApp as this "Platform A", as it has features to share status updates, pictures etc. as part of your profile. Everyone I know has WhatsApp; no one is using those features.
So in essence, this Platform A already exists in Europe but doesn't work as "social media" because people don't engage with it...
And why would that be a problem? Most people also spend more time sleeping than using social media, so what? Let them be. Give them a tool that they would decide how to use best to suit their lifestyle.
> Platform A will again end up in competition for user-attention with Platform B, as it needs money to operate their business etc.
It would not, because it would not be run by a commercial organization. At this point I'm convinced that it's impossible for a sane social media platform to exist unless it's decentralized and/or run by a nonprofit. As soon as one touches venture capital, enshittification becomes only a matter of time.
I'm referring to the journey that will move people away from "Platform A" again because of Platform B. That's a problem to solve because the value of the social network to an individual is largely the PEOPLE on that network.
> It would not, because it would not be run by a commercial organization.
Agree, taking "platform A" out of the profit-wheel could help. But still in a normal adoption scheme you need to make it worthwhile for a critical mass of people to use it for OTHER people to also consider it.
In the end I believe you will need another external driver to solve this (i.e. restricting for-profit social media), because that other platform "has all the people and the dopamine".
-
There is a more simple parallel situation one can observe: People trying to move from WhatsApp/FB-messenger/.. to i.e. Signal.
It's JUST about direct messaging, but anecdotally the majority of these transitions fail to complete because not all involved are actually on-board to abandon the old Messenger.
So you succeed and your close group installs Signal and starts to use it with you, but each one is also part of other groups who are still also on the legacy app. Everyone has both apps installed, but slowly communication starts to move back to the legacy app because it's the "superset" of all friend-groups.
I would care, and I imagine there are others who would too. I don’t use social media anymore (at all!) because of this. If I could have the chronological feed restored and no intrusion of other content I’d redownload immediately. There must be a market for this.
The issue of course is that your friends won't be on it, most of them won't sign up even if you beg them, and most likely none of you will be using the service anymore 6 months from now.
I dropped out of these platforms some years ago and have been very happy having the Fediverse (and HN!) as my only social media. It is just the right amount of engagement and impulse for me. I do not check my feeds compulsively, but occasionally – and the people I follow is a diverse bunch, giving me food for thought and keeping me up to date with topics and software projects.
It is still a niche place to hang out, but I'm OK with that. Now and then, friends get curious enough to join and check it out.
Every other form of mass media is regulated, for good reason.
That doesn't mean we need to one-up each other and repeatedly hyper optimize until we're making the digital equivalent of fentanyl-laced cigarettes.
Also, we can actually ban stuff. I know it's not a physical product, but that doesn't mean we just have to fucking put up with whatever and deal with it.
We'd never let a company sell fentanyl-laced cigarettes. But when it comes to tech, we all just throw up our hands and say "well we tried nothing and we're all out of ideas!"
Wait, are you talking about social media or sites that play videos?
People actually want media, social and otherwise, for exactly that.
> Stop forcing content from outside of my network upon me.
There are social media apps and platforms that don't do that. They are persistently less popular. People, by and large, do want passive discovery from outside of their network, just like they do, in aggregate, want entertainment and news and celebrities.
> Make the chronological feed the only option.
Chronological by what? Original post? Most recent edit? Most recent response? Most recent reaction?
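The ambiguity is real: even a "simple" chronological feed has to pick a sort key, and each choice produces a different feed. A minimal sketch (the `Post` fields and timestamps are hypothetical, just to show the divergence):

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    created_at: int      # original post time
    edited_at: int       # most recent edit
    last_reply_at: int   # most recent response

posts = [
    Post("a", created_at=1, edited_at=5, last_reply_at=9),
    Post("b", created_at=2, edited_at=3, last_reply_at=4),
    Post("c", created_at=3, edited_at=8, last_reply_at=6),
]

# Three "chronological" feeds, three different orders:
by_created = sorted(posts, key=lambda p: p.created_at, reverse=True)
by_edited = sorted(posts, key=lambda p: p.edited_at, reverse=True)
by_reply = sorted(posts, key=lambda p: p.last_reply_at, reverse=True)

print([p.text for p in by_created])  # ['c', 'b', 'a']
print([p.text for p in by_edited])   # ['c', 'a', 'b']
print([p.text for p in by_reply])    # ['a', 'c', 'b']
```

Same three posts, three defensible "chronological" orderings, which is the point: "just make it chronological" still leaves a design decision on the table.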
But addictions are wonderfully useful politically, so that's unlikely to happen.
The point is simple - an algorithm is a form of meta-content. It's curated and designed, not neutral. And so are its commercial, psychological, and political effects.
So far, SM companies have been allowed to use algorithms with little or no oversight. The emphasis has been on regulating privacy, not influence.
In the same way the media need to have a Fairness Doctrine restored to restore sanity and quality to journalism, algorithm providers need to be able to demonstrate an equivalent for their platforms.
This is very much against the spirit of the times, but that spirit is algorithmically created - which just proves the point.
If you're thinking "Yes, but government..." - how do you know that's a spontaneous original thought, and not something you've been deliberately conditioned to believe?
I think that's incorrect. Many addicts despise their addiction. A better way to look at it is: people can get addicted easily. Nobody gets addicted to paper press lubricant. The addiction is initiated by a positive experience, which often means: pleasure. Paper press lubricant doesn't provide that, but alcohol and facebook do.
It may not be much of a distinction, but sometimes it helps to think about/see it in another way.
The human mind is more easily exploitable than any computer.
Of course people want to see divisive content. Divisive content is engineered to trigger strong emotions, which humans naturally respond to. That's why heroin exists - it feels good. Really, really good. And it's not only good feelings; bad feelings can shape behavior too.
The problem comes in when we start shifting from making products that simply cause these behavior changes to instead making products engineered to elicit the biggest response possible.
It's very easy to do, too. Want to sell more cigarettes? Add butane to the wrapping so it burns better. Concentrate the nicotine. Do all the obvious stuff. The worse you make it, the better it sells.
Same thing with, say, Facebook. Want people to use Facebook more? More doomerism, more lies, more misinformation, more Jesus Christ, more clickbait. People want the addictive stuff - that's why we call it addictive.
So yes, social media apps without these tools are unpopular and generally fail. In the same way a nicotine-free cigarette would fail.
I think that's a bit murky. Facebook became popular when it was like that. They changed how it worked after it was already popular, seeking to make more money.
Facebook could provide you options to only see friends' content. People have certainly asked for it. They absolutely refuse to.
Algorithms only make them worse; they didn't create the flaws. The flaws are there because of people.
> Stop pretending that people want to use social media for entertainment and news and celebrities.
No, people consume this content because they want it, not because it's there.
> Social media is meant to be a place where you get updates about the lives of people you follow.
Strange claim. Nothing is preventing people from doing exactly this.
I try my best to do this but it's futile.
See, even if you don't use the algorithm, the algorithm still uses you. So, for example, you can't just discuss something on Twitter with your followers. Sometimes it would decide to show your tweet to a bunch of random people with the radical version of the views opposite to yours, and you will be forced to use the block button way too damn much. You can't opt out of having your tweets recommended to strangers.
Even when that doesn't happen, many of your followers would still miss your posts if you don't appease the algorithm because they would still use the algorithmic feed — whether knowingly or because of all the dark patterns around that setting (it's very hidden, never sticks, or both).
So no, the very existence of the algorithmic feed is the problem, it ruins the experience for everyone regardless of whether they use it or not.
That is a problem with the platform. Nothing is stopping you from changing platforms. You want it to be different, but don't want to use the alternative that already exists?
> Even when that doesn't happen, many of your followers would still miss your posts if you don't appease the algorithm because they would still use the algorithmic feed
That's still their own "choice". They decide to use Twitter, and they decide to stay on the algorithmic view. But ok, to be fair, Twitter, and other big networks, are kinda unstable on which view they offer and use by default. Forcing them to give more stable control to the user would be good. But I doubt it would fix anything on the grand scale.
> So no, the very existence of the algorithmic feed is the problem
The algorithmic feed exists mainly because there is too much content, so you will miss something anyway. Removing it will fix nothing for most people; it will only change what you miss. People used self-configured algorithms even before social media existed. The demand has always been there.
But of course, we could talk about the actual implementation and its dark leanings.
I thought the latest research had debunked this and showed that the _real_ source of conflict with social media is that people are forced out of their natural echo-chambers and exposed to opinions that they normally wouldn't have to contend with?
>Can we identify how to improve social media and create online spaces that are actually living up to those early promises of providing a public sphere where we can deliberate and debate politics in a constructive way?
They really pump up what is effectively a message board (Facebook, Twitter) or a video website with a comment/message feature (YouTube, TikTok) or an instant messenger with groups (WhatsApp). NONE OF THIS IS NEW.
It's very misguided to pretend that social media mobs could replace "the press". There is a reason the press exists in the first place: to inform critically, instead of relying on hearsay.
duxup•2d ago
But here's the thing ... people CHOOSE to engage with that, and users even produce that content for social media platforms for free.
It's hard to escape that part.
I remember trying Bluesky and while I liked it better than Twitter, for me it was disappointing that it was just Twitter, but different. Outlandish short posts, same lame jokes / pithy appeals to our emotions, and so on. People on there want to behave the same way they wanted to on Twitter.
PaulHoule•2d ago
duxup•2d ago
Facebook is not my page, it looks nothing like I want... my content is in many ways the least important thing featured.
derbOac•2d ago
I feel exactly the same way.
I think there needs to be a kind of paradigm shift into something different, probably something that people in general don't have a good schema for right now.
Probably something decentralized or federated is necessary in my opinion, something in between email and twitter or reddit? But there's always these chicken and egg issues with adoption, who are early adopters, how that affects adoption, genuine UX-type issues etc.
9rx•2d ago
So, Usenet? The medium is the message and all that, sure, but unless you change where the message originates you are ultimately going to still end up in the same place.
johnnyanmac•1d ago
notTooFarGone•2d ago
Brains are wired that way. Gossip and rage bait are not something people actively opt into; the pull is subconscious. It's odd to frame this as a problem of individuals - propaganda is effective precisely because people aren't consciously choosing to believe it.
prisenco•2d ago
At the scale we're operating, if only 1% is susceptible to these algorithms, that's enough to translate to noticeable social issues and second-order effects.
And it's not 1%.
KaiserPro•2d ago
Kinda, but they also don't really realise that they have much more control over the feed than they expect (in certain areas)
For the reel/TikTok/for-you Instagram feeds, it shows you subjects that you engage with. It will A/B test other subjects that similar people engage with. That's all it's doing: continual A/B testing to see if you like whatever flavour of bullshit is popular.
Most people don't realise that you can banish posts from your feeds by doing a long press "I don't like this" equivalent. It takes a few times for the machine to work out whether it's an account, a group of accounts, or a theme that you don't like, and it'll stop showing it to you. (Threads, for example, took a very long time to stop showing me fucking sports.)
Why don't more people know this? Because it hurts short-term metrics for whatever bollocks the devs are working on, so it's not that well advertised. Just think how unsuccessful the experiments in the Facebook app would have been if you were able to block the "other posts we think you might like" experiments. How sad Zuckerberg would be that his assertion was actually bollocks.
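A crude model of the mechanism described above (the threshold and source labels are illustrative assumptions, not how any real platform is implemented): each "I don't like this" accumulates against a source, and after a few strikes that source is suppressed from the feed.

```python
from collections import defaultdict

# "It takes a few times for the machine to work out" what you dislike.
SUPPRESS_AFTER = 3

dislikes: defaultdict[str, int] = defaultdict(int)

def report_dislike(source: str) -> None:
    """Record one long-press 'I don't like this' against a source."""
    dislikes[source] += 1

def feed(candidates: list[tuple[str, str]]) -> list[str]:
    """candidates are (source, post) pairs; drop suppressed sources."""
    return [post for source, post in candidates
            if dislikes[source] < SUPPRESS_AFTER]

# One dislike isn't enough; three in a row banishes the source.
for _ in range(3):
    report_dislike("sports")

posts = [("sports", "match recap"), ("tech", "new release")]
print(feed(posts))  # ['new release']
```

Real systems generalise across groups of accounts and themes rather than counting per-source strikes, but the user-visible behaviour is roughly this: repeated negative signals eventually remove a whole category from the feed.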
RiverCrochet•2d ago
To be fair, in times far past, you really didn't have much choice in TV or radio channels, and I suspect it's this demographic that tend to just scroll down Facebook and take what it gives without much thought other than pressing Like on stuff.
Mouvelie•2d ago
CrimsonCape•2d ago
pkamb•1d ago
KaiserPro•1d ago
In Instagram, it's very different.
First there is "snooze suggested content" which gives you a pure follow feed.
However, once you reach the end of that, you go into the "for you" feed, which has one "personality". Then there is the explore page, which has another "personality".
The new reels carousel stuff I think is possibly another personality.
So there are now three places where you need to yeet stuff you don't like.
I noticed that when the reels carousel was introduced they went heavy into thirst traps.
But again, this is a regulation issue. If this were the 1980s, there would be a moral panic causing something like the V-chip to stop "the youth" getting access to soft porn (not that it worked that well). Now it'll be an executive fatwa, which'll be reversed when he gets distracted by something else.
PaulHoule•2d ago
Once I get my database library reworked, a project I have in the queue is a classifier that filters out negative people, so I can speed-follow without adding a bunch of negativity to my feed. This way I get to enjoy real gems like
https://mas.to/@skeletor
Cross posting that would cure some of the ills of LinkedIn!
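As a toy stand-in for that classifier idea (a real version would use a trained sentiment model; the word list, threshold, and function names here are purely illustrative):

```python
# Hypothetical lexicon; a real classifier would learn this from data.
NEGATIVE_WORDS = {"hate", "awful", "doomed", "terrible", "worst"}

def negativity_score(posts: list[str]) -> float:
    """Fraction of an account's recent posts containing a negative word."""
    hits = sum(any(w in post.lower().split() for w in NEGATIVE_WORDS)
               for post in posts)
    return hits / len(posts) if posts else 0.0

def should_follow(posts: list[str], threshold: float = 0.5) -> bool:
    """Speed-follow filter: skip accounts above the negativity threshold."""
    return negativity_score(posts) < threshold

upbeat = ["great show tonight", "loving this library"]
gloomy = ["everything is doomed", "worst release ever", "nice day"]

print(should_follow(upbeat))  # True
print(should_follow(gloomy))  # False
```

The interesting design question is the threshold: set it too low and you filter out legitimate criticism along with the doomposting, which is presumably why this is a per-user preference rather than something a platform could impose globally.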
Scrounger•2d ago
FWIW, I've been consistently posting quality stuff on Bluesky for the last year, and despite having a few hundred followers, I get ZERO engagement.
People in the Bluesky subreddit tell me it's not a "post and ghost" platform in that you have to constantly interact with people if you want to earn engagement, but that's too time consuming.
In other words, the discovery algorithms on Bluesky suck.
immibis•2d ago
johnnyanmac•1d ago
Even if we believe that discoverability algorithms which don't chase "engagement" are respected, who would be more discoverable: the person showing off one high-quality article every 6 months, or the person writing weekly posts with some nuggets of information on the same topic?
Maybe your article goes viral, but odds are the weekly blogger will amass more followers, get more comments, and build up to the point where, 99% of the time, they get more buzz on their updates than the one-hit wonder.
avgDev•2d ago
It really isn't a choice. It is very accessible. Many friends are on social networks, and you slowly get sucked into shorts. Then it becomes an addiction as your brain craves the dopamine hits.
Similar to what Howard Moskowitz did with food.
pixl97•2d ago
johnnyanmac•1d ago
In the same way a smoker "chooses" to engage with cigarettes. Let's not underestimate the fact that core human programming is being exploited to enable such behavior. Just as we can't tell a smoker to "just put the cigarette down", we can't suddenly tell people on social media to "stop being angry".
>people on [BlueSky] want to behave the same way they wanted to on Twitter.
Yes. Changing established habits is even harder to address. You can lead a horse to water, but you can't make it drink (I'm sure anyone who has ever dealt with a disengaged captive audience feels this in their soul). While it has become many people's primary "news source", a.k.a. the bread, most people came there for the circus.
I don't really have an answer here. Society needs to understand social media addiction the same way it understands sugar addiction: hammer home that it's not healthy and should be used sparingly. That's not something you can fix with laws and regulation, and not something you fix in even a decade.
slightwinder•1d ago
Technically correct, but "choice" is very simplified here. The system is unable to understand WHY people engage with something, and in which way. That poisons the pool, and enforces certain content and types of presentation.