Not just due to astroturfing, either: their users are also just pretty crazy and exist in an extremely narrow bubble.
But now, even reading it is basically an exercise in fan-fiction curation. It's still mildly interesting to read things like r/pettyrevenge, or r/amitheasshole, but I fully recognize that they're the equivalent of watching a daytime soap - including the fact that most of them are fiction, and it's very hard to distinguish between real stories and ones made up by people or bots.
I'm not saying I like it or that I want it. But I'm not seeing how else to fix the problem.
It will also be necessary to enforce sanctions on people who violate the trust of a community. The stick needs to be equal to the task of driving people toward the carrot.
https://www.reuters.com/technology/artificial-intelligence/s...
1. Allowing multiple identities & pseudonyms - maybe I don't want my identity as @furry_mpreg_artist in one discord to be directly linked to @database_expert on StackOverflow or to @local_bicycle_repairperson on my local facebook group?
2. People obtaining these digital passports, and immediately giving/selling/losing them to some bot who can juggle those bits as easily as a human can.
3. What happens if you lose access to your passport? A toddler spills cranberry juice into your desktop, an adversary hacks your computer, you have a psychotic break and overwrite your hard drive platters with magnets & hammers?
Which is totally fair.
1. This is not a problem but a feature request that the proposal initially doesn't cover (i.e., no pseudonyms at first; see the sketch after this list for one way they could be layered on)
2. This is not a relevant problem (red herring) because stealing an identity is already illegal
3. The same password or identity-recovery steps as usual (see: FSAID)
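For what it's worth, point 1 isn't hard to square with a single verified credential, at least mechanically. A minimal sketch, assuming a hypothetical master secret that stays on the user's device (a real scheme would use anonymous credentials or blind signatures rather than this toy HMAC derivation):

    import hmac, hashlib

    # Hypothetical sketch: derive a stable per-service pseudonym from a single
    # master secret that never leaves the user's device. Each service sees only
    # its own handle; without the secret, the handles can't be linked together.
    def pseudonym(master_secret: bytes, service: str) -> str:
        return hmac.new(master_secret, service.encode(), hashlib.sha256).hexdigest()[:16]

    secret = b"kept-on-device-never-shared"           # made-up example value
    print(pseudonym(secret, "discord.example"))        # one handle on one site
    print(pseudonym(secret, "stackoverflow.example"))  # a different, unlinkable handle

The missing piece is letting each service verify that a handle is backed by exactly one passport without learning which one, which is what anonymous-credential schemes are for. And none of this touches point 2: nothing stops the secret holder from handing the secret to a bot.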
2. I disagree. If the point of the proposed solution is to prevent bots from impersonating humans, it's gotta have some mechanism to actually solve that problem - and if a human can hand over the keys to their identity to a bot, then the solution absolutely does not solve the problem. (And identity theft was only one of the ways a bot could get a human's identity.)
The contrast between large, mostly-anonymous communities like Reddit, HN, Bluesky, etc. and smaller group chats, forums and Discords where people actually know each other is stark. When a person you've played games and chatted with for years expresses a political opinion you disagree with, your reaction is very different from when some X/Bluesky user you've never seen before says that same political opinion in a reply to your (tw/sk)eet.
if you read one story in isolation, it seems fine. you'll think that it's a wild story (which is why it was worth sharing) that probably has a bit of dramatic license, but fine. but if you binge through the top posts, you'll start to notice all sorts of common tropes and patterns, and they all seem fake and formulaic. the stories are highly effective at hooking people in, so much so that there's a whole genre of cross-platform slop posts where somebody screenshots reddit threads into facebook to drive engagement.
but the posts don't feel bot/LLM-made; it does feel a lot like some group of people is mixing and matching different scenarios to do... something. and i want to know what that something is.
- is it purely an engagement farm (if so, what's the value in accruing non-transferrable reddit karma?)
- is it some sociological experiment that's trying to understand the reddit psyche?
- or is it that lots of different individuals genuinely just enjoy a creative writing challenge?
I've made stuff up and posted it. It's way too easy to troll those subreddits.
Would anyone dare even jokingly argue that reddit stands for free speech today? Reddit declined big time, but something happened to it that Digg never had: bots creating the illusion that the site isn't dying. This has obviously created the 'dead internet theory'.
Reddit's 'sociality' has been gone for many years. I don't know who operates the bots, but a few months ago when I checked, the majority of 'people' on reddit were bots.
Then again, it's no better on X or facebook. The bots have taken over, but for what goal? It's not immediately obvious to me. I don't even think it's intended to keep reddit alive. Law enforcement and government no doubt run a significant portion of the bots, seeking out speech criminals, which is fair. I've seen many arrest videos of people who had made death threats on social media.
Advertising for either products or political ideas
Judging by some of the examples listed by Reddit mods, it seems political astroturfing is a serious goal:
"(This is one of the AI comments used in the experiment.)
As a Palestinian, I hate Israel and want the state of Israel to end. I consider them to be the worst people on earth. I will take ANY ally in this fight."
These topics are fiercely guarded by bots on youtube comments, facebook comments, X, etc.