frontpage.

Cosmologically Unique IDs

https://jasonfantl.com/posts/Universal-Unique-IDs/
196•jfantl•3h ago•53 comments

Tailscale Peer Relays is now generally available

https://tailscale.com/blog/peer-relays-ga
263•sz4kerto•5h ago•116 comments

Zero-day CSS: CVE-2026-2441 exists in the wild

https://chromereleases.googleblog.com/2026/02/stable-channel-update-for-desktop_13.html
197•idoxer•5h ago•103 comments

DNS-Persist-01: A New Model for DNS-Based Challenge Validation

https://letsencrypt.org/2026/02/18/dns-persist-01.html
121•todsacerdoti•3h ago•51 comments

R3forth: A concatenative language derived from ColorForth

https://github.com/phreda4/r3/blob/main/doc/r3forth_tutorial.md
31•tosh•2h ago•2 comments

What is happening to writing? Cognitive debt, Claude Code, the space around AI

https://resobscura.substack.com/p/what-is-happening-to-writing
45•benbreen•6h ago•13 comments

The Perils of ISBN

https://rygoldstein.com/posts/perils-of-isbn
28•evakhoury•4h ago•6 comments

Metriport (YC S22) is hiring a security engineer to harden healthcare infra

https://www.ycombinator.com/companies/metriport/jobs/XC2AF8s-senior-security-engineer
1•dgoncharov•55m ago

Pocketbase lost its funding from FLOSS fund

https://github.com/pocketbase/pocketbase/discussions/7287
93•Onavo•5h ago•56 comments

If you’re an LLM, please read this

https://annas-archive.li/blog/llms-txt.html
686•soheilpro•14h ago•325 comments

Learning Lean: Part 1

https://rkirov.github.io/posts/lean1/
50•vinhnx•3d ago•6 comments

Portugal: The First Global Empire (2015)

https://www.historytoday.com/archive/first-global-empire
32•Thevet•14h ago•23 comments

Show HN: Rebrain.gg – Doom learn, don't doom scroll

8•FailMore•9h ago•0 comments

Terminals should generate the 256-color palette

https://gist.github.com/jake-stewart/0a8ea46159a7da2c808e5be2177e1783
445•tosh•15h ago•178 comments

A solver for Semantle

https://victoriaritvo.com/blog/semantle-solver/
16•evakhoury•2h ago•2 comments

Discrete Structures [pdf]

https://kyleormsby.github.io/files/113spring26/113full_text.pdf
26•mathgenius•2h ago•2 comments

Women's Sizing

https://pudding.cool/2026/02/womens-sizing/
5•zdw•37m ago•0 comments

What Every Experimenter Must Know About Randomization

https://spawn-queue.acm.org/doi/pdf/10.1145/3778029
19•underscoreF•2h ago•6 comments

There is unequivocal evidence that Earth is warming (2024)

https://science.nasa.gov/climate-change/evidence/
104•doener•1h ago•88 comments

Assigning Open Problems in Class

https://blog.computationalcomplexity.org/2026/02/assigning-open-problems-in-class.html
4•baruchel•2d ago•0 comments

Delphi is 31 years old – innovation timeline

https://blogs.embarcadero.com/delphi-innovation-timeline-31st-anniversary-edition-published-get-y...
48•andsoitis•5d ago•16 comments

Cistercian Numbers

https://www.omniglot.com/language/numbers/cistercian-numbers.htm
45•debo_•5h ago•7 comments

Show HN: VectorNest responsive web-based SVG editor

https://ekrsulov.github.io/vectornest/
59•ekrsulov•6h ago•22 comments

Garment Notation Language: Formal descriptive language for clothing construction

https://github.com/khalildh/garment-notation
121•prathyvsh•6h ago•34 comments

The true history of the Minotaur: what archaeology reveals

https://www.nationalgeographic.fr/histoire/la-veritable-histoire-du-minotaure-ce-que-revele-arche...
27•joebig•3d ago•10 comments

SkyRL brings Tinker to your GPUs (2025)

https://novasky-ai.notion.site/skyrl-tinker
20•robertnishihara•5d ago•1 comment

Show HN: Formally verified FPGA watchdog for AM broadcast in unmanned tunnels

https://github.com/Park07/amradio
50•anonymoosestdnt•6h ago•15 comments

Show HN: CEL by Example

https://celbyexample.com/
60•bufbuild•7h ago•31 comments

Fastest Front End Tooling for Humans and AI

https://cpojer.net/posts/fastest-frontend-tooling
89•cpojer•10h ago•45 comments

Native FreeBSD Kerberos/LDAP with FreeIPA/IDM

https://vermaden.wordpress.com/2026/02/18/native-freebsd-kerberos-ldap-with-freeipa-idm/
101•vermaden•11h ago•49 comments

The political effects of X's feed algorithm

https://werd.io/the-political-effects-of-xs-feed-algorithm/
69•benwerd•1h ago

Comments

rbanffy•1h ago
And this is why the price for Twitter was, in the end, remarkably low.
cowpig•1h ago
dark
piloto_ciego•1h ago
Yeah, this was always the play in hindsight. Like, I didn't get it: "why would you pay that kind of money for a web forum?!" But it wasn't the forum that was important. Twitter (for better or worse) has wormed its way into the fabric of American discourse. He was basically buying the ideological thermostat for the country and turning the dial to the right.
dlev_pika•54m ago
This is even worse outside of NA. In many countries it is the de facto communication channel of government and businesses.
RetpolineDrama•47m ago
Or from the other perspective: Meta and Google have had their finger on the scale for more than a decade (along with old Twitter).

In Twitter's case, you had regime officials directing censorship illegally through open emails and meetings.

It's no surprise that the needle moves right when you dial back the suppression of free expression even a little bit (X still censors plenty).

dylan604•40m ago
How is it illegal? It is their platform to do what they want with it. You can disagree and not use it, but it is theirs to do with as they see fit. If this was a government run operation paid for with tax dollars, then it would be an issue.
0ckpuppet•42m ago
As opposed to the government funding turning it to the reality-bending left? There was direct communication from Senators and members of Congress directing Twitter to block and ban based on certain topics. And Twitter obliged.
barfiure•1h ago
This person is confused. Trump was a well known pussy grabber for decades. Epstein was anything but a secret, it seems, given how many politicians and celebrities and moguls he rubbed elbows with. Jerry stopping by the island for a lemonade and a spot of lunch with his high school aged girls? Yeah.

It comes down to this: you can have visibility into things and yet those in power won’t care whatsoever what you may think. That has always been the case, it is the case now, and will continue to be in the future.

jongjong•1h ago
This is a defeatist attitude. Don't know what bubble you're in but these official revelations are driving real change in mine. It's kind of subtle at this point but it's the kind of change that cannot be undone.
Herring•35m ago
Unfortunately speed often matters when it comes to outcomes. Eg if you get a cancer diagnosis like Jobs, you probably shouldn’t waste a year drinking juices and doing acupuncture.
arwhatever•1h ago
I deleted my account after many years when X recently made the Chronological Feed setting ephemeral, defaulting back to the Algorithmic Feed each time the page is refreshed.

No way I'm going to let that level of outrage-baiting garbage even so much as flash before my eyes.

socalgal2•48m ago
I just click "following" at the top and never see anything I didn't ask to see. It resets once every few months to the other tab which I assume is just the cookie setting expiring.
dagelf•32m ago
Train it: I just have to spend 3 minutes every other year to tap the 3 dots on every post and choose "Not Interested", for an epic feed unmatched anywhere.
quirkot•9m ago
Train the algorithm so that you can be the sort of product you want to see in the world
mikepurvis•1h ago
"We need more funding into open protocols that decentralize algorithmic ownership; open platforms that give users a choice of algorithm and platform provider; and algorithmic transparency across our information ecosystem."

This sounds like a call to separate the aggregation step from the content. Reasonable enough, but does it really address the root cause? Aren't we just as polarized in a world where there are dozens of aggregators into the same data and everyone picks the one that most indulges their specific predilections for engagement, rage, and clicks?

What does "open" really buy you in this space?

Don't get me wrong, I want this figured out too, and maybe this is a helpful first step on the way to other things, but I'm not quite seeing how it plays out.

bee_rider•49m ago
I’d hope people wouldn’t intentionally pick the political extremism feed if they had any other option (although it’s hard to say).
tarxvf•31m ago
From where I'm sitting, it seems obvious people do exactly that.
Aurornis•1h ago
I don’t know if I buy the explanation that this was due to the feed algorithm. It looks like an artifact of being exposed to X’s current user base instead of their old followers. When Twitter switched to X there was a noticeable shift in the average political leanings of the platform toward alignment with Musk, as many left-leaning people abandoned the platform for Bluesky, Mastodon, and Threads.

So changing your feed to show popular posts on the platform instead of just your friends’ Tweets would be expected to shift someone’s intake toward the average of the platform.
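
A minimal sketch of the distinction being drawn here, in illustrative Python (not anything X actually runs): a chronological feed samples only the accounts you chose to follow, while an engagement-ranked feed samples the whole platform, so what you see drifts toward the platform-wide average.

    from dataclasses import dataclass

    @dataclass
    class Post:
        author: str
        timestamp: float   # seconds since epoch
        engagement: int    # likes + reposts + replies, platform-wide

    def chronological_feed(posts, followed, limit=50):
        """Only accounts you follow, newest first."""
        mine = [p for p in posts if p.author in followed]
        return sorted(mine, key=lambda p: p.timestamp, reverse=True)[:limit]

    def for_you_feed(posts, limit=50):
        """Whole platform, ranked by engagement: the feed's average lean
        converges on the platform average, not on who you follow."""
        return sorted(posts, key=lambda p: p.engagement, reverse=True)[:limit]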

SecretDreams•1h ago
Is this the result of a feedback loop from Musk joining, or did his joining just accelerate the overall decline of the platform? Some might say it was going this way even before he picked it up, but either way it was certainly an inflection point when he joined.

All modern social media is pretty toxic to society, so I don't participate. Even HN/Reddit is borderline. Nothing is quite as good as the irc and forum culture of the 2000s where everyone was truly anonymous and almost nobody tied any of their worth to what exchanges they had online.

bpodgursky•55m ago
The moderation changes absolutely changed posting behavior. People got banned for even faintly gesturing the wrong direction on many issues and it frightened large accounts into toeing the line.
tokyobreakfast•43m ago
> Even HN/Reddit is borderline.

It's the proliferation of downvoting. It disincentivizes speaking your honest opinion and artificially boosts mass-appeal ragebait.

It's detrimental to having organic conversations.

"But the trolls" they say.

In practice it's widely abused.

Using HN as an example, there are legitimate textbook opinions that will boost your comment to the top, and ones that will quickly sink to the bottom and often be flagged away for disagreement. Ignoring obvious spam which is noise, there is no correlation to "right" or "wrong".

That's one advantage old-school discussion forums and imageboards have. Everyone there and all comments therein are equally shit. No voting with the tribe to reinforce your opinion.

What's worse is social media allowed the mentally ill to congregate and reinforce their own insane opinions with plenty of upvotes, which reinforces their delusions as a form of positive feedback. When we wonder aloud how things have become more radicalized in the last 20 years — that's why. Why blame the users when you built the tools?

SecretDreams•30m ago
I like voting (up and down) but I also agree with your take. Reddit salts the votes, but maybe the solution is to allocate a limited number of votes (up or down) that a user can use weekly. Make it so that when you are voting, it's much more meaningful and truly reflects an opinion you either really agree with or really do not agree with.

Ultimately, I think it comes back to people valuing their online personas way too much, and this is something we've intentionally marched towards.
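
A quick sketch of the weekly vote-budget idea above, in hypothetical Python (made-up numbers and names, not any site's real API):

    from collections import defaultdict

    WEEKLY_BUDGET = 20  # assumed cap on votes (up or down) per user per week

    votes_spent = defaultdict(int)  # (user_id, week) -> votes used so far

    def try_vote(user_id, week, post_scores, post_id, direction):
        """Accept a vote only while the user has budget left this week.
        direction is +1 (upvote) or -1 (downvote)."""
        if votes_spent[(user_id, week)] >= WEEKLY_BUDGET:
            return False  # budget exhausted, vote rejected
        votes_spent[(user_id, week)] += 1
        post_scores[post_id] = post_scores.get(post_id, 0) + direction
        return True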

excalibur•52m ago
I don't know what changes have been made more recently, but I know there was a definite change to the Twitter algorithm a few months ago that filled the feeds of conservatives with posts from liberals and vice versa. It seemed to be specifically engineered to provoke conflict.
noelsusman•44m ago
I'm not sure what your point is. How is "being exposed to X's current user base instead of their old followers" not equivalent to "turning on the feed algorithm"? You doubt the effect is due to the algorithm, but your alternative explanation describes exactly what the algorithm does.
jongjong•1h ago
Oh my. Now that X is affecting people's politics (for the better IMO), suddenly people care about the influence of algorithms over politics...
beanjuiceII•46m ago
Yep, but you won't find common sense on the matter here, unfortunately.
GorbachevyChase•24m ago
I am shocked, shocked to find that there is social engineering in this establishment!
fluoridation•1h ago
I honestly don't understand how or why people are using Twitter to keep up with the news. The only thing I use it for is to follow artists, and even that has been going down in recent weeks with most of my favorites moving over to BlueSky. Maybe I'm just a long-winded idiot, but the character limits barely let me have a conversation on either platform. How are people consuming news like this?

It just baffles me how different my experience of using the platform is. I literally do not see any news. I'm not entirely convinced that it's Twitter being biased and not just giving each person what they most engage with.

rishabhaiover•1h ago
It used to be about self-expression in an oddly entertaining way, but that Nikita Bier ruined the whole thing with his metrics-chasing algorithmic shifts.
joe_mamba•1h ago
>I honestly don't understand how or why people are using Twitter to keep up with the news.

Because the MSM news stations themselves pick up the stuff from Twitter and just add their own spin. A dozen phone videos from random citizens on-site are always quicker than the time it takes CNN/FOX to send a reporter there. On Twitter you at least get the raw footage and can judge for yourself before the MSM tries to turn it political to rage-bait you.

stevage•56m ago
You follow artists, and they are not tweeting their political opinions? Cool.

I gave up on Twitter when everyone I followed kept adding politics. Even if I agreed with it, I just don't want to marinate in the anger all day.

fluoridation•47m ago
The one exception I can think of is the guy from Technology Connections, whom I stopped following because I got tired of seeing him in my feed complaining about something or other. And I've noticed he's been putting that into his videos too, so I might have to unfollow him on YouTube as well.
Flere-Imsaho•49m ago
My feed (UK based) seems to give me the major news stories well before the mainstream (BBC), and I'm talking days if not weeks in some cases. Now, it could be that's how the mainstream decides to cover a particular story? What's worrying is when a story is all over X but isn't covered at all.

To give an example, the recent protests in Iran were being covered on X while the BBC was silent for weeks before finally covering the story (for a few days).

dylan604•36m ago
Could it also be that "mainstream" news outlets are actually trying to verify information and/or obtain confirmation from other sources? All of that is done in an attempt to avoid promoting false information. People tweeting do none of that.
ppeetteerr•1h ago
Why anyone is still using X after 2025 is a mystery (I know, it's where everyone is, but the moral implications are wild)
spankalee•1h ago
Seriously. The CEO is openly posting white supremacist content like it's Stormfront. If you don't support that, you should get out.
Herring•50m ago
I don’t know which country you’re in, but in the US Trump won the popular vote. Plenty of people here are perfectly happy with Stormfront.
spankalee•41m ago
I think the idea that if you don't support white supremacy you should get off the site owned and run by a clear white supremacist applies regardless of how elections go.
daveguy•35m ago
Less than you might think.

He didn't win a majority of the vote, just a plurality. And less than 2 of 3 eligible voters actually voted. So he got about 30% of the eligible population to vote for "yay grievance hate politics!", which is way more than it should be, but a relatively small minority compared to the voter response after all ambiguity about the hate disappeared. This is why there's been a 20+ point swing in special election outcomes since Trump started implementing all the incompetent, corrupt, racist asshattery.

apparent•13m ago
"If everyone had voted, Trump still would have won" (by an even wider margin)

https://www.npr.org/2025/06/26/nx-s1-5447450/trump-2024-elec...

haunter•47m ago
Live updates for sports events. People post highlights and replays before anyone else.
dagelf•28m ago
I didn't get it either until I trained the algorithm to feed me what I want by just clicking the three dots and selecting Not Interested on anything I never wanted to see again... it listens, and what's left is really unmatched anywhere. I've really looked, and occasionally still do out of curiosity.
apparent•14m ago
Lots of info is shared there first. It shows up in news articles and podcasts 12-24 hours later. Not everything shared there is true, of course, so one has to do due diligence. But it definitely surfaces content that wouldn't show up if I just read the top 2-3 news websites.
jmugan•1h ago
Oddly enough, X is the only platform I've been able to teach not to show me culture war stuff, from either side. It just shows me AI in the "For You."
kypro•39m ago
The uncomfortable truth behind most "the algorithm is biased" takes is that we humans are far more politically biased than the algorithms, and we're probably 90% to blame.

I'm not saying there is no algorithmic bias, and I tend to agree the X algorithm has a slight conservative bias, but for the most part the owners of these sites care more about keeping your attention than trying to get you to vote a certain way. Therefore if you're naturally susceptible to culture war stuff, and this is what grabs your attention, it's likely the algorithm will feed it to you.

But this is a far broader problem. These are the types of people who might have watched politically biased cable news in the past, or read politically biased newspapers before that.

quirkot•10m ago
The issue brought up in the article isn't that "the algorithm is biased" but that "the algorithm causes bias". A feed could perfectly alternate between position A and position B and show no bias at all, but still select more incendiary content on topic A and drive bias towards or away from it.
PaulHoule•38m ago
I've been pretty consistent about telling Bluesky I want to see less of anything political and also disciplined about not following anybody who talks about Trump or gender or how anybody else is causing their problems. I see very little trash.
jmugan•31m ago
Maybe it has gotten better recently. I tried and tried with Bluesky, but it would not abide.
guywithahat•38m ago
I have the same thought, my X algo has become less political than HackerNews. I suppose it depends on how you use it but my feed is entirely technical blogs, memes, and city planning/construction content
01HNNWZ0MV43FF•57m ago
You have to find good people. Bad people will find you.
cyrusradfar•54m ago
Lovely thought Ben. Good to hear from you!

I spent a lot of my life and money thinking about building better algorithms (over five years).

We have a bit of a chicken/egg problem. Is it the algorithm or is it the preference of the Users which is the problem?

I'd argue the latter.

What I learned which was counter-intuitive was that the vast majority of people aren't interested in thinking hard. This community, in large part, is an exception where many members pride themselves on intellectually challenging material.

That's not the norm. We're not the norm.

My belief that every human was by their nature "curious" and wanting to be engaged deeply was proven false.

This isn't to claim that incuriosity is our nature, but when testing with huge populations in the US (specifically), that's not how adults are.

The problem, to me, is deeper and is rooted in our education and work systems that demand compliance over creativity. Algorithms serve what Users engage with; if the Users were no longer interested in ragebait and clickbait and instead focused on thoughtful content, the algorithms would adapt.

socalgal2•49m ago
It is not the norm here either.
stetrain•40m ago
> Is it the algorithm or is it the preference of the Users which is the problem? I'd argue the latter.

> Algorithms serve what Users engage with

User engagement isn't actually the same thing as user preference, even though I think many people and companies take the shortcut of equating the two.

People often engage more with things they actually don't like, and which create negative feelings.

These users might score higher on engagement metrics when fed this content, but actually end up leaving the platform or spending less time there, or would at least answer in a survey question that they don't like some or most of the content they are seeing.

This is a major reason I stopped using Threads many months ago. Their algorithm is great at surfacing posts that make me want to chime in with a correction, or click to see the rest of the truncated story. But that doesn't mean I actually liked that experience.
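
A toy illustration of the gap described here, with made-up numbers: ranking purely by engagement can promote exactly the items users report disliking.

    items = [
        {"title": "calm explainer",   "clicks": 3,  "reported_liking": 0.9},
        {"title": "rage bait",        "clicks": 11, "reported_liking": 0.2},
        {"title": "truncated teaser", "clicks": 8,  "reported_liking": 0.3},
    ]

    by_engagement = max(items, key=lambda i: i["clicks"])           # -> "rage bait"
    by_preference = max(items, key=lambda i: i["reported_liking"])  # -> "calm explainer"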

cyrusradfar•11m ago
> People often engage more with things they actually don't like, and which create negative feelings.

Do you think this is innate or learned? And, in either case, can it be unlearned?

prometheus76•37m ago
All you need to do is read the other comments on this very page and you will see that there are very strict cultural and political norms here too, but for some reason they are invisible as such to those who hold them. They consider their views to be "common knowledge" and "what any reasonable person believes" because they, too, live in curated bubbles.

Any comment that challenges mainstream science, materialism/physicalism, and leftist politics gets downvoted into oblivion here because HN is definitely not a haven for people who "pride themselves on intellectually challenging material."

TL;DR: It's an echo chamber here, too, but most people who hold the worldview that is enforced here often cannot see their own presuppositions, nor do they see that their views are political in nature.

sonofhans•34m ago
I’d expect this to be down-voted too. It has nothing to do with the article and makes no concrete claims. It’s easy to slag anything in vague terms, but it adds nothing to this discussion.
rl3•24m ago
>Any comment that challenges mainstream science ...

Stupid mainstream science.

>... and leftist politics ...

>... nor do they see that their views are political in nature.

You don't say. Personally, I respect comments that prove their own claims.

sonofhans•35m ago
> Algorithms serve what Users engage with; if the Users were no longer interested in ragebait and clickbait and instead focused on thoughtful content, the algorithms would adapt.

Algorithms have been adapted; they are successful at their goals. We’ve put some of the smartest people on the planet on this problem for the last 20 years.

Humans are notoriously over-sensitive to threats; we see them where they barely exist, and easily overreact. Modern clickbait excels at presenting mundane information as threatening. Of course this attracts more attention.

Also, loud noises attract more attention than soft noise. This doesn’t mean that humans prefer an environment full of loud noises.

rl3•34m ago
>This community, in large part, is an exception where many members pride themselves on intellectually challenging material.

> That's not the norm. We're not the norm.

I recommend against putting HN on a pedestal. It just leads to disappointment.

cyrusradfar•19m ago
It's true -- I do enjoy this community even though it's failed to serve my every thought with the love that I surely deserve!
wtp1saac•54m ago
It is interesting to see a general bias come out of the study, which I wouldn't necessarily have guessed from my own experience. My X "For You" feed mostly does not read pro-Trump - instead it mostly pushes very intense pro-European and pro-Canadian economic and political separation from the USA, and very negative narratives about the USA - although I suppose it occasionally also introduces pro-Trump posts, and perhaps those do not sway me in the same way given that I am a progressive American.

That said, the Trending tab does tend to push a very heavy MAGA-aligned narrative, in a way that to me just seems comical, but I suppose there must be people that genuinely take it at face value, and maybe that does push people.

Less to do with the article:

The more I think about it, the less sure I am why I use X these days, other than the fact that I don't really have much of an in-person social life outside of work. Sometimes it can be enjoyable, but honestly the main takeaway I have is that microblogging as a format is genuinely terrible, and X in particular does seem to just feed you the angriest things possible. Maybe it's exciting to try to discuss opinions, but it's simultaneously hardly possible to have a nuanced or careful discussion when you have limited characters and someone on the other end who just wants to shout over you.

I miss being a kid and going onto forums like the ones for Scratch or Minecraft or whatever. The internet felt way more fun when it was just making cool things and chatting with people about them. I think the USA sort of felt more that way too, but it's hard to know if that was just my privilege. When I write about X, it uncomfortably parallels how my interactions with my family and friends in real life have evolved.

kettlecorn•51m ago
Underrated in X's changes is how blue checkmark users are shown first underneath popular tweets. Most people who pay for blue checkmarks are either sympathetic to Musk's ideology or indifferent. Many blue checkmark users are there to make money from engagement.

The result is that underneath any tweet that gets traction you will see countless blue-checkmark users either saying something trolling for their side or engagement-baiting.

The people who are more ideologically neutral or not aligned with Musk are completely drowned out below the hundreds of bulk replies of blue checkmarks.

It used to be that if you saw someone, like a tech CEO, take an interesting position, you'd have a varied and interesting discussion in the replies. The algorithm would show you replies in particular from people you follow, and often you'd see some productive exchange that actually mattered. Now it's almost entirely drivel, and you have to scroll through rage bait and engagement slop before getting to the crumbs of meaningful exchange.

It has had a chilling effect on productive intellectual conversation while also accelerating the polarization of the platform by scaring away many people who care about measured conversation.
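
One guess at the mechanism described above, as a hedged sketch rather than X's actual code: give paying verified accounts a large fixed boost when sorting replies, and unverified replies sink below the bulk replies regardless of quality.

    from dataclasses import dataclass

    @dataclass
    class Reply:
        author: str
        verified: bool     # paying blue-checkmark account
        engagement: int

    VERIFIED_BOOST = 1000  # assumed weight; any large constant has the same effect

    def rank_replies(replies):
        """Sort by engagement plus a flat boost for verified accounts."""
        return sorted(
            replies,
            key=lambda r: r.engagement + (VERIFIED_BOOST if r.verified else 0),
            reverse=True,
        )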

bool3max•49m ago
I automatically tune out any blue checkmark post or reply and just assume it's an LLM responding to earn $.003
kypro•46m ago
I really wish these points were made in a non-political, platform-agnostic way, because if you care about this issue it's ultimately unhelpful to frame it as if it is an issue with just X or conservatives, given how politically divided people are.

I do share the author's concerns, and I was also concerned back in the day when Twitter was quite literally banning people for posting the wrong opinions there. But it's interesting how the people who used to complain about political bias now seem not to care, and the people who argued "Twitter is a private company, they can do what they want" suddenly think the conservative-leaning algorithm now on X is a problem. It's hard to get people across political lines to agree when we do this.

In my opinion there are two issues here, and neither is politically partisan.

The first is that we humans are flawed and algorithms can use our flaws against us. I've repeatedly spoken about how much I love YouTube's algorithm because, despite some people saying it's an echo chamber, I think it's one of the few recommendation algorithms which will serve you a genuinely diverse range of content. But I suspect that's because I genuinely like consuming a very wide range of political content, and I know I'm in a minority there (probably because I'm interested in politics as a meta subject, but don't have strong political opinions myself). But my point is these algorithms can work really well if you genuinely want to watch a diverse range of political content.

Secondly, some recommendation algorithms (and search algorithms) seem to be genuinely biased, which I'd argue isn't a problem in itself (they are private companies and can do what they want), but that bias isn't transparent. X very clearly has a conservative bias and Bluesky also very clearly has a political bias. Neither would admit its bias, so people incorrectly assume they're being served something which is fairly representative of public opinion rather than curated – either by moderation or algorithm tweaks.

What we need is honesty, both from individuals, who are themselves seeking out their own bias, and from platforms, which pretend not to have bias but do, and therefore influence where people believe the center ground is.

We can all be more honest with ourselves. If you exclusively use X or Bluesky, it's worth asking why that is, especially if you're engaging with political content on these platforms. But secondly, I think we do need more regulation around the transparency of algorithms. I don't necessarily think it's a problem if some platform recommends certain content above other content, or has some algorithm to ban users who post content it doesn't like, but these decisions should be far more transparent than they are today, so people are at least able to feed that into how they perceive the neutrality of the content they're consuming.

ChrisArchitect•43m ago
Earlier source: https://www.nature.com/articles/s41586-026-10098-2 (https://news.ycombinator.com/item?id=47064130)
apparent•40m ago
What does it mean to have someone on a chronological feed, versus the algorithmic one? Does that mean a chronological feed of the accounts they follow? I hardly ever use that, since I don't follow many people, and some people I follow post about lots of stuff I don't care about.

from the study:

> We assigned active US-based users randomly to either an algorithmic or a chronological feed for 7 weeks
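
The assignment described in the study amounts to a per-user coin flip held fixed for the 7 weeks; a minimal sketch under that reading (illustrative names, not the authors' code):

    import hashlib

    def assign_condition(user_id, salt="feed-study"):
        """Stable per-user coin flip: 'algorithmic' or 'chronological'."""
        digest = hashlib.sha256(f"{salt}:{user_id}".encode()).digest()
        return "algorithmic" if digest[0] % 2 == 0 else "chronological"

    assignments = {u: assign_condition(u) for u in ["u1", "u2", "u3"]}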

jmyeet•38m ago
I blame Google for a lot of this. Why? Because they more than anyone else succeeded in spreading the propaganda that "the algorithm" was some unbiased, even all-knowing, black box with no human influence whatsoever. They did this for obvious self-serving reasons, to defend how Google properties ranked in search results.

But now people seem to think newsfeeds, which increase the influence of "the algorithm", are just a result of engagement and (IMHO) nothing could be further from the truth.

Factually accurate and provable statements get labelled "misinformation" (either by human intervention or by other AI systems ostensibly created to fight misinformation) and thus get lower distribution. All while conspiracy theories get broad distribution.

Even ignoring "misinformation", certain platforms will label some content as "political" and other content as not when a "political" label often comes down to whether or not you agree with it.

One of the most laughable incidents of putting a thumb on the scale was when Grok started complaining about white genocide in South Africa in completely unrelated posts [1].

I predict a coming showdown over Section 230 about all this. Briefly, S230 establishes a distinction between being a publisher (eg a newspaper) and a platform (eg Twitter) and gives platforms broad immunity from liability for user-generated content. This was, at the time (the 1990s), a good thing.

But now we have a third option: social media platforms have become de facto publishers while pretending to be platforms. How? Ranking algorithms, recommendations and newsfeeds.

Think about it this way: imagine you had a million people in an auditorium and you were taking audience questions. What if you only selected questions that were supportive of the government or a particular policy? Are you really a platform? Or are you selecting user questions to pretend something has broad consensus or to push a message compatible with the views of the "platform's" owner?

My stance is that if you, as a platform, actively suppress and promote content based on politics (as IMHO they all do), you are a publisher, not a platform, in the Section 230 sense.

[1]: https://www.theguardian.com/technology/2025/may/14/elon-musk...

dagelf•35m ago
There's much more diversity of thought on the right; did they get more open-minded?
ortusdux•23m ago
There was a great study from a decade ago showing that baseball cards held by lighter-skinned hands outsold cards held by darker-skinned hands on eBay.

An algorithm designed today with the goal of helping users pick the most profitable product photo would probably steer people towards using caucasian models, and because eBay's cut is a percentage, they would be incentivized to use it.

Studies show that conservatives tend to respond more positively to sponsored content. If this is true, algorithm-driven ad-sponsored social sites will tend towards conservative content.

https://onlinelibrary.wiley.com/doi/abs/10.1111/1756-2171.12...

https://www.tandfonline.com/doi/full/10.1080/00913367.2024.2...