
Claude Sonnet 4 now supports 1M tokens of context

https://www.anthropic.com/news/1m-context
890•adocomplete•9h ago•492 comments

Search all text in New York City

https://www.alltext.nyc/
63•Kortaggio•1h ago•14 comments

Scapegoating the Algorithm

https://asteriskmag.com/issues/11/scapegoating-the-algorithm
33•fmblwntr•2h ago•16 comments

Ashet Home Computer

https://ashet.computer/
189•todsacerdoti•6h ago•41 comments

Show HN: Building a web search engine from scratch with 3B neural embeddings

https://blog.wilsonl.in/search-engine/
328•wilsonzlin•9h ago•57 comments

Journaling using Nix, Vim and coreutils

https://tangled.sh/@oppi.li/journal
76•icy•11h ago•23 comments

A gentle introduction to anchor positioning

https://webkit.org/blog/17240/a-gentle-introduction-to-anchor-positioning/
41•feross•3h ago•10 comments

Training language models to be warm and empathetic makes them less reliable

https://arxiv.org/abs/2507.21919
206•Cynddl•12h ago•210 comments

Show HN: Omnara – Run Claude Code from anywhere

https://github.com/omnara-ai/omnara
207•kmansm27•9h ago•100 comments

Multimodal WFH setup: flight SIM, EE lab, and music studio in 60sqft/5.5M²

https://www.sdo.group/study
181•brunohaid•3d ago•78 comments

Blender is Native on Windows 11 on Arm

https://www.thurrott.com/music-videos/324346/blender-is-native-on-windows-11-on-arm
115•thunderbong•3d ago•42 comments

AI Eroded Doctors' Ability to Spot Cancer Within Months in Study

https://www.bloomberg.com/news/articles/2025-08-12/ai-eroded-doctors-ability-to-spot-cancer-within-months-in-study
30•zzzeek•58m ago•19 comments

The Missing Protocol: Let Me Know

https://deanebarker.net/tech/blog/let-me-know/
75•deanebarker•5h ago•52 comments

WHY2025: How to become your own ISP [video]

https://media.ccc.de/v/why2025-9-how-to-become-your-own-isp
93•exiguus•8h ago•13 comments

Launch HN: Design Arena (YC S25) – Head-to-head AI benchmark for aesthetics

61•grace77•9h ago•23 comments

LLMs aren't world models

https://yosefk.com/blog/llms-arent-world-models.html
225•ingve•2d ago•115 comments

Go 1.25 Release Notes

https://go.dev/doc/go1.25
111•bitbasher•4h ago•10 comments

Why are there so many rationalist cults?

https://asteriskmag.com/issues/11/why-are-there-so-many-rationalist-cults
383•glenstein•10h ago•584 comments

The Equality Delete Problem in Apache Iceberg

https://blog.dataengineerthings.org/the-equality-delete-problem-in-apache-iceberg-143dd451a974
42•dkgs•7h ago•21 comments

RISC-V single-board computer for less than 40 euros

https://www.heise.de/en/news/RISC-V-single-board-computer-for-less-than-40-euros-10515044.html
126•doener•4d ago•72 comments

Debian GNU/Hurd 2025 released

https://lists.debian.org/debian-hurd/2025/08/msg00038.html
180•jrepinc•3d ago•93 comments

Visualizing quaternions, an explorable video series

https://eater.net/quaternions
3•uncircle•3d ago•0 comments

Dumb to managed switch conversion (2010)

https://spritesmods.com/?art=rtl8366sb&page=1
34•userbinator•3d ago•15 comments

Weave (YC W25) is hiring a founding AI engineer

https://www.ycombinator.com/companies/weave-3/jobs/SqFnIFE-founding-ai-engineer
1•adchurch•8h ago

Fixing a loud PSU fan without dying

https://chameth.com/fixing-a-loud-psu-fan-without-dying/
14•sprawl_•3d ago•15 comments

Galileo’s telescopes: Seeing is believing (2010)

https://www.historytoday.com/archive/history-matters/galileos-telescopes-seeing-believing
14•hhs•3d ago•4 comments

Nexus: An Open-Source AI Router for Governance, Control and Observability

https://nexusrouter.com/blog/introducing-nexus-the-open-source-ai-router
81•mitchwainer•11h ago•21 comments

Australian court finds Apple, Google guilty of being anticompetitive

https://www.ghacks.net/2025/08/12/australian-court-finds-apple-google-guilty-of-being-anticompetitive/
322•warrenm•12h ago•119 comments

How to safely escape JSON inside HTML SCRIPT elements

https://sirre.al/2025/08/06/safe-json-in-script-tags-how-not-to-break-a-site/
69•dmsnell•4d ago•40 comments

Comparing baseball greats across eras, who comes out on top?

https://phys.org/news/2025-07-baseball-greats-eras.html
6•PaulHoule•2d ago•13 comments

Why are there so many rationalist cults?

https://asteriskmag.com/issues/11/why-are-there-so-many-rationalist-cults
383•glenstein•10h ago

Comments

api•10h ago
Why are there so many cults? People want to feel like they belong to something, and in a world in the midst of a loneliness and isolation epidemic the market conditions are ideal for cults.
iwontberude•10h ago
Your profile says that you want to keep your identity small, but you have like over 30 thousand comments spelling out exactly who you are and how you think. Why not shard accounts? Anyways. Just a random thought.
keybored•10h ago
[deleted]
shadowgovt•10h ago
"SC identity?"
mindslight•10h ago
Also, who would want to join an "irrationalist cult" ?
shadowgovt•10h ago
Hey now, the Discordians have an ancient and respectable tradition. ;)
NoGravitas•9h ago
Five tons of flax!
FuriouslyAdrift•10h ago
Because we are currently living in an age of narcissism, and tribalism/identitarianism is the societal version of narcissism.
khazhoux•7h ago
> Because we are currently living in an age of narcissism and tribalism

I've been saying this since at least 1200 BC!

shadowgovt•10h ago
The book Imagined Communities (Benedict Anderson) touches on this, making the case that in modern times, "nation" has replaced the cultural narrative purpose previously held by "tribe," "village," "royal subject," or "religion."

The shared thread among these is (in ever widening circles) a story people tell themselves to justify precisely why, for example, the actions of someone you'll never meet in Tulsa, OK have any bearing whatsoever on the fate of you, a person in Lincoln, NE.

One can see how this leaves an individual in a tenuous place if one doesn't feel particularly connected to nationhood (one can also see how being too connected to nationhood, in an exclusionary way, can also have deleterious consequences, and how not unlike differing forms of Christianity, differing concepts on what the 'soul' of a nation is can foment internal strife).

(To be clear: those fates are intertwined to some extent; the world we live in grows ever smaller due to the power of up-scaled influence of action granted by technology. But "nation" is a sort of fiction we tell ourselves to fit all that complexity into the slippery meat between human ears).

ameliaquining•9h ago
The question the article is asking is "why did so many cults come out of this particular social milieu", not "why are there a lot of cults in the whole world".
optimalsolver•10h ago
Pertinent Twitter comment:

"Rationalism is such an insane name for a school of thought. Like calling your ideology correctism or winsargumentism"

https://x.com/growing_daniel/status/1893554844725616666

nyeah•10h ago
Great names! Are you using them, or are they available? /s
wiredfool•10h ago
Objectivism?
hn_throwaway_99•10h ago
To be honest, I don't understand that objection. If you strip away all the culty sociological effects, one of the original ideas of rationalism was to use logical reasoning and statistical techniques to explicitly avoid the pitfalls of known cognitive biases. Given that foundational tenet, "rationalism" seems like an extremely appropriate moniker.

I fully accept that the rationalist community may have morphed into something far beyond that original tenet, but I think rationalism just describes the approach, not that it's the "one true philosophy".

ameliaquining•10h ago
That it refers to a different but confusingly related concept in philosophy is a real downside of the name.
nyeah•9h ago
I'm going to start a group called "Mentally Healthy People". We use data, logical thinking, and informal peer review. If you disagree with us, our first question will be "what's wrong with mental health?"
handoflixue•9h ago
So... Psychiatry? Do you think psychiatrists are particularly prone to starting cults? Do you think learning about psychiatry makes you at risk for cult-like behavior?
nyeah•9h ago
No. I have no beef with psychology or psychiatry. They're doing good work as far as I can tell. I am poking fun at people who take "rationality" and turn it into a brand name.
handoflixue•9h ago
Why is "you can work to avoid cognitive biases" more ridiculous than "you can work to improve your mental health"?
nyeah•9h ago
I'm feeling a little frustrated by the derail. My complaint is about a small group claiming to have a monopoly on a normal human faculty, in this case rationality. The small group might well go on to claim that people outside the group lack rationality. That would be absurd. The mental health profession does not claim to be immune from mental illness, it does not claim that people outside its circle are mentally unhealthy, and it does not claim that its particular treatment is necessary for mental health.

I guess it's possible you might be doing some deep ironic thing by providing a seemingly sincere example of what I'm complaining about. If so it was over my head but in that case I withdraw "derail"!

glenstein•9h ago
Right, and to your point, I would say you can distinguish (1) "objective" in the sense of relying on mind-independent data from (2) absolute knowledge, which treats subjects like closed conversations. And you can make similar caveats for "rational".

You can be rational and objective about a given topic without it meaning that the conversation is closed, or that all knowledge has been found. So I'm certainly not a fan of cult dynamics, but I think it's easy to throw an unfair charge at these groups, that their interest in the topic necessitates an absolutist disposition.

ameliaquining•10h ago
IIUC the name in its current sense was sort of an accident. Yudkowsky originally used the term to mean "someone who succeeds at thinking and acting rationally" (so "correctism" or "winsargumentism" would have been about equally good), and then talked about the idea of "aspiring rationalists" as a community narrowly focused on developing a sort of engineering discipline that would study the scientific principles of how to be right in full generality and put them into practice. Then the community grew and mutated into a broader social milieu that was only sort of about that, and people needed a name for it, and "rationalists" was already there, so that became the name through common usage. It definitely has certain awkwardnesses.
SilverElfin•5h ago
What do you make of the word “science” then?
mlinhares•10h ago
> One is Black Lotus, a Burning Man camp led by alleged rapist Brent Dill, which developed a metaphysical system based on the tabletop roleplaying game Mage the Ascension.

What the actual f. This is such an insane thing to read, and to understand what it means, that I might need to go and sit in silence for the rest of the day.

How did we get to this place with people going completely nuts like this?

optimalsolver•10h ago
astronauts_meme.jpg
linohh•10h ago
Running a cult is a somewhat reliable source of narcissistic supply. The internet tells you how to do it. So an increasing number of people do it.
TimorousBestie•10h ago
Mage: The Ascension is basically a delusions of grandeur simulator, so I can see how an already unstable personality might get attached to it and become more unstable.
mlinhares•10h ago
I don't know, I'd understand something like Wraith (where I did see people develop issues; the shadow mechanic is such a terrible thing), but Mage is so, like, straightforward?

Use your mind to control reality, reality fights back with Paradox. It's cool for a teenager, but read a bit more fantasy and you'll definitely find cooler stuff. But I guess for you to join a cult your mind must stay a teen mind forever.

wavefunction•10h ago
How many adults actually abandon juvenilia as they age? Not the majority in my observation, and it's not always a bad thing when it's only applied to subjects like pop culture. Applied juvenilia in response to serious subjects is a more serious issue.
mlinhares•10h ago
There has to be a cult of people who believe they're vampires, respecting the Masquerade and serving some antediluvian somewhere; Vampire was much more mainstream than Mage.
DonHopkins•10h ago
There are post-pubescent males who haven't abandoned Atlas Shrugged posting to this very web site!
WJW•10h ago
I didn't originally write this, but can't find the original place I read it anymore. I think it makes a lot of sense to repost it here:

All of the World Of Darkness and Chronicles Of Darkness games are basically about coming of age/puberty. Like X-Men but for Goth-Nerds instead of Geek-Nerds.

In Vampire, your body is going through weird changes and you are starting to develop, physically and/or mentally, while realising that the world is run by a bunch of old, evil fools who still expect you to toe the line and stay in your place, but you are starting to wonder if the world wouldn't be better if your generation overthrew them and took over running the world, doing it the right way. And there are all these bad elements trying to convince you that you should do just that, but for the sake of mindless violence and raucous partying. Teenager - the rpg.

In Werewolf, your body is going through weird changes and you are starting to develop, physically and mentally, while realising that you are not a part of the "normal" crowd that the rest of Humanity belongs to. You are different and they just can't handle that whenever it gets revealed. Luckily, there are small communities of people like you out there who take you in and show you how to use the power of your "true" self. Of course, even among this community, there are different types of other. LGBT Teenager - the RPG

In Mage, you have begun to take an interest in the real world, and you think you know what the world is really like. The people all around you are just sleep-walking through life, because they don't really get it. This understanding sets you against the people who run the world: the governments and the corporations, trying to stop these sleepers from waking up to the truth and rejecting their comforting lies. You have found some other people who saw through them, and you think they've got a lot of things wrong, but at least they're awake to the lies! Rebellious Teenager - the RPG

reactordev•10h ago
I think I read it too, it’s called Twilight. /s

I had friends who were into Vampire growing up. I hadn’t heard of Werewolf until after the aforementioned book came out and people started going nuts for it. I mentioned to my wife at the time that there was this game called “Vampire” and told her about it and she just laughed, pointed to the book, and said “this is so much better”. :shrug:

Rewind back and there were the Star Wars kids. Fast forward and there are the Harry Potter kids/adults. Each generation has their own “thing”. During that time, it was Quake MSDOS and Vampire. Oh and we started Senior Assassinations. 90s super soakers were the real deal.

abullinan•10h ago
“The people all around you are just sleep-walking through life, because they don't really get it.”

Twist: we’re sleepwalking through life because we really DO get it.

(Source: I’m 56)

mlinhares•9h ago
This tracks, but I'd say Werewolf goes beyond LGBT folks; the violence there also fits boys' aggressive play, and the saving-the-world theme resonated a lot with the basic "I want to be important/a hero" thing. It's my favorite of all the World of Darkness books; I regret not getting the Kickstarter edition :(
michaeldoron•7h ago
Yeah, I would say Werewolf is more like Social Activist: The Rage simulator than LGBT teenager
JTbane•10h ago
I don't know how you can call yourself a "rationalist" and base your worldview on a fantasy game.
reactordev•10h ago
Rationalizing the fantasy. Like LARPing. Only you lack weapons, armor, magic missiles…
hungryhobbit•10h ago
Mage is an interesting game though: it's fantasy, but not "swords and dragons" fantasy. It's set in the real world, and the "magic" is just the "mage" shifting probabilities so that unlikely (but possible) things occur.

Such a setting would seem like the perfect backdrop for a cult that claims "we have the power to subtly influence reality and make improbable things (i.e. "magic") occur".

empath75•9h ago
Most "rationalists" throughout history have been very deeply religious people. Secular enlightenment-era rationalism is not the only direction you can go with it. It depends very much, as others have said, on what your axioms are.

But, fwiw, that particular role-playing game was very much based on occult beliefs that were trendy at the time, like chaos magic, so it's not completely off the wall.

vannevar•9h ago
"Rationalist" in this context does not mean "rational person," but rather "person who rationalizes."
ponector•9h ago
In my experience, religious people are perfectly fine with a contradictory worldview.

Christians, for example, have always been very flexible about following the 10 commandments.

BalinKing•8h ago
That example isn’t a contradictory worldview though, just “people being people, and therefore failing to be as good as the ideal they claim to strive for.”
scns•4h ago
Being fine with cognitive dissonance is a prerequisite for holding religious beliefs, I'd say.
prepend•7h ago
I mean, is it a really good game?

I’ve never played, but now I’m kind of interested.

nemomarx•5h ago
It's reportedly alright - the resolution mechanic seems a little fiddly with varying pools of dice for everything. The lore is pretty interesting though and I think a lot of the point of that series of games was reading up on that.
SirFatty•10h ago
Came to ask a similar question, but also: has it always been like this? Is the difference that these fringe people/groups simply had no visibility before the internet?

It's nuts.

lazide•10h ago
Have you heard of Heaven's Gate? [https://en.m.wikipedia.org/wiki/Heaven%27s_Gate_(religious_g...].

There are at least a dozen I can think of, including the ‘drink the koolaid’ Jonestown massacre.

People be crazy, yo.

SirFatty•10h ago
Of course: Jim Jones, L. Ron Hubbard, David Koresh. I realize there have always been people who are cuckoo for Cocoa Puffs. But have there ever been as many as there appear to be now?
tuesdaynight•9h ago
The internet made it possible to see global news all the time. I think there have always been a lot of people with very crazy and extremist views, but we only knew about the ones closer to us. Now it's possible to know about crazy people from the other side of the planet, so it looks like there are a lot more of them than before.
lazide•9h ago
Yup. Like previously, westerners could have gone their whole lives with no clue the Hindutva existed [https://en.m.wikipedia.org/wiki/Hindutva] - Hindu Nazis, basically. Which, if you know Hinduism at all, is a bit like saying Buddhist Nazis. Say what?

Which actually kinda existed/exists too? [https://en.m.wikipedia.org/wiki/Nichirenism], right down to an attempted coup and a bunch of assassinations [https://en.m.wikipedia.org/wiki/League_of_Blood_Incident].

Now you know. People be whack.

geoffeg•9h ago
Just a note that the Heaven's Gate website is still up. It's a wonderful snapshot of 90s web design. https://www.heavensgate.com/
ants_everywhere•8h ago
what a wild set of SEO keywords

> Heaven's Gate Heaven's Gate Heaven's Gate Heaven's Gate Heaven's Gate Heaven's Gate Heaven's Gate Heaven's Gate ufo ufo ufo ufo ufo ufo ufo ufo ufo ufo ufo ufo space alien space alien space alien space alien space alien space alien space alien space alien space alien space alien space alien space alien extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial extraterrestrial misinformation misinformation misinformation misinformation misinformation misinformation misinformation misinformation misinformation misinformation misinformation misinformation freedom freedom freedom freedom freedom freedom freedom freedom freedom freedom freedom freedom second coming second coming second coming second coming second coming second coming second coming second coming second coming second coming angels angels angels angels angels angels angels angels angels angels end end times times end times end times end times end times end times end times end times end times end times Key Words: (for search engines) 144,000, Abductees, Agnostic, Alien, Allah, Alternative, Angels, Antichrist, Apocalypse, Armageddon, Ascension, Atheist, Awakening, Away Team, Beyond Human, Blasphemy, Boddhisattva, Book of Revelation, Buddha, Channeling, Children of God, Christ, Christ's Teachings, Consciousness, Contactees, Corruption, Creation, Death, Discarnate, Discarnates, Disciple, Disciples, Disinformation, Dying, Ecumenical, End of the Age, End of the World, Eternal Life, Eunuch, Evolution, Evolutionary, Extraterrestrial, Freedom, Fulfilling Prophecy, Genderless, Glorified Body, God, God's Children, God's Chosen, God's Heaven, God's Laws, God's Son, Guru, Harvest Time, He's Back, Heaven, Heaven's Gate, Heavenly Kingdom, Higher Consciousness, His Church, Human Metamorphosis, Human Spirit, Implant, Incarnation, Interfaith, Jesus, Jesus' Return, Jesus' Teaching, Kingdom of God, Kingdom of Heaven, Krishna Consciousness, Lamb of God, Last Days, Level Above Human, Life After Death, Luciferian, Luciferians, Meditation, Members of the Next Level, Messiah, Metamorphosis, Metaphysical, Millennium, Misinformation, Mothership, Mystic, Next Level, Non Perishable, Non Temporal, Older Member, Our Lords Return, Out of Body Experience, Overcomers, Overcoming, Past Lives, Prophecy, Prophecy Fulfillment, Rapture, Reactive Mind, Recycling the Planet, Reincarnation, Religion, Resurrection, Revelations, Saved, Second Coming, Soul, Space Alien, Spacecraft, Spirit, Spirit Filled, Spirit Guide, Spiritual, Spiritual Awakening, Star People, Super Natural, Telepathy, The Remnant, The Two, Theosophy, Ti and Do, Truth, Two Witnesses, UFO, Virginity, Walk-ins, Yahweh, Yeshua, Yoda, Yoga,

lazide•5h ago
It’s the aliens to yoga ratio that really gets me. Yogis got really shortchanged here.
jameslk•4h ago
I was curious who's keeping that website alive, and allegedly it's two former members of the cult: Mark and Sarah King

https://www.vice.com/en/article/a-suicide-cults-surviving-me...

reactordev•10h ago
It’s always been like this. Have you read the Bible? Or the Koran? It’s insane. Ours is just our flavor of crazy. Every generation has some. When you dig at it, there’s always a religion.
mlinhares•10h ago
Mage is a game for teenagers; it doesn't try to be anything other than a game where you roll dice to do stuff.
reactordev•10h ago
Mage yea, but the cult? Where do you roll for crazy? Is it a save against perception? Constitution? Or intelligence check?

I know the church of Scientology wants you to crit that roll of tithing.

mlinhares•9h ago
> I know the church of Scientology wants you to crit that roll of tithing.

I shouldn't LOL at this but I must. We're all gonna die in these terrible times but at least we'll LOL at the madness and stupidity of it all.

reactordev•9h ago
Like all tragedies, there’s comedy there somewhere. Sometimes you have to be it.
zzzeek•8h ago
yeah, people should understand: what is Scientology based on? The E-Meter, which is some kind of cheap-shit Radio Shack lie detector thing. I'm quite sure LLMs would do very well if given the task of spitting out new cult doctrines, and I would gather we are only years away from cults based on LLM-generated content (if not already).
bitwize•8h ago
Terry Davis, a cult of one, believed God spoke to him through his computer's RNG. So... yeah.
reactordev•7h ago
If only he installed Dwarf Fortress where he could become one.
notahacker•6h ago
tbf Helter Skelter was a song about a fairground ride that didn't really pretend to be much more than an excuse for Paul McCartney to write something loud, but that didn't stop a sufficiently manipulative individual turning it into a reason why the Family should murder people. And he didn't even need the internet to help him find followers.
startupsfail•10h ago
It used to always be religion. But now the downsides are well understood, and alternatives that can fill the same need (social activities) - like gathering with your neighbors, singing, performing arts, clubs, parks and parties - are available and great.
reactordev•9h ago
I can see that. There’s definitely a reason they keep pumping out Call of Duty’s and Madden’s.
Mountain_Skies•8h ago
Religions have multitudes of problems, but suicide rates amongst atheists are higher than you'd think they would be. It seems like for many, rejection of organized religion leads to adoption of ad hoc quasi-religions with no mooring to them, leaving the person who is seeking a solid belief system drifting until they find a cult, give in to madness that causes self-harm, or adopt their own system of belief that they then need to vigorously protect from other beliefs.

Some percentage of the population has a lesser need for a belief system (supernatural, ad hoc, or anything else) but in general, most humans appear to be hardcoded for this need and the overlap doesn't align strictly with atheism. For the atheist with a deep need for something to believe in, the results can be ugly. Though far from perfect, organized religions tend to weed out their most destructive beliefs or end up getting squashed by adherents of other belief systems that are less internally destructive.

reactordev•6h ago
Nothing to do with religion and everything to do with the support networks that churches and those groups provide. Synagogue, church, camp, retreat: a place of belonging.

Atheists tend to not have those consistently and must build their own.

saghm•9h ago
Without speaking for religions I'm not familiar with, I grew up Catholic, and one of the most important Catholic beliefs is that during Mass, the bread (i.e. "communion" wafers) and wine quite literally transform into the body and blood of Jesus, and that eating and drinking it is a necessary ritual to get into heaven[1], which was a source of controversy even back as far as the Protestant Reformation, with some sects retaining that doctrine and others abandoning it. In a lot of ways, what's considered "normal" or "crazy" in a religion comes to what you're familiar with.

For those not familiar enough with the bible to know where to find the wild stuff, look up the story of Elisha summoning bears out of the forest to maul children for calling him bald, or the last two chapters of Daniel (which I think are only in the Catholic bible) where he literally blows up a dragon by feeding it a cake.

[1]: https://en.wikipedia.org/wiki/Real_presence_of_Christ_in_the...

robertlagrant•9h ago
Yes, Catholicism has definitely accumulated some cruft :)
tialaramex•9h ago
Yeah "Transubstantiation" is another technical term people might want to look at in this topic. The art piece "An Oak Tree" is a comment on these ideas. It's a glass of water. But, the artist's notes for this work insist it is an oak tree.
petralithic•7h ago
Someone else who knows "An Oak Tree"! It is one of my favorite pieces, because it makes the primary way of seeing the world not reality itself, but the belief of what reality could be.
scns•4h ago
Interesting that you bring art into the discussion. I've often thought that some "artists" have a lot in common with cult leaders. My definition of art would be that it is immediately understood, zero explanation needed.
tialaramex•2h ago
I definitely can't get behind that definition. The one I've used for a good while is: The unnecessary done on purpose.

Take Barbara Hepworth's "Two Figures" a sculpture which is just sat there on the campus where I studied for many years (and where I also happen to work today). What's going on there? I'm not sure.

Sculpture of ideals I get. Liberty, stood on her island, Justice (with or without her blindfold, but always carrying not only the scales but also a sword†). I used to spend a lot of time in the hall where "The Meeting Place" is. They're not specific people, they're an idea, they're the spirit of the purpose of this place (a railway station, in fact a major international terminus). That's immediately understood, yeah.

But I did not receive an immediate understanding of "Two figures". It's an interesting piece. I still occasionally stop and look at it as I walk across the campus, but I couldn't summarise it in a sentence even now.

† when you look at that cartoon of the GOP operatives with their hands over Justice's mouth, keep in mind that out of shot she has a sword. Nobody gets out of here alive.

o11c•8h ago
The "bears" story reads a lot more sensibly if you translated it correctly as "a gang of thugs tries to bully Elisha into killing himself." Still reliant on the supernatural, but what do you expect from such a book?
michaeldoron•7h ago
Where do you see that in the text? I am looking at the Hebrew script, and the text only reads that as Elisha went up a path, young lads left the city and mocked him by saying "get up baldy", and he turned to them and cursed them to be killed by two she bears. I don't think saying "get up baldy" to a guy walking up a hill constitutes bullying him into killing himself.
reactordev•7h ago
Never underestimate the power of words. Kids have unalived themselves over it.

I think the true meaning has been lost to time. The Hebrew text has been translated and rewritten so many times it’s a children’s book. The original texts of the Dead Sea scrolls are bits and pieces of that long lost story. All we have left are the transliterations of transliterations.

o11c•4h ago
It's called context. The beginning of the chapter is Elijah (Elisha's master) being removed from Earth and going up (using the exact same Hebrew word) to Heaven. Considering that the thugs are clearly not pious people, "remove yourself from the world, like your master did" has only one viable interpretation.

As for my choice of the word "thugs" ("mob" would be another good word), that is necessary to preserve the connotation. Remember, there were 42 of them punished, possibly more escaped - this is a threatening crowd size (remember the duck/horse meme?). Their claimed youth does imply "not an established veteran of the major annual wars", but that's not the same as "not acquainted with violence".

cjameskeller•8h ago
To be fair, the description of the dragon incident is pretty mundane, and all he does is prove that the large reptile they had previously been feeding (& worshiping) could be killed:

"Then Daniel took pitch, and fat, and hair, and did seethe them together, and made lumps thereof: this he put in the dragon's mouth, and so the dragon burst in sunder: and Daniel said, Lo, these are the gods ye worship."

saghm•6h ago
I don't think it's mundane to cause something to "burst in sunder" by putting some pitch, fat, and hair in its mouth.
neaden•4h ago
The story is pretty clearly meant to indicate that the Babylonians were worshiping an animal though. The theology of the book of Daniel emphasises that the Gods of the Babylonians don't exist, this story happens around the same time Daniel proves the priests had a secret passage they were using to get the food offered to Bel and eat it at night while pretending that Bel was eating it. Or when Daniel talks to King Belshazzar and says "You have praised the gods of silver and gold, of bronze, iron, wood, and stone, which do not see or hear or know, but the God in whose power is your very breath and to whom belong all your ways, you have not honored". This is not to argue for the historical accuracy of the stories, just that the point is that Daniel is acting as a debunker of the Babylonian beliefs in these stories while asserting the supremacy of the Israelite beliefs.
genghisjahn•7h ago
I've recently started attending an Episcopal church. We have some people who lean heavy on transubstantiation, but our priest says, "look, something holy happens during communion, exactly what, we don't know."

See also: https://www.episcopalchurch.org/glossary/real-presence/?

"Belief in the real presence does not imply a claim to know how Christ is present in the eucharistic elements. Belief in the real presence does not imply belief that the consecrated eucharistic elements cease to be bread and wine."

reactordev•7h ago
Same could be said for bowel movements too though.

There’s a fine line between suspension of disbelief and righteousness. All it takes is for one to believe their own delusion.

glenstein•10h ago
I personally (for better or worse) became familiar with Ayn Rand as a teenager, and I think Objectivism as a kind of extended Ayn Rand social circle and set of organizations has faced the charge of cultish-ness, and that dates back to, I want to say, the 70s and 80s at least. I know Rand wrote much earlier than that, but I think the social and organizational dynamics unfolded rather late in her career.
hexis•9h ago
Albert Ellis wrote a book, "Is Objectivism a Religion" as far back as 1968. Murray Rothbard wrote "Mozart Was a Red", a play satirizing Rand's circle, in the early 60's. Ayn Rand was calling her own circle of friends, in "jest", "The Collective" in the 50's. The dynamics were there from almost the beginning.
ryandrake•9h ago
“There are two novels that can change a bookish fourteen-year old’s life: The Lord of the Rings and Atlas Shrugged. One is a childish fantasy that often engenders a lifelong obsession with its unbelievable heroes, leading to an emotionally stunted, socially crippled adulthood, unable to deal with the real world. The other, of course, involves orcs."

https://www.goodreads.com/quotes/366635-there-are-two-novels...

cogman10•9h ago
I think it's pretty similar dynamics. It's unquestioned premises (dogma) which are supposed to be accepted simply because this is "objectivism" or "rationalism".

Very similar to my childhood religion. "We have figured everything out and everyone else is wrong for not figuring things out".

Rationalism seems like a giant castle built on sand. They just keep accruing premises without ever going backwards to see if those premises make sense. A good example of this is their notions of "information hazards".

afpx•8h ago
Her books were very popular with the gifted kids I hung out with in the late 80s. Cool kids would carry around hardback copies of Atlas Shrugged, impressive in sheer physical size and art deco cover. How did that trend begin?
prepend•7h ago
People reading the book and being into it and telling other people.

It’s also a hard book to read so it may be smart kids trying to signal being smart.

jacquesm•7h ago
The only thing that makes it hard to read is the incessant soap-boxing by random characters. I have a rule that if I start a book I finish it but that one had me tempted.
mikestew•5h ago
I’m convinced that even Rand’s editor didn’t finish the book. That is why Galt’s soliloquy is ninety friggin’ pages long. (When in reality, three minutes in and people would be unplugging their radios.)
meheleventyone•7h ago
It’s hard to read because it’s tedious not because you need to be smart though.
notahacker•5h ago
tbf you have to have read it to know that!

I can't help but think it's probably the "favourite book" of a lot of people who haven't finished it though, possibly to a greater extent than any other secular tome (at least LOTR's lightweight fans watched the movies!).

I mean, if you've only read the blurb on the back it's the perfect book to signal your belief in free markets, conservative values and the American Dream: what could be a more strident defence of your views than a book about capitalists going on strike to prove how much the world really needs them?! If you read the first few pages, it's satisfyingly pro-industry and contemptuous of liberal archetypes. If you trudge through the whole thing, it's not only tedious and odd but contains whole subplots devoted to dumping on core conservative values (religion bad, military bad, marriage vows not that important really, and a rather jaded take on actually existing capitalism), in between the philosopher pirates and the jarring absence of private transport, and the resolution is an odd combination of a handful of geniuses running away to form a commune and the world being saved by a multi-hour speech about philosophy which has surprisingly little to say on market economics...

mikestew•5h ago
> at least LOTR's lightweight fans watched the movies!

Oh, there’s movies for lazy Rand fans, too.

https://www.imdb.com/title/tt0480239/

More of a Fountainhead fan, are you? Do ya like Gary Cooper and Patricia Neal?

https://www.imdb.com/title/tt0041386/?ref_=ext_shr_lnk

notahacker•4h ago
> Oh, there’s movies for lazy Rand fans, too.

tbf that comment was about 50% a joke about their poor performance at the box office :D

mikestew•4h ago
Rereading your comment, that’s my woosh moment for the day, I guess. :-)

Though a Gary Cooper The Fountainhead does tempt me on occasion. (Unlike Atlas Shrugged, The Fountainhead wasn’t horrible, but still some pretty poor writing.)

CalChris•4h ago
Fountainhead is written at the 7th grade reading level. Its Lexile level is 780L. It's long and that's about it. By comparison, 1984 is 1090L.
jacquesm•7h ago
By setting up the misfits in a revenge of the nerds scenario?

Ira Levin did a much better job of it and showed what it would lead to but his 'This Perfect Day' did not - predictably - get the same kind of reception as Atlas Shrugged did.

spacechild1•7h ago
What's funny is that Robert Anton Wilson and Robert Shea had already taken the piss out of Ayn Rand in Illuminatus! (1969-1971).
rglover•9h ago
> Came to ask a similar question, but also has it always been like this?

Crazy people (and cults especially) have always existed, but I'd argue recruitment numbers are through the roof thanks to technology and a failing economic environment (instability makes people rationalize crazy behavior).

It's not that those groups didn't have visibility before, it's just easier for the people who share the same...interests...to cloister together on an international scale.

jacquesm•7h ago
It's no more crazy than a virgin conception. And yet, here we are. A good chunk of the planet believes that drivel, but they'd throw their own daughters out of the house if they made the same claim.
davorak•10h ago
Makes me think of that saying that great artists steal, repurposed for cult founders: "Good cult founders copy, great cult founders steal."

I do not think this cult's dogma is any more out there than other cult dogma I have heard, but the above quote makes me think it is in some ways easier to found cults in the modern day, since you can steal complex worldbuilding from numerous sources rather than building it yourself and keeping everything straight.

TrackerFF•10h ago
Cult leaders tend to be narcissists.

Narcissists tend to believe that they are always right, no matter what the topic is or how knowledgeable they are. This makes them speak with confidence and conviction.

Some people are very drawn to confident people.

If the cult leader has other mental health issues, it can/will seep into their rhetoric. Combine that with unwavering support from loyal followers that will take everything they say as gospel...

That's about it.

TheOtherHobbes•9h ago
That's pretty much it. The beliefs are just a cover story.

Outside of those, the cult dynamics are cut-paste, and always involve an entitled narcissistic cult leader acquiring as much attention/praise, sex, money, and power as possible from the abuse and exploitation of followers.

Most religion works like this. Most alternative spirituality works like this. Most finance works like this. Most corporate culture works like this. Most politics works like this.

Most science works like this. (It shouldn't, but the number of abused and exploited PhD students and post-docs is very much not zero.)

The only variables are the differing proportions of attention/praise, sex, money, and power available to leaders, and the amount of abuse that can be delivered to those lower down and/or outside the hierarchy.

The hierarchy and the realities of exploitation and abuse are a constant.

If you removed this dynamic from contemporary culture there wouldn't be a lot left.

Fortunately quite a lot of good things happen in spite of it. But a lot more would happen if it wasn't foundational.

vannevar•9h ago
Yes. The cult's "beliefs" really boil down to one belief: the infallibility of the leader. Much of the attraction is in the simplicity.
patrickmay•7h ago
If what you say is true, we're very lucky no one like that with a massive following has ever gotten into politics in the United States. It would be an ongoing disaster!
piva00•10h ago
I've met a fair share of people in the burner community; the vast majority are lovely folks who really enjoy the process of bringing some weird big idea into reality, working hard on the builds, learning stuff, and having a good time with others for months to showcase their creations at some event.

On the other hand, there's a whole other side: a few nutjobs who really behave like cult leaders. They believe their own bullshit and over time manage to find a lot of "followers" in this community; since one of its foundational aspects is radical acceptance, it becomes very easy to be nutty and go unquestioned (unless you do something egregious).

greenavocado•10h ago
Humans are compelled to find agency and narrative in chaos. Evolution favored those who assumed the rustle was a predator, not the wind. In a post-Enlightenment world where traditional religion often fails (or is rejected), this drive doesn't vanish. We don't stop seeking meaning. We seek new frameworks. Our survival depended on group cohesion. Ostracism meant death. Cults exploit this primal terror. Burning Man's temporary city intensifies this: extreme environment, sensory overload, forced vulnerability. A camp like Black Lotus offers immediate, intense belonging. A tribe with shared secrets (the "Ascension" framework), rituals, and an "us vs. the sleepers" mentality. This isn't just social; it's neurochemical. Oxytocin (bonding) and cortisol (stress from the environment) flood the system, creating powerful, addictive bonds that override critical thought.

Human brains are lazy Bayesian engines. In uncertainty, we grasp for simple, all-explaining models (heuristics). Mage provides this: a complete ontology where magic equals psychology/quantum woo, reality is malleable, and the camp leaders are the enlightened "tradition." This offers relief from the exhausting ambiguity of real life. Dill didn't invent this; he plugged into the ancient human craving for a map that makes the world feel navigable and controllable. The "rationalist" veneer is pure camouflage. It feels like critical thinking but is actually pseudo-intellectual cargo culting. This isn't Burning Man's fault. It's the latest step of a 2,500-year-old playbook. The Gnostics and the Hermeticists provided ancient frameworks where secret knowledge ("gnosis") granted power over reality, accessible only through a guru. Mage directly borrows from this lineage (The Technocracy, The Traditions). Dill positioned himself as the modern "Ascended Master" dispensing this gnosis.

The 20th century cults Synanon, EST, Moonies, NXIVM all followed similar patterns, starting with isolation. Burning Man's temporary city is the perfect isolation chamber. It's physically remote, temporally bounded (a "liminal space"), fostering dependence on the camp. Initial overwhelming acceptance and belonging (the "Burning Man hug"), then slowly increasing demands (time, money, emotional disclosure, sexual access), framed as "spiritual growth" or "breaking through barriers" (directly lifted from Mage's "Paradigm Shifts" and "Quintessence"). Control language ("sleeper," "muggle," "Awakened"), redefining reality ("that rape wasn't really rape, it was a necessary 'Paradox' to break your illusions"), demanding confession of "sins" (past traumas, doubts), creating dependency on the leader for "truth."

Burning Man attracts people seeking transformation, often carrying unresolved pain. Cults prey on this vulnerability. Dill allegedly targeted individuals with trauma histories. Trauma creates cognitive dissonance and a desperate need for resolution. The cult's narrative (Mage's framework + Dill's interpretation) offers a simple explanation for their pain ("you're unAwakened," "you have Paradox blocking you") and a path out ("submit to me, undergo these rituals"). This isn't therapy; it's trauma bonding weaponized. The alleged rape wasn't an aberration; it was likely part of the control mechanism. It's a "shock" to induce dependency and reframe the victim's reality ("this pain is necessary enlightenment"). People are adrift in ontological insecurity (fear about the fundamental nature of reality and self). Mage offers a new grand narrative with clear heroes (Awakened), villains (sleepers, Technocracy), and a path (Ascension).

photios•6h ago
Gnosticism... generating dumb cults that seem smart on the outside for 2+ thousand years. Likely to keep it up for 2k more.
gedy•10h ago
Paraphrasing someone I don't recall - when people believe in nothing, they'll believe anything.
collingreen•9h ago
And therefore you should believe in me and my low low 10% tithe! That's the only way to not get tricked into believing something wrong so don't delay!
gedy•1h ago
That's not an endorsement of a particular religion.
lmm•22m ago
It is though. In practice it's always used to promote a particular religion.
pstuart•10h ago
People are wired to worship, and want somebody in charge telling them what to do.

I'm a staunch atheist and I feel the pull all the time.

hn_acc1•40m ago
I slowly deconverted from being raised evangelical / fundamentalist into being an atheist in my late 40s. I still "pray" at times just to (mentally) shout my frustration at the sorry state of the world at SOMETHING (even nothing) rather than constantly yelling my frustration at my family.

I may have actually been less anxious about the state of the world back then, and may have remained so, if I'd just continued to ignore all those little contradictions that I just couldn't ignore anymore... But I feel SO MUCH less personal guilt about being "human".

Nihilartikel•10h ago
I'm entertaining sending my kiddo to a Waldorf School, because it genuinely seems pretty good.

But looking into the underlying Western Esoteric Spirit Science, 'Anthroposophy' (because Theosophy wouldn't let him get weird enough), by Rudolf Steiner has been quite a ride. The point being that humans have a pretty endless capacity to go ALL IN on REALLY WEIRD shit, as long as it promises to fix their lives if they do everything they're told. Naturally, if their lives aren't fixed, then they did it wrong or have karmic debt to pay down, so YMMV.

In any case, I'm considering the latent woo-cult atmosphere as a test of the skeptical inoculation that I've tried to raise my child with.

BryantD•9h ago
I went to a Waldorf school and I’d recommend being really wary. The woo is sort of background noise, and if you’ve raised your kid well they’ll be fine. But the quality of the academics may not be good at all. For example, when I was ready for calculus my school didn’t have anyone who knew how to teach it so they stuck me and the other bright kid in a classroom with a textbook and told us to figure it out. As a side effect of not being challenged, I didn’t have good study habits going into college, which hurt me a lot.

If you’re talking about grade school, interview whoever is gonna be your kid’s teacher for the next X years and make sure they seem sane. If you’re talking about high school, give a really critical look at the class schedule.

Waldorf schools can vary a lot in this regard so you may not encounter the same problems I did, but it’s good to be cautious.

linohh•6h ago
Don't do it. It's a place whose culture enables child abuse. These people are serious wackos and you should not put your kid into their hands. A lot of people come out of that Steiner Shitbox traumatized for decades if not for life. They should not be allowed to run schools to begin with. They check a lot of boxes, from antivax to whatever the fuck their lore has to offer starting with a z.
namuol•10h ago
> How did we get to this place with people going completely nuts like this?

Ayahuasca?

yamazakiwi•5h ago
Nah I did Ayahuasca and I'm an empathetic person who most would consider normal or at least well-adjusted. If it's drug related it would most definitely be something else.

I’m inclined to believe your upbringing plays a much larger role.

rglover•9h ago
I came to comments first. Thank you for sharing this quote. Gave me a solid chuckle.

I think people are going nuts because we've drifted from the dock of a stable civilization. Institutions are a mess. The economy is a mess. Combine all of that with the advent of social media making the creation of echo chambers (and the inevitable narcissism of "leaders" in those echo chambers) effortless, and ~15 years later we have this.

staunton•7h ago
People have been going nuts all throughout recorded history, that's really nothing new.

The only scary thing is that they have ever more power to change the world and influence others without being forced to grapple with that responsibility...

eli_gottlieb•9h ago
Who the fuck bases a Black Lotus cult on Mage: the Ascension rather than Magic: the Gathering? Is this just a mistake by the journalist?
kiitos•6h ago
i regret that i have but one upvote to give
AnimalMuppet•8h ago
From false premises you can logically and rationally reach really wrong conclusions. If you have too much pride in your rationality, you may not be willing to say "I seem to have reached a really insane conclusion; maybe my premises are wrong." That is, the more you pride yourself on your rationalism, the more prone you may be to accepting a conclusion that is bogus because the premises are wrong.
DangitBobby•8h ago
Then again, most people tend to form really bogus beliefs without bothering to establish any premises. They may not even be internally consistent or align meaningfully with reality. I imagine having premises and thinking it through has a better track record of reaching conclusions consistent with reality.
lmm•37m ago
> I imagine having premises and thinking it through has a better track record of reaching conclusions consistent with reality.

Why do you imagine that? Have you tested it?

bitwize•7h ago
It's been like this a while. Have you heard the tale of the Final Fantasy House?: http://www.demon-sushi.com/warning/

https://www.vice.com/en/article/the-tale-of-the-final-fantas...

egypturnash•7h ago
I've always been under the impression that M:tA's rules of How Magic Works are inspired by actual mystical beliefs that people have practiced for centuries. It's probably about as much of a manual for mystical development as the GURPS Cyberpunk rulebook was for cybercrime, but it's pointing at something that already exists and saying "this is a thing we are going to tell an exaggerated story about".

See for example "Reality Distortion Field": https://en.wikipedia.org/wiki/Reality_distortion_field

GeoAtreides•5h ago
>How did we get to this place with people going completely nuts like this?

God died and it's been rough going since then.

thrance•10h ago
Reminds me somewhat of the Culte de la Raison (Cult of Reason) birthed by the French Revolution. It didn't last long.

https://en.wikipedia.org/wiki/Cult_of_Reason

amiga386•10h ago
See also Rational Magic: Why a Silicon Valley culture that was once obsessed with reason is going woo (2023)

https://www.thenewatlantis.com/publications/rational-magic

and its discussion on HN: https://news.ycombinator.com/item?id=35961817

gjsman-1000•10h ago
It’s especially popular in Silicon Valley.

Quite possibly, places like Reddit and Hacker News are training for the required level of intellectual smugness, and the certitude that you can dismiss every annoying argument with a logical fallacy.

That sounds smug of me, but I’m actually serious. One of their defects is that once you memorize all the fallacies (“Appeal to authority,” “Ad hominem”), you can easily reach the point where you more readily recognize the fallacies in everyone else’s arguments than in your own. You more easily doubt other people’s cited authorities than your own. You slap “appeal to authority” on a disliked opinion, while citing an authority next week for your own. It’s a fast path from there to perceived intellectual superiority, and an even faster path from there into delusion. Rational delusion.

shadowgovt•10h ago
It's generally worth remembering that some of the fallacies are actually structural, and some are rhetorical.

A contradiction creates a structural fallacy; if you find one, it's a fair belief that at least one of the supporting claims is false. In contrast, appeal to authority is probabilistic: we don't know, given the current context, if the authority is right, so they might be wrong... But we don't have time to read the universe into this situation so an appeal to authority is better than nothing.

... and this observation should be coupled with the observation that the school of rhetoric wasn't teaching a method for finding truth; it was teaching a method for beating an opponent in a legal argument. "Appeal to authority is a logical fallacy" is a great sword to bring to bear if your goal is to turn off the audience's ability to ask whether we should give the word of the environmental scientist and the washed-up TV actor equal weight on the topic of environmental science...

gjsman-1000•9h ago
… however, even that is up for debate. Maybe the TV actor in your example is Al Gore filming An Inconvenient Truth, and the environmental scientist is in the minority that isn't so afraid of climate change. Fast forward to 2025: the scientist's minority position was wrong, while Al Gore's documentary was legally ruled to contain 9 major errors; so you were wrong on both sides, with the TV actor being closer.
shadowgovt•7h ago
True, but this is where the Boolean nature of traditional logic can really trip up a person trying to operate in the real world.

These "maybes" are on the table. They are probably not the case.

(You end up with a spread of likelihoods and have to decide what to do with them. And law hates a spread of likelihoods and hates decision-by-coinflips, so one can see how rhetorical traditions grounded in legal persuasion tend towards encouraging Boolean outcomes; you can't find someone "a little guilty," at least not in the Western tradition of justice).

sunshowers•9h ago
While deployment of logical fallacies to win arguments is annoying at best, the far bigger problem is that people make those fallacies in the first place — such as not considering base rates.
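A minimal sketch of the base-rate point, with purely hypothetical numbers (the scenario, function, and figures below are illustrative, not from the thread): a test that sounds "99% accurate" still yields mostly false positives when the condition itself is rare.

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test) via Bayes' rule."""
    true_pos = prior * sensitivity                 # P(condition and positive)
    false_pos = (1 - prior) * false_positive_rate  # P(no condition and positive)
    return true_pos / (true_pos + false_pos)

# Hypothetical: 1-in-1000 base rate, 99% sensitivity, 1% false-positive rate.
print(posterior(0.001, 0.99, 0.01))  # ~0.09
```

Even after a positive result from this "99% accurate" test, the chance of actually having the condition is only about 9%, because true negatives vastly outnumber true positives in the population.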
bobson381•10h ago
I keep thinking about the first Avengers movie, when Loki is standing above everyone going "See, is this not your natural state?". There's some perverse security in not getting a choice, and these rationalist frameworks, based in logic, can lead in all kinds of crazy arbitrary directions - powered by nothing more than a refusal to suffer any kind of ambiguity.
csours•8h ago
Humans are not chickens, but we sure do seem to love having a pecking order.
lazide•7h ago
Making good decisions is hard, and being accountable for the results is not fun. Easier to outsource if you can.
snarf21•7h ago
I think it's simpler: we love tribalism. A long time ago, being part of a tribe had such huge benefits over going it alone that it was always worth any tradeoffs. We have a much better ability to go it alone now, but we still love to belong to a group. Too often we pick a group based on a single shared belief and don't recognize all the baggage that comes along. Life is also too complicated today; it is difficult for someone to be knowledgeable in one topic, let alone the 1000s that make up our society.
csours•4h ago
maybe the real innie/outie is the in-group/out-group. no spoilers, i haven't finished that show yet
jacquesm•7h ago
They mostly seem to lean that way because it gives them carte blanche to do as they please. It is just a modern version of 'god has led my hand'.
notahacker•6h ago
I agree with the religion comparison (the "rational" conclusions of rationalism tend towards millenarianism with a scifi flavour), but the people going furthest down that rabbit hole often aren't doing what they please: on the contrary they're spending disproportionate amounts of time worrying about armageddon and optimising for stuff other people simply don't care about, or in the case of the explicit cults being actively exploited. Seems like the typical in-too-deep rationalist gets seduced by the idea that others who scoff at their choices just aren't as smart and rational as them, as part of a package deal which treats everything from their scifi interests to their on-the-spectrum approach to analysing every interaction from first principles as great insights...
nathan_compton•10h ago
Thinking too hard about anything will drive you insane, but I think the real issue here is that rationalists simply overestimate both the power of rational thought and their ability to do it. If you think of the kind of people who tend to make that mistake, you can see how you get a lot of crazy groups.

I guess I'm a radical skeptic, secular humanist, utilitarianish sort of guy, but I'm not dumb enough to think that throwing around the words "bayesian prior" and "posterior distribution" makes actually figuring out how something works, or predicting the outcome of an intervention, easy or certain. I've had a lot of life at this point and gotten to some level of mastery at a few things, and my main conclusion is that most of the time it's just hard to know stuff, and that the single most common cognitive mistake people make is too much certainty.

nyeah•10h ago
I'm lucky enough to work in a pretty rational place (small "r"). We're normally data-limited. Being "more rational" would mean taking or finding more of the right data, talking to the right people, reading the right stuff. Not just thinking harder and harder about what we already know.

There's a point where more passive thinking stops adding value and starts subtracting sanity. It's pretty easy to get to that point. We've all done it.

naasking•9h ago
> We're normally data-limited.

This is a common sentiment but is probably not entirely true. A great example is cosmology. Yes, more data would make some work easier, but astrophysicists and cosmologists have shown that you can gather and combine existing data and look at it in novel ways to produce unexpected results, like placing bounds that can include or exclude various theories.

I think a philosophy that encourages more analysis rather than sitting back on our laurels with an excuse that we need more data is good, as long as it's done transparently and honestly.

nyeah•9h ago
I suspect you didn't read some parts of my comment. I didn't say everyone in the world is always data-limited, I said we normally are where I work. I didn't recommend "sitting back on our laurels." I made very specific recommendations.

The qualifier "normally" already covers "not entirely true". Of course it's not entirely true. It's mostly true for us now. (In fact twenty years ago we used more numerical models than we do now, because we were facing more unsolved problems where the solution was pretty well knowable just by doing more complicated calculations, but without taking more data. Back then, when people started taking lots of data, it was often a total waste of time. But right now, most of those problems seem to be solved. We're facing different problems that seem much harder to model, so we rely more on data. This stage won't be permanent either.)

It's not a sentiment, it's a reality that we have to deal with.

naasking•8h ago
> It's not a sentiment, it's a reality that we have to deal with.

And I think you missed the main point of my reply: people often think we need more data, but cleverness and ingenuity can often find a way to make meaningful progress with existing data. Obviously I can't make any definitive judgment about your specific case, but I'm skeptical of any claim that if some genius like Einstein analyzed your problem, he could get no further than you have.

nyeah•6h ago
Apparently you will not be told what I'm saying.

I read your point and answered it twice. Your latest response seems to indicate that you're ignoring those responses. For example you seem to suggest that I'm "claim[ing] that it's out of the realm of possibility" for "Einstein" to make progress on our work without taking more data. But anyone can hit "parent" a few times and see what I actually claimed. I claimed "mostly" and "for us where I work". I took the time to repeat that for you. That time seems wasted now.

Perhaps you view "getting more data" as an extremely unpleasant activity, to be avoided at all costs? You may be an astronomer, for example. Or maybe you see taking more data before thinking as some kind of admission of defeat? We don't use that kind of metric. For us it's a question of the cheapest and fastest way to solve each problem.

If modeling is slower and more expensive than measuring, we measure. If not, we model. You do you.

spott•8h ago
This depends on what you are trying to figure out.

If you are talking about cosmology? Yea, you can look at existing data in new ways, cause you probably have enough data to do that safely.

If you are looking at human psychology? Looking at existing data in new ways is essentially p-hacking. And you probably won’t ever have enough data to define a “universal theory of the human mind”.

throw4847285•10h ago
People find academic philosophy impenetrable and pretentious, but it has two major advantages over rationalist cargo cults.

The first is diffusion of power. Social media is powered by charisma, and while it is certainly true that personality-based cults are nothing new, the internet makes it way easier to form one. Contrast that with academic philosophy. People can have their own little fiefdoms, and there is certainly abuse of power, but it is rarely concentrated in the way you see within rationalist communities.

The second (and more idealistic) is that the discipline of Philosophy is rooted in the Platonic/Socratic notion that "I know that I know nothing." People in academic philosophy are on the whole happy to provide a gloss on a gloss on some important thinker, or some kind of incremental improvement over somebody else's theory. This makes it extremely boring, and yet, not nearly as susceptible to delusions of grandeur. True skepticism has to start with questioning one's self, but everybody seems to skip that part and go right to questioning everybody else.

Rationalists have basically reinvented academic philosophy from the ground up with none of the rigor, self-discipline, or joy. They mostly seem to dedicate their time to providing post-hoc justifications for the most banal unquestioned assumptions of their subset of contemporary society.

NoGravitas•9h ago
> Rationalists have basically reinvented academic philosophy from the ground up with none of the rigor, self-discipline, or joy.

Taking academic philosophy seriously, at least as an historical phenomenon, would require being educated in the humanities, which is unpopular and low-status among Rationalists.

wizzwizz4•7h ago
> True skepticism has to start with questioning one's self, but everybody seems to skip that part and go right to questioning everybody else.

Nuh-uh! Eliezer Yudkowsky wrote that his mother made this mistake, so he's made sure to say things in the right order for the reader not to make this mistake. Therefore, true Rationalists™ are immune to this mistake. https://www.readthesequences.com/Knowing-About-Biases-Can-Hu...

sunshowers•9h ago
I don't disagree, but to steelman the case for (neo)rationalism: one of its fundamental contributions is the claim that Bayes' theorem is extraordinarily important as a guide to reality, perhaps at the same level as the second law of thermodynamics, and that it is dramatically undervalued by larger society. I think that is all basically correct.

(I call it neorationalism because it is philosophically unrelated to the more traditional rationalism of Spinoza and Descartes.)

matthewdgreen•9h ago
I don't understand what "Bayes' theorem is a good way to process new data" (something that is not at all a contribution of neorationalism) has to do with "human beings are capable of using this process effectively at a conscious level to get to better mental models of the world." I think the rationalist community has a thing called "motte and bailey" that would apply here.
rpcope1•8h ago
Applying Bayes' theorem in unconventional ways is not remotely novel to "rationalism" (except maybe in their strange, busted, hand-wavy circle-jerk "thought experiments"). It was the domain of statistical mechanics long before Yudkowsky and the other cult leaders could even mouth "update your priors".
sunshowers•8h ago
I don't know, most of science still runs on frequentist statistics. Juries convict all the time on evidence that would never withstand a Bayesian analysis. The prosecutor's fallacy is real.
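
To make the prosecutor's fallacy concrete, here is a toy calculation with invented numbers: a forensic match with a 1-in-10,000 random-match probability, and a suspect drawn from a city of a million people.

    # Conflating P(match | innocent) with P(innocent | match) is the
    # prosecutor's fallacy; all numbers here are invented.
    match_prob_if_innocent = 1 / 10_000
    population = 1_000_000

    # About 100 innocent residents would also match by chance.
    innocent_matches = (population - 1) * match_prob_if_innocent

    # With a uniform prior over the population, a match alone implies
    # only about a 1% chance of guilt, not 99.99%.
    print(1 / (1 + innocent_matches))  # ~0.0099
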
ImaCake•3h ago
Most science runs on BS with a cursory amount of statistics slapped on top so everyone can feel better about it. Weirdly enough, science still works despite not being rational. Rationalists seem to think science is logical, when in reality it works for largely the same reasons the free market does: throw shit at the wall and maybe support some of the stuff that works.
copularent•3h ago
As if these neorationalists are building a model and Markov chain Monte Carlo sampling their life decisions.

That is the bullshit part.

sunshowers•1h ago
Agreed, yeah.
rpcope1•8h ago
Even the real progenitors of a lot of this sort of thought, like E.T. Jaynes, espoused significantly more skepticism than I've ever seen from a "rationalist". I would even imagine that if you asked most rationalists who E.T. Jaynes was (assuming they weren't well versed in statistical mechanics), they'd have no idea why his work was important to applying "Bayesianism".
randcraw•7h ago
The second-most common cognitive mistake we make has to be the failure to validate what we think we know -- is it actually true? The crux of being right isn't reasoning. It's avoiding dumb blunders based on falsehoods, both honest and dishonest. In today's political and media climate, I'd say dishonest falsehoods are a far greater cause for being wrong than irrationality.
incomingpain•10h ago
We live in an irrational time. It's unclear whether irrationality was simply underreported in history or whether social changes in the last ~50-75 years have had breaking consequences.

People are trying to make sense of this. For examples.

The Canadian government heavily subsidizes junk food, then spends heavily on healthcare because of the resulting illnesses. It restricts and limits healthy food through supply management and promotes a "food pyramid" favoring domestic unhealthy food. Meanwhile, it spends billions marketing healthy living, yet fines people up to $25,000 for hiking in forests and zones cities so that driving is nearly mandatory.

Government is an easy target for irrational behaviours.

watwut•10h ago
Scientology has been here since 1953, and it has a similarly bonkers set of beliefs. And it is huge.

Your rant about government or not being allowed to hike in some places in Canada is unrelated to the issue.

codr7•10h ago
There's nothing irrational about it, this is how you maximize power and profit at any and all costs.
incomingpain•10h ago
I completely get that point of view; and yes if that's the goal, it's completely rational.

But from a societal cohesion or perhaps even an ethical point of view it's just pure irrationality.

When typing the post, I was thinking of different levels of government and the changing ideologies of politicians leaving inconsistent governance.

codr7•6h ago
I couldn't agree more, but we've long since given up our power collectively in hope of escaping responsibility.
noqc•10h ago
Perhaps I will get downvoted to death again for saying so, but the obvious answer is because the name "rationalist" is structurally indistinguishable from the name "scientology" or "the illuminati". You attract people who are desperate for an authority to appeal to, but for whatever reason are no longer affiliated with the church of their youth. Even a rationalist movement which held nothing as dogma would attract people seeking dogma, and dogma would form.

The article begins by saying the rationalist community was "drawn together by AI researcher Eliezer Yudkowsky’s blog post series The Sequences". Obviously the article intends to make the case that this is a cult, but it's already done with the argument at this point.

johnisgood•10h ago
I do not see any reason for you to get downvoted.

I agree that the term "rationalist" would appeal to many people, and the obvious need to belong to a group plays a huge role.

noqc•9h ago
There are a lot of rationalists in this community. Pointing out that the entire thing is a cult attracts downvotes from people who wish to, for instance, avoid being identified with the offshoots.
6177c40f•7h ago
No, the downvotes are because rationalism isn't a cult and people take offense to being blatantly insulted. This article is about cults that are rationalism-adjacent, it's not claiming that rationalism is itself a cult.
noqc•3h ago
That's almost word for word what I said...
6177c40f•42m ago
You're right, I misread you.
mcv•10h ago
In fact, I'd go a step further and note the similarity with organized religion. People have a tendency to organize and dogmatize everything. The problem with religion is rarely the core ideas, but always the desire to use it as a basis for authority, to turn it dogmatic and ultimately form a power structure.

And I say this as a Christian. I often think that becoming a state religion was the worst thing that ever happened to Christianity, or any religion, because then it unavoidably becomes a tool for power and authority.

And doing the same with other ideas or ideologies is no different. Look at what happened to communism, capitalism, or almost any other secular idea you can think of: the moment it becomes established, accepted, and official, the corruption sets in.

handoflixue•9h ago
> Obviously the article intends to make the case that this is a cult

The author is a self-identified rationalist. This is explicitly established in the second sentence of the article. Given that, why in the world would you think they're trying to claim the whole movement is a cult?

Obviously you and I have very different definitions of "obvious"

noqc•9h ago
When I read the article in its entirety, I was pretty disappointed in its top-level introspection.

It seems to not be true, but I still maintain that it was obvious. Sometimes people don't pick the low-hanging fruit.

o11c•8h ago
> for whatever reason are no longer affiliated with the church of their youth.

This is the Internet, you're allowed to say "they are obsessed with unlimited drugs and weird sex things, far beyond what even the generally liberal society tolerates".

I'm increasingly convinced that every other part of "Rationalism" is just distraction or justification for those; certainly there's a conscious decision to minimize talking about this part on the Internet.

twic•6h ago
I strongly suspect there is heterogeneity here: an outer party of "genuine" rationalists who believe that learning to be a spreadsheet or whatever is going to let them save humanity, and an inner party who use the community to conceal some absolute shenanigans.
noqc•1h ago
No, I really mean atheists that crave religion.
cjs_ac•10h ago
Rationalism is the belief that reason is the primary path to knowledge, as opposed to, say, the observation that is championed by empiricism. It's a belief system that prioritises imposing its tenets on reality rather than asking reality what reality's tenets are. From the outset, it's inherently cult-like.
Ifkaluva•10h ago
That is the definition of “rationalism” as proposed by philosophers like Descartes and Kant, but I don’t think that is an accurate representation of the type of “rationalism” this article describes.

This article describes “rationalism” as described on LessWrong and in the Sequences by Eliezer Yudkowsky. A good amount of it is based on empirical findings from psychology and behavioral science. It’s called “rationalism” because it seeks to correct common reasoning heuristics that are purported to lead to incorrect reasoning, not because it stands in contrast to empiricism.

FergusArgyll•10h ago
I was going to write a similar comment to the OP's, so permit me to defend it:

Many of their "beliefs" - super-duper intelligence, doom - are clearly not believed by the market; observing the market is a kind of empiricism, and it's completely discounted by the LWers.

glenstein•10h ago
Agreed, I appreciate that there's a conceptual distinction between the philosophical versions of rationalism and empiricism, but what's being talked about here is a conception that (again, at least notionally) is interested in and compatible with both.

I am pretty sure many of the LessWrong posts are about how to understand the meaning of different types of data, and are very much about examining, developing, and criticizing a rich variety of empirical attitudes.

gethly•10h ago
But you cannot have reason without substantial proof of how things behave, obtained by observing them in the first place. Reason is simply a logical approach to yes-and-no questions where you factually know, from observation of past events, how things work. You can therefore simulate an outcome by applying reasoning to a situation that you have not yet observed and come to a logical conclusion, given the set of rules and presumptions.
handoflixue•9h ago
Rationalists, in this case, refers specifically to the community clustered around LessWrong, which explicitly and repeatedly emphasizes points like "you can't claim to have a well grounded belief if you don't actually have empirical evidence for it" (https://www.lesswrong.com/w/evidence for a quick overview of some of the basic posts on that topic)

To quote one of the core foundational articles: "Before you try mapping an unseen territory, pour some water into a cup at room temperature and wait until it spontaneously freezes before proceeding. That way you can be sure the general trick—ignoring infinitesimally tiny probabilities of success—is working properly." (https://www.lesswrong.com/posts/eY45uCCX7DdwJ4Jha/no-one-can...)

One can argue how well the community absorbs the lesson, but this certainly seems to be a much higher standard than average.

AIPedant•10h ago
I think I found the problem!

  The rationalist community was drawn together by AI researcher Eliezer Yudkowsky’s blog post series The Sequences, a set of essays about how to think more rationally
I actually don't mind Yudkowsky as an individual - I think he is almost always wrong and undeservedly arrogant, but mostly sincere. Yet treating him as an AI researcher and serious philosopher (as opposed to a sci-fi essayist and self-help writer) is the kind of slippery foundation that less scrupulous people can build cults from. (See also Maharishi Mahesh Yogi and related trends - often it is just a bit of spiritual goofiness, as with David Lynch; sometimes you get a Charles Manson.)
polytely•10h ago
Don't forget the biggest scifi guy turned cult leader of them all: L. Ron Hubbard
AIPedant•10h ago
I don't think Yudkowsky is at all like L. Ron Hubbard. Hubbard was insane and pure evil. Yudkowsky seems like a decent and basically reasonable guy; he's just kind of a blowhard, and he's wrong about the science.

L. Ron Hubbard is more like the Zizians.

pingou•9h ago
I don't have a horse in this race, but could you provide a few examples where he was wrong?
bglazer•9h ago
Here's one: Yudkowsky has been confidently asserting (for years) that AI will drive humanity extinct because it will learn how to make nanomachines using "strong" covalent bonds rather than the "weak" van der Waals forces used by biological systems like proteins. I'm certain that knowledgeable biologists and physicists have tried to explain to him why this belief is basically nonsense, but he just keeps repeating it. Heck, there's even a LessWrong post that lays it out quite well [1]. This points to a general disregard for detailed knowledge of existing things and a preference for "first principles" beliefs, no matter how wrong they are.

[1] https://www.lesswrong.com/posts/8viKzSrYhb6EFk6wg/why-yudkow...

12_throw_away•1h ago
Dear god. The linked article is a good takedown of this "idea," but I would like to pile on: biological systems are in fact extremely good at covalent chemistry, usually via extraordinarily powerful nanomachines called "enzymes". No, they are (usually) not building totally rigid condensed-matter structures, but... why would they? Why would that be better?

I'm reminded of a silly social science article I read, quite a long time ago. It suggested that physicists only like to study condensed matter crystals because physics is a male-dominated field, and crystals are hard rocks, and, um ... men like to think about their rock-hard penises, I guess. Now, this hypothesis obviously does not survive cursory inspection - if we're gendering natural phenomena studied by physicists, are waves male? Are fluid dynamics male?

However, Mr. Yudkowsky's weird hangups here around rigidity and hardness have me adjusting my priors.

fulafel•9h ago
How has he fared in the fields of philosophy and AI research in terms of peer review? Is there some kind of roundup or survey about this?
iwontberude•10h ago
They watched too much eXistenZ
lenerdenator•10h ago
Because humans like people who promise answers.
andy99•10h ago
Boring as it is, this is the answer. It's just more religion.

  Church, cult, cult, church. So we'll get bored someplace else every Sunday. Does this really change our everyday lives?
optimalsolver•10h ago
Funnily enough, the actress who voiced this line is a Scientologist:

https://en.wikipedia.org/wiki/Nancy_Cartwright#Personal_life

andy99•10h ago
I think they were making fun of the "Moonies", so she was probably able to rationalize it. Pretty sure Isaac Hayes quit South Park over their making fun of Scientologists.
ZeroGravitas•9h ago
I read recently that he suffered a serious medical event around that time, and it was actually cult members speaking on his behalf who withdrew him from the show.

I think it was a relative of his claiming this.

saasapologist•10h ago
I think we've strayed too far from the Aristotelian dynamics of the self.

Outside of sexuality and the proclivities of their leaders, emphasis on physical domination of the self is lacking. The brain runs wild, the spirit remains aimless.

In the Bay, the difference between the somewhat well-adjusted "rationalists" and those very much "in the mush" is whether or not someone tells you they're in SF or "on the Berkeley side of things"

j_m_b•10h ago
> One way that thinking for yourself goes wrong is that you realize your society is wrong about something, don’t realize that you can’t outperform it, and wind up even wronger.

many such cases

quantummagic•10h ago
It's almost the defining characteristic of our time.
teiferer•6h ago
Tell-tale slogan: "Let's derive from first principles"
shadowgovt•10h ago
It is an unfortunate reality of our existence that sometimes Chesterton actually did build that fence for a good reason, a good reason that's still here.

(One of my favorite TED talks was about a failed experiment in introducing traditional Western agriculture to a people in Zambia. It turns out that when you concentrate too much food in one place, the hippos come and eat it all, and people can't actually out-fight hippos in large numbers. In hindsight, the people running the program should have asked how likely it was that folks in a region that had been exposed to other people's agriculture for thousands of years hadn't ever, you know... tried it. https://www.ted.com/talks/ernesto_sirolli_want_to_help_someo...)

ljlolel•10h ago
TEDx
bobson381•10h ago
You sound like you'd like the book Seeing like a State.
im3w1l•9h ago
Shoot the hippos to death for even more food. If it doesn't seem to work it's just a matter of having more and bigger guns.
HDThoreaun•8h ago
Why didn't they kill the hippos like we killed the buffalo?
lesuorac•7h ago
Hippos are more dangerous than emus.

https://en.wikipedia.org/wiki/Emu_War

HDThoreaun•7h ago
My understanding of the emu war is that they weren't dangerous so much as quick to multiply. The army couldn't whack the moles fast enough. Hippos don't strike me as animals that can go underground when threatened
NoGravitas•9h ago
Capital-R Rationalism also encourages you to think you can outperform it, by being smart and reasoning from first principles. That was the idea behind MetaMed, founded by LessWronger Michael Vassar - that being trained in rationalism made you better at medical research and consulting than medical school or clinical experience. Fortunately they went out of business before racking up a body count.
rpcope1•9h ago
One lesson I've learned and seen a lot in my life is that understanding that something is wrong, or what's wrong about it, and being able to come up with a better solution are distinct skills, and the latter is often much harder. Those who are best able to describe a problem often don't overlap much with those who can figure out how to solve it, even though they think they can.
kiitos•6h ago
indeed

see: bitcoin

numbsafari•10h ago
Why are so many cults founded on fear or hate?

Because empathy is hard.

meroes•10h ago
It grew out of many different threads: different websites, communities, etc., all around the same time. I noticed it contemporaneously in the philosophy world, where Nick Bostrom's simulation argument was boosted more than it deserved (like everyone was just accepting it at the lay level). Looking back, I see it also developed from LessWrong and other sites, but at the time I was wondering what was going on with simulations taking over philosophy talk. Now I see how it all coalesced.

All of it has the appearance of sounding so smart, and a few sites were genuine. But it got taken over.

potatolicious•9h ago
Yeah, a lot of the comments here are really just addressing cults writ large, as opposed to why this one was particularly successful.

A significant part of this is the intersection of the cult with money and status - this stuff really took off once prominent SV personalities became associated with it, and got turbocharged when it started intersecting with the angel/incubator/VC scene, when there was implicit money involved.

It's unusually successful because -- for a time at least -- there was status (and maybe money) in carrying water for it.

jacquesm•7h ago
PayPal will be traced as the root cause of many of our future troubles.
varjag•4h ago
Wish I could upvote this twice. It's like intersectionality for evil.
wredcoll•2h ago
https://en.m.wikipedia.org/wiki/Barth%C3%A9lemy-Prosper_Enfa...

Sometimes history really does rhyme.

> Enfantin and Amand Bazard were proclaimed Pères Suprêmes ("Supreme Fathers") – a union which was, however, only nominal, as a divergence was already manifest. Bazard, who concentrated on organizing the group, had devoted himself to political reform, while Enfantin, who favoured teaching and preaching, dedicated his time to social and moral change. The antagonism was widened by Enfantin's announcement of his theory of the relation of man and woman, which would substitute for the "tyranny of marriage" a system of "free love".[1]

6177c40f•8h ago
To be clear, this article isn't calling rationalism a cult, it's about cults that have some sort of association with rationalism (social connection and/or ideology derived from rationalist concepts), e.g. the Zizians.
throwanem•5h ago
This article attempts to establish disjoint categories of "good rationalist" and "cultist." Its authorship, and its appearance in the cope publication of the "please take us seriously" rationalist faction, speak volumes about how well it is likely to succeed in that project.
ImaCake•3h ago
Not sure why you got downvoted for this. The opening paragraph of the article reads as suspicious to the observant outsider:

>The rationalist community was drawn together by AI researcher Eliezer Yudkowsky’s blog post series The Sequences, a set of essays about how to think more rationally.

Anyone who had just read a lot about Scientology would read that and have alarm bells ringing.

meowface•2h ago
Asterisk magazine is basically the unofficial magazine for the rationalist community and the author, Ozy Brennan, is a prominent rationalist blogger. Of course the piece is pro-rationalism. It's investigating why rationalism seems to spawn these small cultish offshoots, not trying to criticize rationalism.
throwanem•2h ago
"Unofficial?" Was that a recent change? But my point is that because the author neither can nor will criticize the fundamental axioms or desiderata of the movement, their analysis of how or why it spins off cults is necessarily footless. In practice the result amounts to a collection of excuses mostly from anonymees, whom we are assured have sufficient authority to reassure us this smoke arises from no fire. But of course it's only when Kirstie Alley does something like this we're meant to look askance.
6177c40f•43m ago
I think it's a meaningful distinction- most rationalists aren't running murder cults.
throwanem•29m ago
That we know about, I suppose. We didn't know at one point there were any outright rationalist cults, after all, whether involved in sex, murder, both, or otherwise. That is, we didn't know there were subsets of self-identifying "rationalists" so erroneous in their axioms and tendentious in their analysis as to succeed in putting off others.

But a movement that demonstrates so remarkably elevated a rate of generating harmful beliefs in action as this one does warrants exactly the sort of increased scrutiny this article vainly strives to deflect. That effort is in itself interesting, as such efforts always are.

6177c40f•24m ago
I mean, as a rationalist, I can assure you it's not nearly as sinister a group as you seem to make it out to be, believe it or not. Besides, the explanation is simpler than this article makes it out to be: most rationalists are from California, and California is the origin of lots of cults.
throwanem•13m ago
> Besides, the explanation is simpler than this article makes it out to be: most rationalists are from California, and California is the origin of lots of cults

This isn't the defense of rationalism you seem to imagine it to be.

I don't think the modal rationalist is sinister. I think he's ignorant, misguided, nearly wholly lacking in experience, deeply insecure about it, and overall just excessively resistant to the idea that it is really possible, on any matter of serious import, for his perspective radically to lack merit. Unfortunately, this latter condition proves very reliably also the mode.

6177c40f•33s ago
> his perspective radically to lack merit

What perspective would that be?

alphazard•10h ago
The terminology here is worth noting. Is a Rationalist Cult a cult that practices Rationalism according to third parties, or is it a cult that says it is Rationalist?

Clearly all of these groups that believe in demons or realities dictated by tabletop games are not what third parties would call Rationalist. They might call themselves that.

There are some pretty simple tests that can out these groups as not rational. None of these people have ever seen a demon, so world models including demons have never predicted any of their sense data. I doubt these people would be willing to make any bets about when or whether a demon will show up. Many of us would be glad to make a market on predictions made by tabletop games about physical phenomena.

ameliaquining•10h ago
The article is talking about cults that arose out of the rationalist social milieu, which is a separate question from whether the cult's beliefs qualify as "rationalist" in some sense (a question that usually has no objective answer anyway).
glenstein•10h ago
Yeah, I would say the groups in question are notionally, aspirationally rational and I would hate for the takeaway to be disengagement from principles of critical thinking and skeptical thinking writ large.

Which, to me, raises the fascinating question of what a "good" version looks like: groups and group dynamics centered on a shared interest in the best practices of critical thinking.

At a first impression, I think maybe these virtues (which are real!) disappear into the background of other, more applied specializations, whether professions, hobbies, backyard family barbecues.

alphazard•9h ago
It would seem like the quintessential Rationalist institution to congregate around is the prediction market. Status in the community has to be derived from a history of making good bets (PnL as a %, not in absolute terms). And the sense of community would come from (measurably) more rational people teaching (measurably) less rational people how to be more rational.
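
As a sketch of what "measurably more rational" could mean in practice, one could score each member's forecasts with a proper scoring rule; the track record below is invented.

    # Brier score: mean squared error of probabilistic forecasts.
    # Lower is better; always saying 0.5 scores 0.25.
    forecasts = [0.9, 0.7, 0.2, 0.95, 0.5]  # invented predictions
    outcomes = [1, 1, 0, 0, 1]              # 1 = event happened

    brier = sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)
    print(brier)  # ~0.26, barely better than coin-flipping
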
handoflixue•9h ago
The founder of LessWrong / The Rationalist movement would absolutely agree with you here, and has written numerous fanfics about a hypothetical alien society ("Dath Ilan") where those are fairly central.
Barrin92•10h ago
>so world models including demons have never predicted any of their sense data.

There's a reason they call themselves "rationalists" instead of empiricists or positivists. They perfectly inverted Hume ("reason is, and ought only to be the slave of the passions")

These kinds of harebrained views aren't an accident but a product of rationalism. The idea that intellect is quasi-infinite and that the world can be mirrored in the mind doesn't run contrary to rationalism; it is the most extreme form of rationalism taken to its conclusion, and of course deeply religious, hence the constant fantasies about AI divinities and singularities.

wiredfool•10h ago
It's really worth reading up on the techniques from Large Group Awareness Training so that you can recognize them when they pop up.

Once you see them listed (social pressure, sleep deprivation, control of drinking/bathroom, control of language/terminology, long exhausting activities, financial buy in, etc) and see where they've been used in cults and other cult adjacent things it's a little bit of a warning signal when you run across them IRL.

derektank•6h ago
Related, the BITE model of authoritarian control is also a useful framework for identifying malignant group behavior. It's amazing how consistent these are across groups and cultures, from Mao's inner circle to NXIVM and on.

https://freedomofmind.com/cult-mind-control/bite-model-pdf-d...

keybored•10h ago
Cue all the surface-level “tribalism/loneliness/hooman nature” comments instead of the simple analysis that Rationalism (this kind) is severely brain-broken and irredeemable and will just foster even worse outcomes in a group setting. It’s a bit too close to home (ideologically) to get a somewhat detached analysis.
bubblyworld•10h ago
What is the base rate here? Hard to know the scope of the problem without knowing how many non-rationalists (is that even a coherent group of people?) end up forming weird cults, as a comparison. My impression is that crazy beliefs are common amongst everybody.

A much simpler theory is that rationalists are mostly normal people, and normal people tend to form cults.

glenstein•9h ago
I was wondering about this too. You could also say it's a Sturgeon's law question.

They do note at the beginning of the article that many, if not most such groups have reasonably normal dynamics, for what it's worth. But I think there's a legitimate question of whether we ought to expect groups centered on rational thinking to be better able to escape group dynamics we associate with irrationality.

zzzeek•10h ago
because humans are biological creatures iterating through complex chemical processes that are attempting to allow a large organism to survive and reproduce within the specific ecosystem provided by the Earth in the present day. "Rational reasoning" is a quaint side effect that sometimes is emergent from the nervous system of these organisms, but it's nothing more than that. It's normal that the surviving/reproducing organism's emergent side effect of "rational thought", when it is particularly intense, will self-refer to the organism and act as though it has some kind of dominion over the organism itself, but this is, like the rationalism itself, just an emergent effect that is accidental and transient. Same as if you see a cloud that looks like an elephant (it's still just a cloud).
rkapsoro•10h ago
Something like 15 years ago I once went to a Less Wrong/Overcoming Bias meetup in my town after being a reader of Yudkowsky's blog for some years. I was like, Bayesian Conspiracy, cool, right?

The group was weird and involved quite a lot of creepy oversharing. I didn't return.

dfabulich•10h ago
The whole game of Rationalism is that you should ignore gut intuitions and cultural norms that you can't justify with rational arguments.

Well, it turns out that intuition and long-lived cultural norms often have rational justifications, but individuals may not know what they are, and norms/intuitions provide useful antibodies against narcissistic would-be cult leaders.

Can you find the "rational" justification not to isolate yourself from non-Rationalists, not to live with them in a polycule, and not to take a bunch of psychedelic drugs with them? If you can't solve that puzzle, you're in danger of letting the group take advantage of you.

StevenWaterman•9h ago
Yeah, I think this is exactly it. If something sounds extremely stupid, or if everyone around you says it's extremely stupid, it probably is. If you can't justify it, it's probably because you have failed to find the reason it's stupid, not because it's actually genius.

And the crazy thing is, none of that is fundamentally opposed to rationalism. You can be a rationalist who ascribes value to gut instinct and societal norms. Those are the product of millions of years of pre-training.

I have spent a fair bit of time thinking about the meaning of life. And my conclusions have been pretty crazy. But they sound insane, so until I figure out why they sound insane, I'm not acting on those conclusions. And I'm definitely not surrounding myself with people who take those conclusions seriously.

empath75•9h ago
> The whole game of Rationalism is that you should ignore gut intuitions and cultural norms that you can't justify with rational arguments.

The game as it is _actually_ played is that you use rationalist arguments to justify your pre-existing gut intuitions and personal biases.

NoGravitas•9h ago
Or worse - to justify the gut intuitions and personal biases of your cult leader.
xbar•6h ago
Which is to say, Rationalism is easily abused to justify any behavior contrary to its own tenets, just like any other -ism.
copularent•3h ago
Exactly. Humans are rationalizers. We operate on pre-existing gut intuitions and biases, then invent after-the-fact rational-sounding justifications.

I guess Pareto wasn't on the reading list for these intellectual frauds.

Those are actually the priors being updated lol.

kelseyfrog•9h ago
> The whole game of Rationalism is that you should ignore gut intuitions and cultural norms that you can't justify with rational arguments.

Specifically, rationalism spends a lot of time on priors, but a sneaky thing happens that I call the 'double update'.

Bayesian updating works when you update your genuine prior belief with new evidence. No one disagrees with this; sometimes it's easy and sometimes it's difficult to do.

What Rationalists often end up doing is relaxing their priors - intuition, personal experience, cultural norms - and then updating. They think of this as one update, but it is actually two. The first update, relaxing priors, isn't associated with evidence; it's part of the community norms. There is an implicit belief that by relaxing one's priors you're more open to reality. The real result, though, is that it sends people wildly off course. Case in point: all the cults.

Consider the pre-tipped scale. You suspect the scale reads a little low, so before weighing you tilt it slightly to "correct" for that bias. Then you pour in flour until the dial says you've hit the target weight. You’ve followed the numbers exactly, but because you started from a tipped scale, you've ended up with twice the flour the recipe called for.

Correcting for bias should mean updating on evidence, not relaxing your priors just because everyone around you is doing it.
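
A toy beta-binomial sketch of the double update (all numbers invented): relaxing a genuine prior before updating hands the same evidence far more weight.

    # Genuine prior: Beta(20, 20), a strong belief that the coin is near-fair.
    # Evidence: 8 heads in 10 flips.
    def posterior_mean(a, b, heads, flips):
        return (a + heads) / (a + b + flips)

    print(posterior_mean(20, 20, 8, 10))  # 0.56, one honest update
    # "Relaxed" prior: Beta(2, 2), weakened first, then updated on the same data.
    print(posterior_mean(2, 2, 8, 10))    # ~0.71, the hidden second update
    # Weakening the prior first quietly hands the evidence extra weight:
    # that is the second, unacknowledged update.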

ewoodrich•7h ago
Thanks, that's a fantastic description of a phenomenon I've observed but couldn't quite put my finger on.
windowshopping•4h ago
> Consider the pre-tipped scale. You suspect the scale reads a little low, so before weighing you tilt it slightly to "correct" for that bias. Then you pour in flour until the dial says you've hit the target weight. You’ve followed the numbers exactly, but because you started from a tipped scale, you've ended up with twice the flour the recipe called for.

I'm not following this example at all. If you've zeroed out the scale by tilting, why would adding flour until it reads 1g lead to 2g of flour?

kelseyfrog•2h ago
I agree. It's not the best metaphor.

I played around with various metaphors, but most of them felt worse to various degrees. The idea of relaxing priors and then doing an evidence-based update while thinking it's genuinely a single update is a difficult thing to capture metaphorically.

Happy to hear better suggestions.

EDIT: Maybe something more like this:

Picture your belief as a shotgun aimed at the truth:

    Aim direction = your best current guess.

    Spread = your precision.

    Evidence = the pull that says "turn this much" and "widen/narrow this much."
The correct move is one clean Bayesian shot.

Hold your aim where it is. Evidence arrives. Rotate and resize the spread in one simultaneous posterior jump determined by the actual likelihood ratio in front of you.

The stupid move? The move that Rationalists love to disguise as humility? It's to first relax your spread "to be open-minded," and then apply the update. You've just secretly told the math, "Give this evidence more weight than it deserves." And then you wonder why you keep overshooting, drifting into confident nonsense.

If you think your prior is overconfident, that is itself evidence: evidence about your meta-level epistemic reliability. Feed it into the update properly. Do not amputate it ahead of time because "priors are bias." Bias is bad, yes, but closing your eyes and spinning around with shotgun in hand (i.e., double updating) is not an effective method of removing bias.

twic•6h ago
From another piece about the Zizians [1]:

> The ability to dismiss an argument with a “that sounds nuts,” without needing recourse to a point-by-point rebuttal, is anathema to the rationalist project. But it’s a pretty important skill to have if you want to avoid joining cults.

[1] https://maxread.substack.com/p/the-zizians-and-the-rationali...

JohnMakin•10h ago
One of a few issues I have with groups like these is that they often confidently and aggressively spew a set of beliefs that on their face logically follow from one another, until you realize they are built on a set of axioms that are either entirely untested or outright nonsense. This is common everywhere, but it feels especially pronounced in communities like this. It also involves quite a bit of navel-gazing that makes me feel a little sick to participate in.

The smartest people I have ever known have been profoundly unsure of their beliefs and what they know. I immediately become suspicious of anyone who is very certain of something, especially if they derived it on their own.

bobson381•10h ago
There should be an extremist cult of people who are certain only that uncertainty is the only certain thing
ameliaquining•9h ago
Like Robert Anton Wilson if he were way less chill, perhaps.
rpcope1•9h ago
More people should read Sextus Empiricus, as he's basically the O.G. Pyrrhonist skeptic and goes pretty hard on this very train of thought.
bobson381•9h ago
Cool. Any specific recs or places to start with him?
rpcope1•9h ago
Probably the Hackett book, "Sextus Empiricus: Selections from the Major Writings on Scepticism"
bobson381•9h ago
Thanks!
Telemakhos•7h ago
If I remember my Gellius, it was the Academic Skeptics who claimed that the only certainty was uncertainty; the Pyrrhonists, in opposition, denied that one could be certain about the certainty of uncertainty.
arwhatever•9h ago
“Oh, that must be exhausting.”
saltcured•9h ago
There would be, except we're all very much on the fence about whether it is the right cult for us.
hungmung•8h ago
What makes you so certain there isn't? A group that has a deep understanding fnord of uncertainty would probably like to work behind the scenes to achieve their goals.
dcminter•8h ago
One might even call them illuminati? :D
cwmoore•7h ago
The Fnords do keep a lower profile.
card_zero•8h ago
The Snatter Goblins?

https://archive.org/details/goblinsoflabyrin0000frou/page/10...

pancakemouse•7h ago
My favourite bumper sticker, "Militant Agnostic. I don't know, and neither do you."
bobson381•7h ago
I heard about this the other day! I think I need one.
jazzyjackson•7h ago
A Wonderful Phrase by Gandhi

  I do dimly perceive
  that while everything around me is ever-changing,
  ever-dying there is,
  underlying all that change,
  a living power
  that is changeless,
  that holds all together,
  that creates,
  dissolves,
  and recreates
tim333•7h ago
Socrates was fairly close to that.
freedomben•6h ago
My thought as well! I can't remember names at the moment, but there were some cults that spun off from Socrates. Unfortunately they also adopted his practice of never writing anything down, so we don't know a whole lot about them
JTbane•6h ago
"I have no strong feelings one way or the other." thunderous applause
tomjakubowski•5h ago
https://realworldrisk.com/
mapontosevenths•4h ago
There already is, they're called "Politicians."
ctoth•9h ago
> I immediately become suspicious of anyone who is very certain of something, especially if they derived it on their own.

Are you certain about this?

JohnMakin•9h ago
no
idontwantthis•8h ago
Suspicious implies uncertain. It’s not immediate rejection.
teddyh•8h ago
All I know is that I know nothing.
p1esk•3h ago
How do you know?
adrianN•8h ago
Your own state of mind is one of the easiest things to be fairly certain about.
ants_everywhere•8h ago
The fact that this is false is one of the oldest findings of research psychology
PaulHoule•6h ago
Marvin Minsky wrote forcefully [1] about this in The Society of Mind and went so far as to say that trying to observe yourself (e.g. meditation) might be harmful.

Freud of course discovered a certain world of the unconscious, but untrained [2], you would certainly struggle to explain how you know sentence S is grammatical and S' is not, or what it is you do when you walk.

If you did meditation or psychoanalysis or some other practice to understand yourself better it would take years.

[1] whether or not it is true.

[2] the "scientific" explanation you'd have if you're trained may or may not be true since it can't be used to program a computer to do it

lazide•8h ago
said no one familiar with their own mind, ever!
tshaddox•5h ago
Well you could be a critical rationalist and do away with the notion of "certainty" or any sort of justification or privileged source of knowledge (including "rationality").
at-fates-hands•3h ago
Isaac Newton would like to have a word.
elictronic•3h ago
I am not a big fan of alchemy, thank you though.
jl6•8h ago
I don’t think it’s just (or even particularly) bad axioms, I think it’s that people tend to build up “logical” conclusions where they think each step is a watertight necessity that follows inevitably from its antecedents, but actually each step is a little bit leaky, leading to runaway growth in false confidence.

Not that non-rationalists are any better at reasoning, but non-rationalists do at least benefit from some intellectual humility.

danaris•8h ago
> I think it’s that people tend to build up “logical” conclusions where they think each step is a watertight necessity that follows inevitably from its antecedents, but actually each step is a little bit leaky, leading to runaway growth in false confidence.

Yeah, this is a pattern I've seen a lot of recently—especially in discussions about LLMs and the supposed inevitability of AGI (and the Singularity). This is a good description of it.

kergonath•7h ago
Another annoying one is the simulation-theory group. They know just enough about physics to build sophisticated mental constructs without understanding how flimsy the foundations are, or that their logical steps are actually unproven hypotheses.
JohnMakin•7h ago
Agreed. This one is especially annoying to me and dear to my heart, because I enjoy discussing the philosophy behind this, but it devolves into weird discussions and conclusions fairly quickly without much effort at all. I particularly enjoy the tenets of certain sects of Buddhism and how they view these things, but you'll get a lot of people doing a really pseudo-intellectual version of the Matrix where they are the main character.
spopejoy•2h ago
You might have just explained the phenomenon of AI doomsayers overlapping with EA/rat types, which I otherwise found inexplicable. EA/Rs seem rather appallingly positivist otherwise.
kergonath•7h ago
> I don’t think it’s just (or even particularly) bad axioms, I think it’s that people tend to build up “logical” conclusions where they think each step is a watertight necessity that follows inevitably from its antecedents, but actually each step is a little bit leaky, leading to runaway growth in false confidence.

I really like your way of putting it. It’s a fundamental fallacy to assume certainty when trying to predict the future. Because, as you say, uncertainty compounds over time, all prediction models are chaotic. It’s usually associated with some form of Dunning-Kruger, where people know just enough to have ideas but not enough to understand where they might fail (thus vastly underestimating uncertainty at each step), or just lacking imagination.

ramenbytes•5h ago
Deep Space 9 had an episode dealing with something similar. Superintelligent beings determine that a situation is hopeless and act accordingly. The normal beings take issue with the actions of the Superintelligents. The normal beings turn out to be right.
BeFlatXIII•7h ago
> Not that non-rationalists are any better at reasoning, but non-rationalists do at least benefit from some intellectual humility.

Non-rationalists are forced to use their physical senses more often because they can't follow the chain of logic as far. This is to their advantage. Empiricism > rationalism.

om8•5h ago
Good rationalism includes empiricism though
whatevertrevor•4h ago
That conclusion presupposes that rationality and empiricism are at odds or somehow mutually incompatible. Any rational position worth listening to, about any testable hypothesis, goes hand in hand with empirical thinking.
guerrilla•4h ago
In traditional philosophy, rationalism and empiricism are at odds; they are essentially diametrically opposed. Rationalism prioritizes a priori reasoning while empiricism prioritizes a posteriori reasoning. You can prioritize both equally but that is neither rationalism nor empiricism in the traditional terminology. The current rationalist movement has no relation to that original rationalist movement, so the words don't actually mean the same thing. In fact, the majority of participants in the current movement seem ignorant of the historical dispute and its implications, hence the misuse of the word.
BlueTemplar•4h ago
Yeah, Stanford has a good recap:

https://plato.stanford.edu/entries/rationalism-empiricism/

(Note also how the context is French vs British, and the French basically lost with Napoleon, so the current "rationalists" seem to be more likely to be heirs to empiricism instead.)

whatevertrevor•4h ago
Thank you for clarifying.

That does square with what I thought the "Rationalist" movement as covered by the article was about; I didn't peg them as pure a priori thinkers, as you put it. I suppose my comment still holds, assuming "rationalist" in this context refers to the version of "Rationalism" being discussed in the article as opposed to the traditional one.

analog31•7h ago
Perhaps part of being rational, as opposed to rationalist, is having a sense of when to override the conclusions of seemingly logical arguments.
1attice•5h ago
In philosophy grad school, we described this as 'being reasonable' as opposed to 'being rational'.

That said, big-R Rationalism (the Lesswrong/Yudkowsky/Ziz social phenomenon) has very little in common with what we've standardly called 'rationalism'; trained philosophers tend to wince a little bit when we come into contact with these groups (who are nevertheless chockablock with fascinating personalities and compelling aesthetics.)

From my perspective (and I have only glancing contact,) these mostly seem to be _cults of consequentialism_, an epithet I'd also use for Effective Altruists.

Consequentialism has been making young people say and do daft things for hundreds of years -- Dostoevsky's _Crime and Punishment_ being the best character sketch I can think of.

While there are plenty of non-religious (and thus, small-r rationalist) alternatives to consequentialism, none of them seem to make it past the threshold in these communities.

The other codesmell these big-R rationalist groups have for me, and that which this article correctly flags, is their weaponization of psychology -- while I don't necessarily doubt the findings of sociology, psychology, etc, I wonder if they necessarily furnish useful tools for personal improvement. For example, memorizing a list of biases that people can potentially have is like numbering the stars in the sky; to me, it seems like this is a cargo-cultish transposition of the act of finding _fallacies in arguments_ into the domain of finding _faults in persons_.

And that's a relatively mild use of psychology. I simply can't imagine how annoying it would be to live in a household where everyone had memorized everything from connection theory to attachment theory to narrative therapy and routinely deployed hot takes on one another.

In actual philosophical discussion, back at the academy, psychologizing was considered 'below the belt', and would result in an intervention by the ref. Sometimes this was explicitly associated with something we called 'the Principle of Charity', which is that, out of an abundance of epistemic caution, you commit to always interpreting the motives and interests of your interlocutor in the kindest light possible, whether in 'steel manning' their arguments, or turning a strategically blind eye to bad behaviour in conversation.

The Principle of Charity is probably the most enduring lesson I took from my decade-long sojourn among the philosophers, and mutual psychological dissection is anathema to it.

rendx•4h ago
> to me, it seems like this is a cargo-cultish transposition of the act of finding _fallacies in arguments_ into the domain of finding _faults in persons_.

Well put, thanks!

throw4847285•1h ago
I actually think that the fact that rationalists use the term "steel manning" betrays a lack of charity.

If the only thing you owe your interlocutor is to use your "prodigious intellect" to restate their own argument in the way that sounds the most convincing to you, maybe you are in fact a terrible listener.

1attice•1h ago
Just so. I hate this term, and for essentially this reason, but it has undeniable currency right now; I was writing to be understood.
Dylan16807•35m ago
Listening to other viewpoints is hard. Restating is a good tool to improve listening and understanding. I don't agree with this criticism at all, since that "prodigious intellect" bit isn't inherent to the term.
MajimasEyepatch•7h ago
I feel this way about some of the more extreme effective altruists. There is no room for uncertainty or recognition of the way that errors compound.

- "We should focus our charitable endeavors on the problems that are most impactful, like eradicating preventable diseases in poor countries." Cool, I'm on board.

- "I should do the job that makes the absolute most amount of money possible, like starting a crypto exchange, so that I can use my vast wealth in the most effective way." Maybe? If you like crypto, go for it, I guess, but I don't think that's the only way to live, and I'm not frankly willing to trust the infallibility and incorruptibility of these so-called geniuses.

- "There are many billions more people who will be born in the future than those people who are alive today. Therefore, we should focus on long-term problems over short-term ones because the long-term ones will affect far more people." Long-term problems are obviously important, but the further we get into the future, the less certain we can be about our projections. We're not even good at seeing five years into the future. We should have very little faith in some billionaire tech bro insisting that their projections about the 22nd century are correct (especially when those projections just so happen to show that the best thing you can do in the present is buy the products that said tech bro is selling).

xg15•6h ago
The "longtermism" idea never made sense to me: So we should sacrifice the present to save the future. Alright. But then those future descendants would also have to sacrifice their present to save their future, etc. So by that logic, there could never be a time that was not full of misery. So then why do all of that stuff?
twic•6h ago
At some point in the future, there won't be more people who will live in the future than live in the present, at which point you are allowed to improve conditions today. Of course, by that point the human race is nearly finished, but hey.

That said, if they really thought hard about this problem, they would have come to a different conclusion:

https://theconversation.com/solve-suffering-by-blowing-up-th...

xg15•6h ago
Some time after we've colonized half the observable universe. Got it.
vharuck•6h ago
Zeno's poverty
rawgabbit•6h ago
To me it is a disguised way of saying the ends justify the means. Sure, we murder a few people today, but think of the utopian paradise we are building for the future.
cogman10•2h ago
From my observation, that "building the future" isn't something any of them are actually doing. Instead, the concept that "we might someday do something good with the wealth and power we accrue" seems to be the thought that allows the pillaging. It's a way to feel morally superior without actually doing anything morally superior.
vlowther•6h ago
"I came up with a step-by-step plan to achieve World Peace, and now I am on a government watchlist!"
to11mtm•4h ago
Well, there's a balance to be had. Do the most good you can while still being able to survive the rat race.

However, people are bad at that.

I'll give an interesting example.

Hybrid cars. Modern proper HEVs [0] usually benefit their owners, both by virtue of better fuel economy and, in most cases, by being overall more reliable than a normal car.

And, they are better on CO2 emissions and lower our oil consumption.

And yet most carmakers, as well as consumers, have been very slow to adopt. On the consumer side we are finally to where we have hybrid trucks that get 36-40 MPG while being capable of towing 4,000 pounds or hauling over 1,000 pounds in the bed [1], hybrid minivans capable of 35 MPG for transporting groups of people, hybrid sedans getting 50+ MPG, and small SUVs getting 35-40+ MPG for people who need a more normal 'people' car. And while they are selling better, it's insane that it took as long as it has to get here.

The main 'misery' you experience at that point is that you're driving the same car as a lot of other people, and it's not as exciting [2] as something with more power than most people know what to do with.

And hell, as they say in investing, sometimes the market can be irrational longer than you can stay solvent. E.g., was it truly worth it to Hydro-Quebec to sit on LiFePO4 patents the way they did vs. just figuring out licensing terms that got them a little bit of money and properly accelerated adoption of hybrids/EVs/etc.?

[0] - By this I mean something like Toyota's HSD-style setup used by Ford and Subaru, or Honda's or Hyundai/Kia's setups where there's still a more normal transmission involved.

[1] - Ford advertises up to 1,500 pounds, but I feel like the GVWR allows for a 25-pound driver at that point.

[2] - I feel like there's ways to make an exciting hybrid, but until there's a critical mass or Stellantis gets their act together, it won't happen...

BlueTemplar•3h ago
Not that these technologies don't have anything to bring, but any discussion that still presupposes that cars/trucks(/planes) (as we know them) still have a future is (mostly) a waste of time.

P.S.: The article mentions the "normal error-checking processes of society"... but what makes them so sure cults aren't part of them?

It's not like society is particularly good about it either, or immune from groupthink (see the issue above) - and who do you think is more likely to kick-start a strong enough alternative?

(Or are they just sad about all the failures? But it's questionable that the "process" can work (with all its vivacity) without the "failures"...)

human_person•5h ago
"I should do the job that makes the absolute most amount of money possible, like starting a crypto exchange, so that I can use my vast wealth in the most effective way."

Has always really bothered me because it assumes that there are no negative impacts of the work you did to get the money. If you do a million dollars worth of damage to the world and earn 100k (or a billion dollars worth of damage to earn a million dollars), even if you spend all of the money you earned on making the world a better place, you aren't even going to fix 10% of the damage you caused (and that's ignoring the fact that it's usually easier/cheaper to break things than to fix them).

to11mtm•4h ago
> If you do a million dollars worth of damage to the world and earn 100k (or a billion dollars worth of damage to earn a million dollars), even if you spend all of the money you earned on making the world a better place, you aren't even going to fix 10% of the damage you caused (and that's ignoring the fact that it's usually easier/cheaper to break things than to fix them).

You kinda summed up a lot of the post-industrial-revolution world there, at least as far as stuff like toxic waste (Superfund, anyone?) and climate change goes. I mean, for goodness sake, let's just think about TEL and how they knew ethanol could work but it just wasn't 'patentable'. [0] Or the "we don't even know the dollar amount because we don't have a workable solution" problem of PFAS.

[0] - I still find it shameful that a university is named after the man who enabled this to happen.

abtinf•6h ago
> non-rationalists do at least benefit from some intellectual humility

The Islamists who took out the World Trade Center don’t strike me as particularly intellectually humble.

If you reject reason, you are only left with force.

morleytj•6h ago
I now feel the need to comment that this thread does illustrate an issue I have with the naming of the philosophical/internet community of rationalism.

One can very clearly be a rational individual or an individual who practices reason and not associate with the internet community of rationalism. The median member of the group defined as "not being part of the internet-organized movement of rationalism and not reading lesswrong posts" is not "religious extremist striking the world trade center and committing an atrocious act of terrorism", it's "random person on the street."

And to preempt a specific response some may make to this: yes, the thread here is talking about rationalism as discussed in the blog post above, organized around Yudkowsky or Slate Star Codex, and not the rationalist movement of, like, Spinoza and company. Very different things philosophically.

prisenco•6h ago
Are you so sure the 9/11 hijackers rejected reason?

Why Are So Many Terrorists Engineers?

https://archive.is/XA4zb

Self-described rationalists can and often do rationalize acts and beliefs that seem baldly irrational to others.

cogman10•2h ago
Here's the thing, the goals of the terrorists weren't irrational.

People confuse "rational" with "moral". Those aren't the same thing. You can perfectly rationally do something that is immoral with a bad goal.

For example, if you value your life above all others, then it would be perfectly rational to slaughter an orphanage if a more powerful entity made that your only choice for survival. Morally bad, rationally correct.

montefischer•5h ago
Islamic fundamentalism and cult rationalism are both involved in a “total commitment”, “all or nothing” type of thinking. The former is totally committed to a particular literal reading of scripture, the latter, to logical deduction from a set of chosen premises. Both modes of thinking have produced violent outcomes in the past.

Skepticism, in which no premise or truth claim is regarded as above dispute (or in which it is always permissible and even praiseworthy to suspend one's judgment on a matter), is the better comparison with rationalism-fundamentalism. It is interesting that skepticism today is often associated with agnostic or atheist religious beliefs, but I consider many religious thinkers in history to have been skeptics par excellence when judged by the standard of their own time. E.g. William Ockham (of Ockham's razor) was a 14C Franciscan friar (and a fascinating figure) who denied papal infallibility. I count Martin Luther as belonging to the history of skepticism as well, along with much of the humanist movement that returned to the original Greek sources for the Bible in place of Jerome's Latin Vulgate translation.

The history of ideas is fun to read about. I am hardly an expert, but you may be interested in the history of Aristotelian rationalism, which gained prominence in the medieval West largely through the works of Averroes, a 12C Muslim philosopher who heavily favored Aristotle. In the 13C, Thomas Aquinas wrote a definitive Catholic systematic theology, rejecting Averroes but embracing Aristotle. To this day, Catholic theology is still essentially Aristotelian.

throwway120385•4h ago
The only absolute above questioning is that there are no absolutes.
praptak•2h ago
True skepticism is rare. It's easy to be skeptical only about beliefs you dislike or at least don't care about. It's hard to approach the 100th self-professed psychic with an honest intention to truly test their claims rather than to find the easiest way to ridicule them.
tibbar•6h ago
Yet I think most people err in the other direction. They 'know' the basics of health, of discipline, of charity, but have a hard time following through. 'Take a simple idea, and take it seriously': a favorite aphorism of Charlie Munger. Most of the good things in my life have come from trying to follow through the real implications of a theoretical belief.
bearl•5h ago
And “always invert”! A related mungerism.
more_corn•4h ago
I always get weird looks when I talk about killing as many pilots as possible. I need a new example of the always invert model of problem solving.
godelski•6h ago

  > I don’t think it’s just (or even particularly) bad axioms
IME most people aren't very good at building axioms. I hear a lot of people say "from first principles", and it is a pretty good indication that they will not be. First principles require a lot of effort to create. They require iteration. They require a lot of nuance, care, and precision. And of course they do! They are the foundation of everything else that is about to come. This is why I find it so odd when people say "let's work from first principles" and then just state something matter-of-factly and follow from there. If you want to really do this, you start simple, attack your own assumptions, reform, build, attack, and repeat.

This is how you reduce the leakiness, but I think it is categorically the same problem as the bad axioms. It is hard to challenge yourself and we often don't like being wrong. It is also really unfortunate that small mistakes can be a critical flaw. There's definitely an imbalance.

  >> The smartest people I have ever known have been profoundly unsure of their beliefs and what they know.
This is why the OP is seeing this behavior. Because the smartest people you'll meet are constantly challenging their own ideas. They know they are wrong to at least some degree. You'll sometimes find them talking with a bit of authority at first, but a key part is watching how they deal with challenges to their assumptions. Ask them what would cause them to change their minds. Ask them about nuances and details. They won't always dig into those cans of worms, but they will be aware of them and maybe nervous or excited about going down that road (or do they just outright dismiss it?). They understand that accuracy is proportional to computation, and that computation increases exponentially as you converge on accuracy. These are strong indications, since they'll suggest whether they care more about the right answer or about being right. You also don't have to be very smart to detect this.
joe_the_user•2h ago
IME most people aren't very good at building axioms.

It seems you implying that some people are good building good axiom systems for the real world. I disagree. There are a few situations in the world where you have generalities so close to complete that you can use simple logic on them. But for the messy parts of the real world, there simply is no set of logical claims which can provide anything like certainty no matter how "good" someone is at "axiom creation".

godelski•2h ago
I don't even know what you're arguing.

  > you implying that some people are good building good axiom systems
How do you go from "most people aren't very good" to "this implies some people are really good"? First, that is just a really weird interpretation of how people speak (btw, "you're" not "you" ;) because this is nicer and going to be received better than "making axioms is hard and people are shit at it." Second, you've assumed a binary condition. Here's an example. "Most people aren't very good at programming." This is an objectively true statement, right?[0] I'll also make the claim that no one is a good programmer, but some programmers are better than others. There's no contradiction in those two claims, even if you don't believe the latter is true.

Now, there are some pretty good axiom systems. ZF and ZFC seem to be working pretty well. There are others too, and they are used for pretty complex stuff. They all work at least for "simple logic."

But then again, you probably weren't thinking of things like ZFC. But hey, that was kinda my entire point.

> there simply is no set of logical claims which can provide anything like certainty no matter how "good" someone is at "axiom creation".
 
I agree. I'd hope I agree, considering my username... But you've jumped to a much stronger statement. I hope we both agree that just because there are things we can't prove, it doesn't mean there aren't things we can prove. Similarly, I hope we agree that even if we can't prove anything to absolute certainty, it doesn't mean we can't prove things to an incredibly high level of certainty, or that we can't show something is more right than something else.

[0] Most people don't even know how to write a program. Well... maybe everyone can write a Perl program but let's not get into semantics.

Dylan16807•57m ago
If you mean nobody is good at something, just say that.

Saying most people aren't good at it DOES imply that some are good at it.

dan_quixote•6h ago
As a former mechanical engineer, I visualize this phenomenon like a "tolerance stackup". Effectively meaning that for each part you add to the chain, you accumulate error. If you're not damn careful, your assembly of parts (or conclusions) will fail to measure up to expectations.
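
To make the stackup idea concrete, here's a toy sketch (made-up numbers, not from any real assembly): the worst case simply sums the tolerances, while the common root-sum-square (RSS) estimate assumes the errors are independent.

  import math

  # Ten parts in a chain, each with a +/-0.1 mm tolerance.
  n, tol = 10, 0.1
  worst_case = n * tol          # every error lines up: 1.00 mm
  rss = math.sqrt(n) * tol      # independent errors: ~0.32 mm
  print(f"worst case: {worst_case:.2f} mm, RSS: {rss:.2f} mm")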
godelski•5h ago
I like this approach. Also having dipped my toes in the engineering world (professionally), I think it naturally follows that you should be constantly rechecking your designs. Those tolerances were fine to begin with, but are they still, now that things have changed? It also makes you think about failure modes. What can make this all come down, and if it does, what way will it fail? Which is really useful, because you can then leverage this to design things to fail in certain ways, and now you've got a testable hypothesis. It won't create proof, but it at least helps in finding flaws.
to11mtm•5h ago
I like this analogy.

I think of a bike's shifting systems; better shifters, better housings, better derailleur, or better chainrings/cogs can each 'improve' things.

I suppose where that becomes relevant here is that you can have very fancy parts on various ends, but if there's a piece in the middle that's wrong you're still gonna get shit results.

dylan604•4h ago
You're only as strong as the weakest link.

Your SCSI devices are only as fast as the slowest device in the chain.

I don't need to be faster than the bear, I only have to be faster than you.

jandrese•1h ago
> Your SCSI devices are only as fast as the slowest device in the chain.

There are not many forums where you would see this analogy.

robocat•4h ago
I saw an article recently that talked about stringing likely inferences together but ending up with an unreliable outcome, because enough 0.9 probabilities one after the other lead to an unlikely conclusion.

Edit: Couldn't find the article, but AI referenced the Bayesian "chain of reasoning fallacy".
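
The decay itself is easy to check, assuming each step is independent and each is only about 0.9 likely:

  # Seven chained ~90% inferences already land below a coin flip.
  for n in range(1, 8):
      print(n, round(0.9 ** n, 3))  # 1: 0.9 ... 7: 0.478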

godelski•4h ago
I think you have this oversimplified. Stringing together inferences can take us in either direction. It really depends on how things are being done, and this isn't always so obvious or simple. But just to show both directions I'll give two simple examples (the real world holds many more complexities).

It is all about what is being modeled and how the inferences string together. If these are being multiplied, then yes, this is going to decrease, as xy < x and xy < y for every x,y < 1.

But a good counterexample is the classic Bayesian inference example[0]. Suppose you have a test that detects vampirism with 95% accuracy (Pr(+|vampire) = 0.95) and has a false positive rate of 1% (Pr(+|mortal) = 0.01). But vampirism is rare, affecting only 0.1% of the population. This ends up meaning a positive test only gives us an 8.7% likelihood of a subject being a vampire (Pr(vampire|+)). The solution here is that we repeat the testing. On our second test Pr(vampire) changes from 0.001 to 0.087 and Pr(vampire|+) goes to about 90%, and a third test gets us to about 99%.

[0] Our equation is

                  Pr(+|vampire)Pr(vampire)
  Pr(vampire|+) = ------------------------
                           Pr(+)
And the crux is Pr(+) = Pr(+|vampire)Pr(vampire) + Pr(+|mortal)(1-Pr(vampire))
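
A quick sketch of that repeated updating in code (same numbers as above; a toy illustration, not a claim about real test design):

  # Repeated Bayesian updating on the vampire test described above.
  def update(prior, sens=0.95, fpr=0.01):
      p_pos = sens * prior + fpr * (1 - prior)  # Pr(+), the crux above
      return sens * prior / p_pos               # Pr(vampire|+)

  prior = 0.001
  for test in (1, 2, 3):
      prior = update(prior)
      print(f"after test {test}: {prior:.3f}")  # 0.087, 0.900, 0.999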
p1necone•3h ago
Worth noting that solution only works if the false positives are totally random, which is probably not true of many real world cases and would be pretty hard to work out.
godelski•3h ago
Definitely. The real world adds lots of complexities and nuances, but I was just trying to make the point that it matters how those inferences compound. We can't just conclude that compounding inferences decreases likelihood.
Dylan16807•32m ago
Well they were talking about a chain, A->B, B->C, C->D.

You're talking about multiple pieces of evidence for the same statement. Your tests don't depend on any of the previous tests also being right.

wombatpm•2h ago
Can't you improve things if you can calibrate with a known good vampire? You'd think NIST or the CDC would have one locked in a basement somewhere.
weard_beard•1h ago
GPT-6 would come faster, but we ran out of Cassandra blood.
godelski•1h ago
IDK, probably? I'm just trying to say that iterative inference doesn't strictly mean decreasing likelihood.

I'm not a virologist or whoever designs these kinds of medical tests. I don't even know the right word to describe the profession lol. But the question is orthogonal to what's being discussed here. I'm only guessing "probably" because usually having a good example helps in experimental design. But then again, why wouldn't the original test that we're using have done that already? Wouldn't that be how you get that 95% accurate test?

I can't tell you the biology stuff, I can just answer math and ML stuff and even then only so much.

guerrilla•4h ago
This is what I hate about real life electronics. Everything is nice on paper, but physics sucks.
godelski•4h ago

  > Everything is nice on paper
I think the reason this is true is mostly because of how people do things "on paper". We can get much more accurate with "on paper" modeling, but the amount of work increases very fast. So it tends to be much easier to calculate things as if they were spherical chickens in a vacuum and account for error than it is to calculate including things like geometry, drag, resistance, and all that other fun jazz (where you still need to account for error/uncertainty, though it can now be smaller).

I think the important lesson, at the end of the day, is that simple explanations can be good approximations that get us most of the way there, but the details and nuances shouldn't be so easily dismissed. With this framing we can choose how we pick our battles. Is it cheaper/easier/faster to run a very accurate sim, or cheaper/easier/faster to iterate in physical space?

ctkhn•3h ago
Basically the same as how dead reckoning your location works worse the longer you've been traveling?
toasterlovin•2h ago
Dead reckoning is a great analogy for coming to conclusions based on reason alone. Always useful to check in with reality.
guerrilla•4h ago
> I don’t think it’s just (or even particularly) bad axioms, I think it’s that people tend to build up “logical” conclusions where they think each step is a watertight necessity that follows inevitably from its antecedents, but actually each step is a little bit leaky, leading to runaway growth in false confidence.

This is what you get when you naively re-invent philosophy from the ground up while ignoring literally 2500 years of actual debugging of such arguments by the smartest people who ever lived.

You can't diverge from and improve on what everyone else did AND be almost entirely ignorant of it, let alone have no training whatsoever in it. This extreme arrogance I would say is the root of the problem.

ar-nelson•8h ago
I find Yudkowsky-style rationalists morbidly fascinating in the same way as Scientologists and other cults. Probably because they seem to genuinely believe they're living in a sci-fi story. I read a lot of their stuff, probably too much, even though I find it mostly ridiculous.

The biggest nonsense axiom I see in the AI-cult rationalist world is recursive self-improvement. It's the classic reason superintelligence takeoff happens in sci-fi: once AI reaches some threshold of intelligence, it's supposed to figure out how to edit its own mind, do that better and faster than humans, and exponentially leap into superintelligence. The entire "AI 2027" scenario is built on this assumption; it assumes that soon LLMs will gain the capability of assisting humans on AI research, and AI capabilities will explode from there.

But AI being capable of researching or improving itself is not obvious; there's so many assumptions built into it!

- What if "increasing intelligence", which is a very vague goal, has diminishing returns, making recursive self-improvement incredibly slow?

- Speaking of which, LLMs already seem to have hit a wall of diminishing returns; it seems unlikely they'll be able to assist cutting-edge AI research with anything other than boilerplate coding speed improvements.

- What if there are several paths to different kinds of intelligence with their own local maxima, in which the AI can easily get stuck after optimizing itself into the wrong type of intelligence?

- Once AI realizes it can edit itself to be more intelligent, it can also edit its own goals. Why wouldn't it wirehead itself? (short-circuit its reward pathway so it always feels like it's accomplished its goal)

Knowing Yudkowsky, I'm sure there's a long blog post somewhere where all of these are addressed with several million rambling words of theory, but I don't think any amount of doing philosophy in a vacuum without concrete evidence could convince me that fast-takeoff superintelligence is possible.

JKCalhoun•7h ago
An interesting point you make there — one would assume that if recursive self-improvement were a thing, Nature would have already led humans into that "hall of mirrors".
marcosdumay•7h ago
Well, arguably that's exactly where we are, but machines can evolve faster.

And that's an entire new angle that the cultists are ignoring... because superintelligence may just not be very valuable.

And we don't need superintelligence for smart machines to be a problem anyway. We don't need even AGI. IMO, there's no reason to focus on that.

derefr•6h ago
> Well, arguably that's exactly where we are

Yep; from the perspective of evolution (and more specifically, those animal species that only gain capability generationally by evolutionary adaptation of instinct), humans are the recursively self-(fitness-)improving accident.

Our species-aggregate capacity to compete for resources within the biosphere went superlinear in the middle of the previous century, and we've had to actively hit the brakes on how much of everything we take since then, handicapping ourselves. (With things like epidemic obesity and global climate change being the result of us not hitting those brakes quite hard enough.)

Insofar as a "singularity" can be defined on a per-agent basis, as the moment when something begins to change too rapidly for the given agent to ever hope to catch up with / react to new conditions — and so the agent goes from being a "player at the table" to a passive observer of what's now unfolding around them... then, from the rest of our biosphere's perspective, they've 100% already witnessed the "human singularity."

No living thing on Earth besides humans now has any comprehension of how the world has been or will be reshaped by human activity; nor can ever hope to do anything to push back against such reshaping. Every living thing on Earth other than humans, will only survive into the human future, if we humans either decide that it should survive, and act to preserve it; or if we humans just ignore the thing, and then just-so-happen to never accidentally do anything to wipe it from existence without even noticing.

twic•6h ago
There's a variant of this that argues that humans are already as intelligent as it's possible to be. Because if it's possible to be more intelligent, why aren't we? And a slightly more reasonable variant that argues that we're already as intelligent as it's useful to be.
danaris•6h ago
While I'm deeply and fundamentally skeptical of the recursive self-improvement/singularity hypothesis, I also don't really buy this.

There are some pretty obvious ways we could improve human cognition if we had the ability to reliably edit or augment it. Better storage & recall. Lower distractibility. More working memory capacity. Hell, even extra hands for writing on more blackboards or putting up more conspiracy theory strings at a time!

I suppose it might be possible that, given the fundamental design and structure of the human brain, none of these things can be improved any further without catastrophic side effects—but since the only "designer" of its structure is evolution, I think that's extremely unlikely.

JKCalhoun•4h ago
Some of your suggestions, if you don't mind my saying, seem like only modest improvements — akin to Henry Ford's quote “If I had asked people what they wanted, they would have said a faster horse.”

To your point though, an electronic machine is a different host altogether with different strengths and weaknesses.

danaris•4h ago
Well, twic's comment didn't say anything about revolutionary improvements, just "maybe we're as smart as we can be".
lukan•5h ago
"Because if it's possible to be more intelligent, why aren't we?"

Because deep abstract thoughts about the nature of the universe and elaborate deep thinking were maybe not as useful while we were chasing lions and buffaloes with a spear?

We just had to be smarter than them. Which included finding out that tools were great, learning about the habits of the prey, and optimizing hunting success. Those who were smarter in that capacity had a greater chance of reproducing. Those who just excelled at thinking likely did not live that long.

tshaddox•5h ago
Is it just dumb luck that we're able to create knowledge about black holes, quarks, and lots of things in between which presumably had zero evolutionary benefit before a handful of generations ago?
lukan•3h ago
Evolution rewarded us for developing general intelligence. But with a very immediate practical focus and not too much specialisation.
bee_rider•3h ago
Basically yes it is luck, in the sense that evolution is just randomness with a filter of death applied, so whatever brains we happen to have are just luck.

The brains we did end up with are really bad at creating that sort of knowledge. Almost none of us can. But we’re good at communicating, coming up with simplified models of things, and seeing how ideas interact.

We’re not universe-understanders, we’re behavior modelers and concept explainers.

godelski•5h ago
I don't think the logic follows here. Nor does it match evidence.

The premise is ignorant of time. It is also ignorant of the fact that we know there's a lot of things we don't know. That's all before we consider other factors like if there are limits and physical barriers or many other things.

Terr_•5h ago
I often like to point out that Earth was already consumed by Grey Goo, and today we are hive-minds in titanic mobile megastructure-swarms of trillions of the most complex nanobots in existence (that we know of), inheritors of tactics and capabilities from a zillion years of physical and algorithmic warfare.

As we imagine the ascension of AI/robots, it may seem like we're being humble about ourselves... But I think it's actually the reverse: It's a kind of hubris elevating our ability to create over the vast amount we've inherited.

tim333•7h ago
I've pondered recursive self-improvement. I'm fairly sure it will be a thing - we're at a point already where people could try telling Claude or some such to have a go, even if not quite at a point where it would work. But I imagine takeoff would be very gradual. It would be constrained by available computing resources, and it would probably only be comparable to current human researchers, so it would still take ages to get anywhere.
tempfile•6h ago
I honestly am not trying to be rude when I say this, but this is exactly the sort of speculation I find problematic and that I think most people in this thread are complaining about. Being able to tell Claude to have a go has no relation at all to whether it may ever succeed, and you don't actually address any of the legitimate concerns the comment you're replying to points out. There really isn't anything in this comment but vibes.
doubleunplussed•3h ago
On the other hand, I'm baffled to encounter recursive self-improvement being discussed as something not only weird to expect, but as damning evidence of sloppy thinking by those who speculate about it.

We have an existence proof for intelligence that can improve AI: humans.

If AI ever gets to human-level intelligence, it would be quite strange if it couldn't improve itself.

Are people really that sceptical that AI will get to human level intelligence?

Is that an insane belief worthy of being a primary example of a community not thinking clearly?

Come on! There is a good chance AI will recursively self-improve! Those poo pooing this idea are the ones not thinking clearly.

tim333•3h ago
I don't think it's vibes so much as my thinking about the problem.

If you look at the "legitimate concerns" none are really deal breakers:

>What if "increasing intelligence", which is a very vague goal, has diminishing returns, making recursive self-improvement incredibly slow?

I'm willing to believe it will be slow, though maybe it won't be.

>LLMs already seem to have hit a wall of diminishing returns

Who cares - there will be other algorithms

>What if there are several paths to different kinds of intelligence with their own local maxima

well maybe, maybe not

>Once AI realizes it can edit itself to be more intelligent, it can also edit its own goals. Why wouldn't it wirehead itself?

well - you can make another one if the first does that

Those are all potential difficulties with self improvement, not reasons it will never happen. I'm happy to say it's not happening right now but do you have any solid arguments that it won't happen in the next century?

To me the arguments against sound like people in the 1800s discussing powered flight and saying it'll never happen because steam engine development has slowed.

PaulHoule•6h ago
Yeah, to compare Yudkowsky to Hubbard: I've read accounts of people who read Dianetics or Science of Survival and thought "this is genius!", and I'm scratching my head; it's like they never read Freud or Horney or Beck or Berne or Burns or Rogers or Kohut, really any clinical psychology at all, even anything in the better 70% of pop psychology. Like Hubbard, Yudkowsky is unreadable, rambling [1] and inarticulate -- how anybody falls for it boggles my mind [2]. But hey, people fell for Carlos Castaneda, who never used a word of the Yaqui language or mentioned any plant that grows in the desert in Mexico, but has Don Juan give lectures about Kant's Critique of Pure Reason [3] that Castaneda would have heard in school, and you would have heard in school too if you went to school, or would have read if you read a lot.

I can see how it appeals to people like Aella who wash into San Francisco without exposure to education [4] or philosophy or computer science or any topics germane to the content of Sequences -- not that it means you are stupid, but, like Dianetics, Sequences wouldn't be appealing if you were at all well read. How people at frickin' Oxford or Stanford fall for it is beyond me, however.

[1] some might even say a hypnotic communication pattern inspired by Milton Erickson

[2] you think people would dismiss Sequences because it's a frickin' Harry Potter fanfic, but I think it's like the 419 scam email which is riddled by typos which is meant to drive the critical thinker away and, ironically in the case of Sequences, keep the person who wants to cosplay as a critical thinker.

[3] minus any direct mention of Kant

[4] thus many of the marginalized, neurodivergent, transgender who left Bumfuck, AK because they couldn't live at home and went to San Francisco to escape persecution as opposed to seek opportunity

nemomarx•5h ago
To nitpick, I thought the Sequences were the blog posts and the fanfic was kept separate.
ufmace•6h ago
I agree. There's also the point of hardware dependence.

From all we've seen, the practical ability of AI/LLMs seems to be strongly dependent on how much hardware you throw at it. Seems pretty reasonable to me - I'm skeptical that there's that much out there in gains from more clever code, algorithms, etc on the same amount of physical hardware. Maybe you can get 10% or 50% better or so, but I don't think you're going to get runaway exponential improvement on a static collection of hardware.

Maybe they could design better hardware themselves? Maybe, but then the process of improvement is still gated behind how fast we can physically build next-generation hardware, perfect the tools and techniques needed to make it, deploy with power and cooling and datalinks and all of that other tedious physical stuff.

morleytj•5h ago
The built in assumptions are always interesting to me, especially as it relates to intelligence. I find many of them (though not all), are organized around a series of fundamental beliefs that are very rarely challenged within these communities. I should initially mention that I don't think everyone in these communities believes these things, of course, but I think there's often a default set of assumptions going into conversations in these spaces that holds these axioms. These beliefs more or less seem to be as follows:

1) They believe that there exists a singular factor to intelligence in humans which largely explains capability in every domain (a super g factor, effectively).

2) They believe that this factor is innate, highly biologically regulated, and a static property of a person (someone who is high-IQ in their minds must have been a high-achieving child and must be very capable as an adult; these are the baseline assumptions). There is potentially belief that this can be shifted in certain directions, but broadly there is an assumption that you either have it or you don't; there is no sense of it as something that could be taught or developed without pharmaceutical intervention or some other method.

3) There is also broadly a belief that this factor is at least fairly accurately measured by modern psychometric IQ tests and educational achievement, and that this factor is a continuous measurement with no bounds on it (You can always be smarter in some way, there is no max smartness in this worldview).

These are things that certainly could be true, and perhaps I haven't read enough into the supporting evidence for them but broadly I don't see enough evidence to have them as core axioms the way many people in the community do.

More to your point though, when you think of the world from those sorts of axioms above, you can see why an obsession would develop with the concept of a certain type of intelligence being recursively improving. A person who has become convinced of their moral placement within a societal hierarchy based on their innate intellectual capability has to grapple with the fact that there could be artificial systems which score higher on the IQ tests than them, and if those IQ tests are valid measurements of this super intelligence factor in their view, then it means that the artificial system has a higher "ranking" than them.

Additionally, in the mind of someone who has internalized these axioms, there is no vagueness about increasing intelligence! For them, intelligence is the animating factor behind all capability, it has a central place in their mind as who they are and the explanatory factor behind all outcomes. There is no real distinction between capability in one domain or another mentally in this model, there is just how powerful a given brain is. Having the singular factor of intelligence in this mental model means being able to solve more difficult problems, and lack of intelligence is the only barrier between those problems being solved vs unsolved. For example, there's a common belief among certain groups among the online tech world that all governmental issues would be solved if we just had enough "high-IQ people" in charge of things irrespective of their lack of domain expertise. I don't think this has been particularly well borne out by recent experiments, however. This also touches on what you mentioned in terms of an AI system potentially maximizing the "wrong types of intelligence", where there isn't a space in this worldview for a wrong type of intelligence.

godelski•5h ago

  > The biggest nonsense axiom I see in the AI-cult rationalist world is recursive self-improvement. 
This is also the weirdest thing, and I don't think they even know the assumption they are making. It assumes that there is infinite knowledge to be had. It also ignores that we have exceptionally strong indications that accuracy (truth, knowledge, whatever you want to call it) grows exponentially in complexity. These may be wrong assumptions, but we at least have evidence for them, and much more for the latter. So if objective truth exists, then that intelligence gap is very, very different. One way they could be right is for this to be an S-curve with us humans at the very bottom. That seems unlikely, though very possible. But they always treat this as linear or exponential, as if our understanding relative to the AI will be like an ant trying to understand us.

The other weird assumption I hear is about how it'll just kill us all. The vast majority of smart people I know are very peaceful. They aren't even seeking power or wealth. They're too busy thinking about things and trying to figure everything out. They're much happier in front of a chalkboard than sitting on a yacht. And humans ourselves are incredibly compassionate towards other creatures. Maybe we learned this because coalitions are an incredibly powerful thing, but the truth is that if I could talk to an ant I'd choose that over laying traps. Really, that would be so much easier too! I'd even rather dig a small hole to get them started somewhere else than drive down to the store and do all that. A few shovels in the ground is less work, and I'd ask them to not come back and to tell the others.

Granted, none of this is absolutely certain. It'd be naive to assume that we know! But it seems like these cults are operating on the premise that they do know and that these outcomes are certain. It seems to just be preying on fear and uncertainty. Hell, even Altman does this, ignoring risk and concern of existing systems by shifting focus to "an even greater risk" that he himself is working towards (You can't simultaneously maximize speed and safety). Which, weirdly enough might fulfill their own prophesies. The AI doesn't have to become sentient but if it is trained on lots of writings about how AI turns evil and destroys everyone then isn't that going to make a dumb AI that can't tell fact from fiction more likely to just do those things?

empiricus•4h ago
So many things make no sense in this comment that I feel there's a 20% chance this is a mid-quality GPT. And so much interpolation effort, but starting from hearsay instead of primary sources. Then the threads stop just before seeing the contradiction with the other threads. I imagine this is how we all reason most of the time, just based on vibes :(
godelski•2h ago
Sure, I wrote a lot and it's a bit scattered. You're welcome to point to something specific but so far you haven't. Ironically, you're committing the error you're accusing me of.

I'm also not exactly sure what you mean because the only claim I've made is that they've made assumptions where there are other possible, and likely, alternatives. It's much easier to prove something wrong than prove it right (or in our case, evidence, since no one is proving anything).

So the first part I'm saying we have to consider two scenarios. Either intelligence is bounded or unbounded. I think this is a fair assumption, do you disagree?

In an unbounded case, their scenario can happen. So I don't address that. But if you want me to, sure. It's because I have no reason to believe information is unbounded when everything around me suggests that it is bounded. Maybe start with the Bekenstein bound. Sure, it doesn't prove information is bounded, but you'd then need to convince me that an entity not subject to our universe and our laws of physics is going to care about us and be malicious. Hell, that entity wouldn't even be subject to time, and we're still living.

In a bounded case it can happen, but we need to understand what conditions that requires. There are a lot of functions, but I went with the S-curve for simplicity and familiarity. It'll serve fine (we're on HN man...) for any monotonically increasing case (or even non-monotonic, it just needs to tend that way).

So think about it. Change the function if you want, I don't care. But if intelligence is bounded, then if we're x more intelligent than ants, where on the graph do we need to be for another thing to be x more intelligent than us? There are not a lot of opportunities for that even to happen. It requires our intelligence (on that hypothetical scale) to be pretty similar to an ant's. What cannot happen is for the ant to be in the tail of that function while we are further along than the inflection point (halfway). There just isn't enough space on that y-axis for anything to be x more intelligent. This doesn't completely reject that crazy superintelligence, but it does place some additional constraints that we can use to reason about things. For the "AI will be [human to ant difference] more intelligent than us" argument to follow, it would require us to be pretty fucking dumb, and in that case we're pretty fucking dumb and it'd be silly to think we can make these types of predictions with reasonable accuracy (also true in the unbounded case!).

Yeah, I'll admit that this is a very naïve model but again, we're not trying to say what's right but instead just say there's good reason to believe their assumption is false. Adding more complexity to this model doesn't make their case stronger, it makes it weaker.

The second part I can make much easier to understand.

Yes, there are bad smart people, but look at the smartest people in history. Did they seek power or wish to do harm? Most of the great scientists did not. A lot of them were actually quite poor, and many even died fighting persecution.

So we can't conclude that greater intelligence results in greater malice. This isn't hearsay, I'm just saying Newton wasn't a homicidal maniac. I know, bold claim...

  > starting from hearsay
I don't think this word means what you think it means. Just because I didn't link sources doesn't make it a rumor. You can validate them and I gave you enough information to do so. You now have more. Ask gpt for links, I don't care, but people should stop worshiping Yud
jandrese•1h ago
I think of it more like visualizing a fractal on a computer. The more detail you try to dig down into the more detail you find, and pretty quickly you run out of precision in your model and the whole thing falls apart. Every layer further down you go the resource requirements increase by an exponential amount. That's why we have so many LLMs that seem beautiful at first glance but go to crap when the details really matter.
tshaddox•5h ago
> - What if there are several paths to different kinds of intelligence with their own local maxima, in which the AI can easily get stuck after optimizing itself into the wrong type of intelligence?

I think what's more plausible is that there is general intelligence, and humans have that, and it's general in the same sense that Turing machines are general, meaning that there is no "higher form" of intelligence that has strictly greater capability. Computation speed, memory capacity, etc. can obviously increase, but those are available to biological general intelligences just like they would be available to electronic general intelligences.

sfink•1h ago
> What if "increasing intelligence", which is a very vague goal, has diminishing returns, making recursive self-improvement incredibly slow?

This is sort of what I subscribe to as the main limiting factor, though I'd describe it differently. It's sort of like Amdahl's Law (and I imagine there's some sort of Named law that captures it, I just don't know the name): the magic AI wand may be very good at improving some part of AGI capability, but the more you improve that part, the more the other parts come to dominate. Metaphorically, even if the juice is worth the squeeze initially, pretty soon you'll only be left with a dried-out fruit clutched in your voraciously energy-consuming fist.
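
For reference, a minimal sketch of the Amdahl's Law intuition; treating "the part the wand can improve" as a fixed fraction p of the work is my simplification:

  # Overall speedup when only a fraction p of the work speeds up by s;
  # the untouched 1-p share comes to dominate as s grows.
  def amdahl(p, s):
      return 1 / ((1 - p) + p / s)

  print(amdahl(0.9, 10))            # ~5.3x
  print(amdahl(0.9, float("inf")))  # capped at 10x, however good s gets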

I'm actually skeptical that there's much juice in the first place; I'm sure today's AIs could generate lots of harebrained schemes for improvement very quickly, but exploring those possibilities is mind-numbingly expensive. Not to mention that the evaluation functions are unreliable, unknown, and non-monotonic.

Then again, even the current AIs have convinced a large number of humans to put a lot of effort into improving them, and I do believe that there are a lot of improvements that humans are capable of making to AI. So the human-AI system does appear to have some juice left. Where we'll be when that fruit is squeezed down to a damp husk, I have no idea.

ambicapter•8h ago
One of the only idioms that I don't mind living my life by is, "Follow the truth-seeker, but beware those who've found it".
JKCalhoun•7h ago
Interesting. I can't say I've done much following though — not that I am aware of anyway. Maybe I just had no leaders growing up.
lordnacho•8h ago
> I immediately become suspicious of anyone who is very certain of something

Me too, in almost every area of life. There's a reason it's called a conman: they are tricking your natural sense that confidence is connected to correctness.

But also, even when it isn't about conning you, how do people become certain of something? They ignore the evidence against whatever they are certain of.

People who actually know what they're talking about will always restrict the context and hedge their bets. Their explanations are tentative, filled with ifs and buts. They rarely say anything sweeping.

dcminter•7h ago
In the term "conman" the confidence in question is that of the mark, not the perpetrator.
sdwr•6h ago
Isn't confidence referring to the alternate definition of trust, as in "taking you into his confidence"?
godelski•6h ago
I think if you used that definition you could equally say "it is the mark that is taking the conman into [the mark's] confidence"
JKCalhoun•7h ago
You're describing the impressions I had of MENSA back in the 70's.
jpiburn•7h ago
"Cherish those who seek the truth but beware of those who find it" - Voltaire
paviva•6h ago
Most likely Gide ("Croyez ceux qui cherchent la vérité, doutez de ceux qui la trouvent", "Believe those who seek Truth, doubt those who find it") and not Voltaire ;)

Voltaire was generally more subtle: "un bon mot ne prouve rien", a witty saying proves nothing, as he'd say.

inasio•7h ago
I once saw a discussion about how people should not have kids, as it's by far the highest increase in your carbon footprint in your lifetime (>10x going vegan, etc.), get driven all the way to advocating genocide as a way of carbon footprint minimization.
derektank•7h ago
Setting aside the reductio ad absurdum of genocide, this is an unfortunately common viewpoint. People really need to take into account the chances their child might wind up working on science or technology which reduces global CO2 emissions or even captures CO2. This reasoning can be applied to all sorts of naive "more people bad" arguments. I can't imagine where the world would be if Norman Borlaug's parents had decided to never have kids out of concern for global food insecurity.
freedomben•6h ago
It also entirely subjugates the economic realities that we (at least currently) live in to the future health of the planet. I care a great deal about the Earth and our environment, but the more I've learned about stuff the more I've realized that anyone advocating for focusing on one without considering the impact on the other is primarily following a religion
mapontosevenths•4h ago
> It also entirely subjugates the economic realities that we...

To play devil's advocate, you could be seen as trying to subjugate the world's health to your own economic well-being, and far fewer people are concerned with your tax bracket than there are people on Earth. In a pure democracy, I'm fairly certain the planet's well-being would be deemed more important than the economy of whatever nation you live in.

> advocating for focusing on one... is primarily following a religion

Maybe, but they could also just be doing the risk calculus a bit differently. If you are a many-step thinker, the long-term fecundity of our species might feel more important than any level of short-term financial motivation.

freejazz•5h ago
Insane to call "more people bad" naive but then actually try and account for what would otherwise best be described as hope.
mapontosevenths•4h ago
> this is an unfortunately common viewpoint

Not everyone believes that the purpose of life is to make more life, or that having been born onto team human automatically qualifies team human as the best team. It's not necessarily unfortunate.

I am not a rationalist, but rationally that whole "the meaning of life is human fecundity" shtick is after-school-special tautological nonsense, and that seems to be the assumption buried in your statement. Try defining what you mean without causing yourself some sort of recursion headache.

> their child might wind up..

They might also grow up to be a normal human being, which is far more likely.

> if Norman Borlaug's parents had decided to never have kids

Again, this would only have mattered if you consider the well being of human beings to be the greatest possible good. Some people have other definitions, or are operating on much longer timescales.

Dylan16807•24m ago
> People really need to take into account the chances their child might wind up working on science or technology which reduces global CO2 emissions or even captures CO2.

All else equal, it would be better to spread those chances across a longer period of time at a lower population with lower carbon use.

throw0101a•6h ago
> I once saw a discussion about how people should not have kids, as it's by far the highest increase in your carbon footprint in your lifetime (>10x going vegan, etc.), get driven all the way to advocating genocide as a way of carbon footprint minimization.

The opening scene of Utopia (UK) s2e6 goes over this:

> "Why did you have him then? Nothing uses carbon like a first-world human, yet you created one: why would you do that?"

* https://www.youtube.com/watch?v=rcx-nf3kH_M

uoaei•6h ago
This is why it's important to emphasize that rationality is not a good goal to have. Rationality is nothing more than applied logic, which takes axioms as given and deduces conclusions from there.

Reasoning is the appropriate target because it is a self-critical, self-correcting method that continually re-evaluates axioms and methods to express intentions.

amanaplanacanal•6h ago
It's very tempting to try to reason things through from first principles. I do it myself, a lot. It's one of the draws of libertarianism, which I've been drawn to for a long time.

But the world is way more complex than the models we used to derive those "first principles".

BobaFloutist•2h ago
It's also very fun and satisfying. But it should be limited to an intellectual exercise at best, and more likely a silly game. Because there's no true first principle, you always have to make some assumption along the way.
zaphar•6h ago
The distinction between them and religion is that religion is free to say that those axioms are a matter of faith and treat them as such. Rationalists are not as free to do so.
EGreg•5h ago
There are certain things I am sure of even though I derived them on my own.

But I constantly battle-tested them against other smart people's views, and only after I ran out of people to bring me new rational objections did I become sure.

Now I can battle test them against LLMs.

On a lesser level of confidence, I have also found a lot of times the people who disagreed with what I thought had to be the case, later came to regret it because their strategies ended up in failure and they told me they regretted not taking my recommendation. But that is on an individual level. I have gotten pretty good at seeing systemic problems, architecting systemic solutions, and realizing what it would take to get them adopted to at least a critical mass. Usually, they fly in the face of what happens normally in society. People don’t see how their strategies and lives are shaped by the technology and social norms around them.

Here, I will share three examples:

Public Health: https://www.laweekly.com/restoring-healthy-communities/

Economic and Governmental: https://magarshak.com/blog/?p=362

Wars & Destruction: https://magarshak.com/blog/?p=424

For that last one, I am often proven somewhat wrong by right-wing war hawks, because my left-leaning anti-war stance is about avoiding inflicting large scale misery on populations, but the war hawks go through with it anyway and wind up defeating their geopolitical enemies and gaining ground as the conflict fades into history.

projektfu•4h ago
"genetically engineers high fructose corn syrup into everything"

This phrase is nonsense, because HFCS is made by a chemical process applied to normal corn after the harvest. The corn may be a GMO, but it certainly doesn't have to be.

EGreg•4h ago
Agreed, that was phrased wrong. The fruits across the board have been genetically engineered to be extremely sweet (fructose, not the syrup): https://weather.com/news/news/2018-10-03-fruit-so-sweet-zoo-...

While their nutritional quality has gone down tremendously, for vegetables too: https://pmc.ncbi.nlm.nih.gov/articles/PMC10969708/

gen220•5h ago
Strongly recommend this profile in the NYer on Curtis Yarvin (who also uses "rationalism" to justify their beliefs) [0]. The section towards the end that reports on his meeting one of his supposed ideological heroes for an extended period of time is particularly illuminating.

I feel like the internet has led to an explosion of such groups because it abstracts the "ideas" away from the "people". I suspect if most people were in a room or spent an extended amount of time around any of these self-professed, hyper-online rationalists, they would immediately disregard any theories they were able to cook up, no matter how clever or persuasively argued they might be in their written-down form.

[0]: https://www.newyorker.com/magazine/2025/06/09/curtis-yarvin-...

trawy081225•3h ago
> I feel like the internet has led to an explosion of such groups because it abstracts the "ideas" away from the "people". I suspect if most people were in a room or spent an extended amount of time around any of these self-professed, hyper-online rationalists, they would immediately disregard any theories they were able to cook up, no matter how clever or persuasively argued they might be in their written-down form.

Likely the opposite. The internet has let people see the man behind the curtain and realize how flawed the individuals pushing these ideas are. Many intellectuals from 50 years back were just as bad, if not worse, but they were able to maintain a false aura of intelligence by cutting themselves off from the masses.

wussboy•1h ago
Hard disagree. People use rationality to support the beliefs they already have, not to change those beliefs. The internet allows everyone to find something that supports anything.

I do it. You do it. I think a fascinating litmus test is asking yourself this question: “When did I last change my mind about something significant?” For most people the answer is “never”. If we lived in the world you described, most people’s answers would be “relatively recently”.

ratelimitsteve•5h ago
Are you familiar with the Ship of Theseus as an argumentation fallacy? Innuendo Studios did a great video on it, and I think a lot of what you're talking about breaks down to this. TL;DR: it's a fallacy of substitution; small details of an argument get replaced by things that are (or feel like) logical equivalents until you end up saying something entirely different but are arguing as though you said the original thing. In the video the example is "senator doxxes a political opponent", but on inspection "senator" turns out to mean "a contractor working for the senator" and "doxxes a political opponent" turns out to mean "liked a tweet that had that opponent's name in it in a way that could draw attention to it".

Each change is arguably equivalent, and it seems logical that if x = y then you can put y anywhere you have x, but after all the substitutions are applied the argument that emerges is definitely different from the original. Communities that pride themselves on being extra rational seem especially subject to this, because it has all the trappings of rationalism but enables squishy, feely arguments.

GeoAtreides•5h ago
Epistemological skepticism sure is a belief. A strong belief on your side?

I am profoundly sure, I am certain, that I exist and that a reality outside myself exists. Worse, I strongly believe that knowledge of this external reality is possible, desirable, and can be accurate.

How suspicious does that make me?

antisthenes•4h ago
It's crazy to read this, because by writing what you wrote you basically show that you don't understand what an axiom is.

You need to review the definition of the word.

> The smartest people I have ever known have been profoundly unsure of their beliefs and what they know.

The smartest people are unsure about their higher-level beliefs, but I can assure you that they almost certainly don't re-evaluate "axioms", as you put it, on a daily or weekly basis. Not that it matters, as we almost certainly can't verify who these people are based on an internet comment.

> I immediately become suspicious of anyone who is very certain of something, especially if they derived it on their own.

That's only your problem, not anyone else's. If you think people can't arrive at a tangible and useful approximation of truth, then you are simply delusional.

mapontosevenths•4h ago
> If you think people can't arrive at a tangible and useful approximation of truth, then you are simply delusional

Logic is only a map, not the territory. It is a new toy, still bright and shining from the box in terms of human history. Before logic there were other ways of thinking, and new ones will come after. Yet, Voltaire's bastards are always certain they're right, despite being right far less often than they believe.

Can people arrive at tangible and useful conclusions? Certainly, but they can only ever find capital "T" Truth in a very limited sense. Logic, like many other models of the universe, is only useful until you change your frame of reference or the scale at which you think. Then those laws suddenly become only approximations, or even irrelevant.

antisthenes•10m ago
There is no (T)ruth, but there is a useful approximation of truth for 99.9% of the things I want to do in life.

YMMV.

JohnMakin•2h ago
> It's crazy to read this, because by writing what you wrote you basically show that you don't understand what an axiom is. You need to review the definition of the word.

Oh, do enlighten then.

> The smartest people are unsure about their higher level beliefs, but I can assure you that they almost certainly don't re-evaluate "axioms" as you put it on a daily or weekly basis. Not that it matters, as we almost certainly can't verify who these people are based on an internet comment.

I'm not sure you are responding to the right comment, or are severely misinterpreting what I said. Clearly a nerve was struck though, and I do apologize for any undue distress. I promise you'll recover from it.

antisthenes•11m ago
> Oh, do enlighten then.

Absolutely. Just in case your keyboard wasn't working to arrive at this link via Google.

https://www.merriam-webster.com/dictionary/axiom

First definition, just in case it still isn't obvious.

> I'm not sure you are responding to the right comment, or are severely misinterpreting what I said. Clearly a nerve was struck though, and I do apologize for any undue distress.

Someone was wrong on the Internet! Just don't want other people getting the wrong idea. Good fun regardless.

SLWW•4h ago
A logical argument is only as good as its presuppositions. Laying siege to your own assumptions before reasoning from them tends toward a more beneficial outcome.

Another issue with "thinkers" is that many are cowards; whether they realize it or not, a lot of their presuppositions are built on a "safe" framework, placing little to no responsibility on the thinker.

> The smartest people I have ever known have been profoundly unsure of their beliefs and what they know. I immediately become suspicious of anyone who is very certain of something, especially if they derived it on their own.

This is where I depart from you. If I said this is anti-intellectual I would be only partially correct; it's worse than that, imo. You might be coming across "smart people" who claim to know nothing "for sure", which is in itself a self-defeating argument. How can you claim that nothing is truly knowable as if you truly know that nothing is knowable? I'm taking these claims to their logical extremes, btw, avoiding the granular argumentation surrounding the different shades and levels of doubt; I know that leaves vulnerabilities in my argument, but why argue with those who know that they can't know much of anything, as if they know what they are talking about to begin with? They are so defeatist in their own thoughts, it's comical. You say "profoundly unsure", which reads to me much like "can't really ever know", and that is a sure truth claim, not a relative or comparative claim as many would say; calling it comparative is a sad attempt to side-step the absolute nature of their statement.

I know that I exist; regardless of how I got here, I know that I do. There is a ridiculous amount of rhetoric surrounding that claim that I will not argue for here; this is my presupposition. So with that I make an ontological claim, a truth claim, concerning my existence; this claim is one that I must be sure of to operate at any base level. I also believe I am me and not you, or any other. Therefore I believe in one absolute, that "I am me". As such I can claim that an absolute exists, and if absolutes exist, then within the right framework you must also be an absolute to me, and so on and so forth. What I do not see in nature is any existence, or notion, of the relative on its own, as at every relative comparison there is an absolute holding up the comparison. One simple example is heat. Hot is relative, yet it is also objective; some heat can burn you, other heat can burn you over a very long time, some heat will never burn. When something is "too hot", that is a comparative claim, stating that there is another "hot" which is just "hot" or not "hot enough"; the absolute that remains is heat. Relativistic thought is a game of comparisons and relations, not of absolute claims; the only absolute claim the relativist makes is that there is no absolute claim. The reason I am talking about relativists is that they are the logical, or illogical, conclusion of the extremes of doubt/disbelief I previously mentioned.

If you know nothing you are not wise; you are lazy and ill-prepared. We know the earth is round, we know that gravity exists, we are aware of the atomic, we are aware of our existence, we are aware that the sun shines its light upon us; we are sure of many things that took years of debate among smart people to arrive at. There was a time when many things we now accept were "not known", but they were observed with enough time and effort by brilliant people. That's why we have scientists, teachers, philosophers and journalists. I encourage you, the next time you find a "smart" person who is unsure of their beliefs, to kindly encourage them to be less lazy and to challenge their absolutes. If they deny that the absolute could be found, then you aren't dealing with a "smart" person; you are dealing with a useful idiot who spent too much time watching skeptics blather on about meaningless topics until their brains eventually fell out. In every relative claim there must be an absolute or it fails to function in any logical framework. You can, with enough thought, good data, and enough time to let things steep, find the (or an) absolute and make a sure claim. You might be proven wrong later, but that should be an indicator that you should improve (or a warning you are being taken advantage of by a sophist), and that the truth is out there; do not sequester yourself away in the comfortable, unsure hell that many live in till they die.

The beauty of absolute truth is that you can believe absolutes without understanding the entirety of the absolute. I know gravity exists but I don't fully know how it works. Yet I can be absolutely certain it acts upon me, even if I only understand a part of it. People should know what they know, keep studying it, and not make sure claims about what they do not yet know until they have the prerequisite absolute claims to support the broader ones; a claim is only as sure as the weakest of its presuppositions.

Apologies for grammar, length and how schizo my thought process appears; I don't think linearly and it takes a goofy amount of effort to try to collate my thoughts in a sensible manner.

positron26•3h ago
A theory of everything will often have a little perpetual motion machine at its nexus. These can be fascinating to the mind.

Pressing through uncertainty either requires a healthy appetite for risk or an engine of delusion. A person who struggles to get out of their comfort zone will seek enablement through such a device.

Appreciation of risk-reward will throttle trips into the unknown. A person using a crutch to justify everything will careen hyperbolically into more chaotic and erratic behaviors hoping to find that the device is still working, seeking the thrill of enablement again.

The extremism comes in where, once the user has learned to say hello to a stranger, their comfort zone has expanded into an area where their experience with risk-reward is underdeveloped. They don't look at the external world to appreciate what might happen. They try to morph situations into some confirmation of the crutch and of the inferiority of confounding ideas.

"No, the world isn't right. They are just weak and the unspoken rules [in the user's mind] are meant to benefit them." This should always resonate because nobody will stand up for you like you have a responsibility to.

A study of uncertainty and of the limitations of axioms (the inability of any sufficiently expressive formalism to be both complete and consistent) is the antidote to such things. We do have to leave the rails from time to time, but where we arrive will be another set of rails that looks and behaves like rails. A bit of uncertainty is necessary, but it's not some magic hat that never runs out of rabbits.

Another psychology that comes into play, for those who have left their comfort zone, is the inability to revert. It is a harmful tendency to presume all humans are fixed quantities. Once a behavior exists, the person is said to be revealed, not changed. The proper response is to set boundaries, and to be ready to tie off the garbage bag and move on, unless someone shows remorse and a desire to revert or transform; otherwise every relationship only gets worse. If instead you can never go back, extreme behavior is a ratchet. Every mistake becomes the person.

Animats•3h ago
Many arguments arise over the valuation of future money; see "discount function" [1]. At one extreme are the rational altruists, who rate that near 1.0; at the other are the "drill, baby, drill" people, who are much closer to 0.

The discount function really should have a noise term, because predictions about the future are noisy, and the noise increases with the distance into the future. If you don't consider that, you solve the wrong problem. There's a classic Roman concern about running out of space for cemeteries. Running out of energy, or overpopulation, turned out to be problems where the projections assumed less noise than actually happened.

[1] https://en.wikipedia.org/wiki/Discount_function
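As a rough, purely illustrative sketch (the function name, rate, and noise model are assumptions for illustration, not from the linked article), a discount calculation with a horizon-dependent noise term might look like:

    import numpy as np

    def present_value_with_noise(payoff, years, rate, noise_per_year=0.10, n=100_000):
        # Forecast error compounds, so predictions further out are noisier.
        sigma = noise_per_year * np.sqrt(years)
        rng = np.random.default_rng(0)
        # Lognormal error with mean 1 keeps the forecast unbiased on average.
        error = rng.lognormal(mean=-0.5 * sigma**2, sigma=sigma, size=n)
        pv = payoff * error / (1.0 + rate) ** years
        return pv.mean(), np.quantile(pv, [0.05, 0.95])

    # Same payoff and discount rate; only the horizon (and hence the noise) changes.
    for years in (1, 10, 100):
        mean, (lo, hi) = present_value_with_noise(100.0, years, rate=0.02)
        print(f"{years:3d} yr: mean PV ~ {mean:6.2f}, 90% interval ~ [{lo:6.2f}, {hi:6.2f}]")

The mean present value barely moves, but the 90% interval at 100 years swamps it; optimizing the point estimate while ignoring that spread is exactly the solving-the-wrong-problem failure mode.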

mensetmanusman•2h ago
Another issue with these groups is that they often turn into sex cults.
UltraSane•1h ago
A good example of this is the number of huge assumptions needed for the argument for Roko's basilisk. I'm shocked that some people actually take it seriously.
animal_spirits•10h ago
> If someone is in a group that is heading towards dysfunctionality, try to maintain your relationship with them; don’t attack them or make them defend the group. Let them have normal conversations with you.

This is such an important skill we should all have. I learned this best from watching the documentary Behind the Curve, about flat earthers, and have applied it to my best friend diving into the Tartarian conspiracy theory.

dkarl•10h ago
Isn't this entirely to be expected? The people who dominate groups like these are the ones who put the most time and effort into them, and no sane person who appreciates both the value and the limitations of rational thinking is going to see as much value in a rationalist group, and devote as much time to it, as the kind of people who are attracted to the cultish aspect of achieving truth and power through pure thought. There's way more value there if you're looking to indulge in, or exploit, a cult-like spiral into shared fantasy than if you're just looking to sharpen your logical reasoning.
gadders•10h ago
They are literally the "ackchyually" meme made flesh.
Isamu•9h ago
So I like Steven Pinker’s book Rationality, to me it seems quite straightforward.

But I have never been able to get into the Rationalist stuff, to me it’s all very meandering and peripheral and focused on… I don’t know what.

Is it just me?

ameliaquining•9h ago
Depends very much on what you're hoping to get out of it. There isn't really one "rationalist" thing at this point, it's now a whole bunch of adjacent social groups with overlapping-but-distinct goals and interests.
handoflixue•9h ago
https://www.lesswrong.com/highlights this is the ostensible "Core Highlights", curated by major members of the community, and I believe Eliezer would endorse it.

If you don't get anything out of reading the list itself, then you're probably not going to get anything out of the rest of the community either.

If you poke around and find a few neat ideas there, you'll probably find a few other neat ideas.

For some people, though, this is "wait, holy shit, you can just DO that? And it WORKS?", in which case probably read all of this but then also go find a few other sources to counter-balance it.

(In particular, probably 90% of the useful insights already exist elsewhere in philosophy, and often more rigorously discussed - LessWrong will teach you the skeleton, the general sense of "what rationality can do", but you need to go elsewhere if you want to actually build up the muscles)

biophysboy•9h ago
> “There’s this belief [among rationalists],” she said, “that society has these really bad behaviors, like developing self-improving AI, or that mainstream epistemology is really bad–not just religion, but also normal ‘trust-the-experts’ science. That can lead to the idea that we should figure it out ourselves. And what can show up is that some people aren't actually smart enough to form very good conclusions once they start thinking for themselves.”

I see this arrogant attitude all the time on HN: reflexive distrust of the "mainstream media" and "scientific experts". Critical thinking is a very healthy idea, but it's dangerous when people use it as a license to categorically reject sources. It's even worse when extremely powerful people do this; they can reduce an enormous sub-network of thought into a single node for many many people.

So, my answer for "Why Are There So Many Rationalist Cults?" is the same reason all cults exist: humans like to feel like they're in on the secret. We like to be in secret clubs.

ameliaquining•9h ago
Sure, but that doesn't say anything about why one particular social scene would spawn a bunch of cults while others do not, which is the question that the article is trying to answer.
biophysboy•9h ago
Maybe I was too vague. My argument is that cults need a secret. The secret of the rationalist community is "nobody is rational except for us". Then the rituals would be endless probability/math/logic arguments about sci-fi futures.
saalweachter•7h ago
I think the promise of secret knowledge is important, but I think cults also need a second thing: "That thing you fear? You're right to fear it, and only we can protect you from it. If you don't do what we say, it's going to be so much worse than it is now, but if you do, everything will be good and perfect."

In the rationalist cults, you typically have the fear of death and non-existence, coupled with the promise of AGI, the Singularity and immortality, weighed against the AI Apocalypse.

biophysboy•6h ago
I guess I'd say protection promises like this are a form of "secret knowledge". At the same time, so many cults have this protection racket that you might be on to something.
the_third_wave•9h ago
"God is dead! God remains dead! And we have killed him! How shall we console ourselves, the murderers of all murderers? The holiest and mightiest that the world has ever possessed has bled to death under our knives."

The average teenager who reads Nietzsche's proclamation on the death of God thinks of it as an accomplishment: finally we got rid of those thousands-of-years-old and thereby severely outdated ideas and rules. Somewhere along the march to maturity they may start to wonder whether what replaced those old rules and ideas was a good replacement, but most of them never realise that there were rebellious teenagers during all those centuries when the idea of a supreme being, to whom even the mightiest had to answer, still held sway. Nietzsche saw the peril in letting go of that cultural safety valve and warned of what might come next.

We are currently living in the world he warned us about and for that I, atheist as I am, am partly responsible. The question to be answered here is whether it is possible to regain the benefits of the old order without getting back the obvious excesses, the abuse, the sanctimoniousness and all the other abuses of power and privilege which were responsible for turning people away from that path.

digbybk•9h ago
When I was looking for a group in my area to meditate with, it was tough finding one that didn't appear to be a cult. And yet I think Buddhist meditation is the best tool for personal growth humanity has ever devised. Maybe the proliferation of cults is a sign that Yudkowsky was on to something.
ivm•8h ago
None of them are practicing Buddhist meditation, though, and the same goes for the "personal growth" oriented meditation styles.

Buddhist meditation exists only in the context of the Four Noble Truths and the rest of the Buddha's Dhamma. Throwing them away means it stops being Buddhist.

digbybk•7h ago
I disagree, but we'd be arguing semantics. In any case, the point still stands: you can just as easily argue that these rationalist offshoots aren't really Rationalist.
ivm•3h ago
I'm not familiar enough with their definitions to argue about them, but meditation techniques predate Buddhism. In fact, the Buddha himself learned them from two teachers before developing his own path. Also, the style of meditation taught nowadays (accepting, non-reactive awareness) is not how it's described in the Pali Canon.

This isn't just "must come from the Champagne region of France, otherwise it's sparkling wine" bickering, but a matter of actual widespread misconceptions about what counts as Buddhism. Many ideas floating around in Western discourse are basically German Romanticism wrapped in Orientalist packaging, matching neither Theravada nor Mahayana teachings (for example, see the Fake Buddha Quotes project).

So the semantics are extremely important when it comes to spiritual matters. Flip one or two words and the whole metaphysical model goes in a completely different direction. Even translations add distortions, so there’s no room to be careless.

os2warpman•9h ago
Rationalists are, to a man (and they're almost all men), arrogant dickheads, and arrogant dickheads do not see what they're doing as "a cult" but as "the right and proper way of things, because I am right and logical and rational and everyone else isn't".
IX-103•4h ago
That's an unnecessary caricature. I have met many rationalists of both genders and found most of them quite pleasant. But it seems that the proportion of "arrogant dickheads" unfortunately matches that of the general population. Whether it's "irrational people" or "liberal elites", these assholes always seem to find someone to look down on.
jancsika•9h ago
> And what can show up is that some people aren't actually smart enough to form very good conclusions once they start thinking for themselves.

It's mostly just people who aren't very experienced talking about and dealing honestly with their emotions, no?

I mean, suppose someone is busy achieving and, at the same time, proficient in balancing work with emotional life, dealing head-on with interpersonal conflicts, facing change, feeling and acknowledging hurt, knowing their emotional hangups, perhaps seeing a therapist, perhaps even occasionally putting personal needs ahead of career... :)

Tell that person they can get a marginal (or even substantial) improvement from some rationalist cult practice. Their first question is going to be, "What's the catch?" Because at the very least they'll suspect that adjusting their work/life balance will bring a sizeable amount of stress and consequent decrease in their emotional well-being. And if the pitch is that this rationalist practice works equally well at improving emotional well-being, that smells to them. They already know they didn't logic themselves into their current set of emotional issues, and they are highly unlikely to logic themselves out of them. So there's not much value here to offset the creepy vibes of the pitch. (And again-- being in touch with your emotions means quicker and deeper awareness of creepy vibes!)

Now, take a person whose unexplored emotional well-being tacitly depends on achievement. Even a marginal improvement in achievement could bring perceptible positive changes in their holistic selves! And you can step through a well-specified, logical process to achieve change? Sign HN up!

scythe•8h ago
One of the hallmarks of cults — if not a necessary element — is that they tend to separate their members from the outside society. Rationalism doesn't directly encourage this, but it does facilitate it in a couple of ways:

- Idiosyncratic language used to describe ordinary things ("lightcone" instead of "future", "prior" instead of "belief" or "prejudice", etc)

- Disdain for competing belief systems

- Insistence on a certain shared interpretation of things most people don't care about (the "many-worlds interpretation" of quantum uncertainty, self-improving artificial intelligence, veganism, etc)

- I'm pretty sure polyamory makes the list somehow, just because it isn't how the vast majority of people want to date. In principle it's a private lifestyle choice, but it's obviously a community value here.

So this creates an opportunity for cult-like dynamics to occur where people adjust themselves according to their interactions within the community but not interactions outside the community. And this could seem — to the members — like the beliefs themselves are the problem, but from a sociological perspective, it might really be the inflexible way they diverge from mainstream society.

kazinator•7h ago
The only way you can hope to get a gathering of nothing but paragons of critical thinking and skepticism is if the gathering has an entrance exam in critical thinking and skepticism (and a pretty tough one, if they are to be paragons). Or else, it's invitation-only.
VonGuard•7h ago
This is actually a known pattern in tech, going back to Engelbart and SRI. While not 1-to-1, you could say that the folks who left SRI for Xerox PARC did so because Engelbart and his crew became obsessed with EST: https://en.wikipedia.org/wiki/Erhard_Seminars_Training

EST-type training still exists today. You don't eat until the end of the whole weekend, or maybe you get rice and little else. Everyone is told to insult you day one until you cry. Then day two, still having not eaten, they build you up and tell you how great you are and have a group hug. Then they ask you how great you feel. Isn't this a good feeling? Don't you want your loved ones to have this feeling? Still having not eaten, you're then encouraged to pay for your family and friends to do the training, without their knowledge or consent.

A friend of mine did this training after his brother paid for his mom to do it, and she paid for him to do it. Let's just say that, though they felt it changed their lives at the time, their lives in no way shape or form changed. Two are in quite a bad place, in fact...

Anyway, point is, the people who invented everything we are using right now were also susceptible to cult-like groups with silly ideas and shady intentions.

nobody9999•3h ago
>EST-type training still exists today

It's called "Landmark"[0] now.

Several of my family members got sucked into that back in the early 80s and quite a few folks I knew socially as well.

I was quite skeptical, especially because of the cult-like fanaticism of its adherents. They would go on trying to get you to join for as long as you'd let them (you often needed to just walk away to get them to stop).

The goal appears to be to obtain as much legal tender as can be pried from those willing to part with it. Hard-sell, abusive and deceptive tactics are encouraged -- because it's so important for those who haven't "gotten it" to get it, which justifies just about anything. But if you don't pay -- you get bupkis.

It's a scam, and an abusive one at that.

[0] https://en.wikipedia.org/wiki/Landmark_Worldwide

ziknard•2h ago
There is a word for people who go to EST: EST-holes.
Mizza•7h ago
It's amphetamine. All of these people are constantly tweaking. They're annoying people to begin with, but they're all constantly yakked up and won't stop babbling. It's really obvious; I don't know why it isn't highlighted more in all these post-Ziz articles.
Muromec•7h ago
How do you know?
ajkjk•6h ago
Presumably they mean Adderall. Plausible theory tbh. Although it's just a factor not an explanation.
tbrake•6h ago
Having known dozens of friends, family members, roommates, coworkers, etc., both before and after they started them. The two biggest telltale signs:

1. A tendency to produce, out of no necessity whatsoever, walls of text. Walls of speech happen too, but not everyone rambles.

2. Obnoxious confidence that they're fundamentally correct about whatever position they happen to hold during a conversation with you, no matter how subjective or inconsequential, even if they end up changing it an hour later. Challenging them on it gets you more of #1.

Muromec•6h ago
I mean, I know the effects of Adderall/Ritalin, and it's plausible; what I'm asking is whether the GP knows that for a fact or is deducing it from what is known.
MinimalAction•5h ago
Pretty much spot on! It is frustrating to talk with these people, since they never admit they are wrong. They find new levels of abstraction to deal with your simpler counterarguments, and it is a never-ending deal unless you admit they were right.
TheAceOfHearts•4h ago
Many people like to write in order to develop and explore their understanding of a topic. Writing lets you spend a lot of time playing around with whatever idea you're trying to understand, and sharing this writing invites others to challenge your assumptions.

When you're uncertain about a topic, you can explore it by writing a lot about said topic. Ideally, when you've finished exploring and studying a topic, you should be able to write a much more condensed / synthesized version.

Henchman21•4h ago
I call this “diarrhea of the mind”. It’s what happens when you hear a steady stream of bullshit from someone’s mouth. It definitely tracks with substance abuse of “uppers”, aka meth, blow, hell even caffeine!
kridsdale3•2h ago
https://en.wikipedia.org/wiki/Logorrhea_(psychology)
buggy6257•2h ago
Well, now I finally have a good name for the blog I'll never start.
throwanem•5h ago
Who's writing them?
samdoesnothing•4h ago
Yeah it's pretty obvious and not surprising. What do people expect when a bunch of socially inept nerds with weird unchallenged world views start doing uppers? lol
kridsdale3•2h ago
I like to characterize the culture of each (roughly) decade with the most popular drugs of the time. It really gives you a new lens for media and culture generation.
JKCalhoun•7h ago
> Many of them also expect that, without heroic effort, AGI development will lead to human extinction.

Odd to me. Not biological warfare? Global warming? All-out nuclear war?

I guess The Terminator was a formative experience for them. (For me perhaps it was The Andromeda Strain.)

mitthrowaway2•7h ago
These aren't mutually exclusive. Even in The Terminator, Skynet's method of choice is nuclear war. Yudkowsky frequently expresses concern that a malevolent AI might synthesize a bioweapon. I personally worry that destroying the ozone layer might be an easy opening volley. Either way, I don't want a really smart computer spending its time figuring out plans to end the human species, because I think there are too many ways to be successful.
staticman2•1h ago
Terminator descends from a tradition of science fiction cold war parables. Even in Terminator 2 there's a line suggesting the movie isn't really about robots:

John: We're not gonna make it, are we? People, I mean.

Terminator: It's in your nature to destroy yourselves.

Seems odd to worry about computers shooting away the ozone when there are plenty of real existential threats loaded in missiles aimed at you right now.

mitthrowaway2•1h ago
I'm not in any way discounting the danger represented by those missiles. In fact I think AI only makes it more likely that they might someday be launched. But I will say that in my experience the error-condition that causes a system to fail is usually the one that didn't seem likely to happen, because the more obvious failure modes were taken seriously from the beginning. Is it so unusual to be able to consider more than one risk at a time?
myaccountonhn•6h ago
That's what was so strange with the EA and rationalist movements: a highly theoretical model that AGI could wipe us all out versus the very real issue of global warming, and pretty much all the emphasis was on AGI.
wredcoll•1h ago
AGI is a lot more fun to worry about and asks a lot less of you. Sort of like advocating for the "unborn" vs veterans/homeless/addicts.
eschaton•6h ago
It makes a lot of sense when you realize that for many of the “leaders” in this community like Yudkowsky, their understanding of science (what it is, how it works, and its potential) comes entirely from reading science fiction and playing video games.

Sad because Eli’s dad was actually a real and well-credentialed researcher at Bell Labs. Too bad he let his son quit school at an early age to be an autodidact.

bell-cot•4h ago
My interpretation: When they say "will lead to human extinction", they are trying to vocalize their existential terror that an AGI would render them and their fellow rationalist cultists permanently irrelevant - by being obviously superior to them, by the only metric that really matters to them.
DonsDiscountGas•4h ago
Check out The Precipice by Toby Ord. Biological warfare and global warming are unlikely to lead to total human extinction (though both present large risks of massive harm).

Part of the argument is that we've had nuclear weapons for a long time but no apocalypse so the annual risk can't be larger than 1%, whereas we've never created AI so it might be substantially larger. Not a rock solid argument obviously, but we're dealing with a lot of unknowns.

A better argument is that most of those other risks are not neglected; plenty of smart people are working against nuclear war. Whereas (up until a few years ago) very few people considered AI a real threat, so the marginal benefit of a new person working on it should be bigger.

meowface•2h ago
Most in the community consider nuclear and biological threats to be dire. Many just consider existential threats from AI to be even more probable and damaging.
skybrian•1h ago
Yes, sufficiently high intelligence is sometimes assumed to allow for rapid advances in many scientific areas. So, it could be biological warfare because AGI. Or nanotech, drone warfare, or something stranger.

I'm a little skeptical (there may be bottlenecks that can't be solved by thinking harder), but I don't see how it can be ruled out.

1970-01-01•7h ago
I find it ironic that the question is asked unempirically. Where is the data showing there are many more than before? Start there, then go down the rabbit hole. Otherwise, you're concluding something that may not be true and trying to rationalize the answer, just as a cultist does.
arduanika•5h ago
Oh come on.

Anyone who's ever seen the sky knows it's blue. Anyone who's spent much time around rationalism knows the premise of this article is real. It would make zero sense to ban talking about a serious and obvious problem in their community until some double-blind, peer-reviewed data can be gathered.

It would be what they call an "isolated demand for rigor".

akomtu•7h ago
It's a religion of an overdeveloped mind that hides from everything it cannot understand. It's an anti-religion, in a sense, that puts your mind on the pedestal.

Note the common pattern in major religions: they tell you that thoughts and emotions obscure the light of intuition, like clouds obscure sunlight. Rationalism is the opposite: it denies the very idea of intuition, or of anything above the sphere of thoughts, and tells you to create as many thoughts as possible.

Rationalists deny anything spiritual, good or evil, because they don't have evidence to think otherwise. They remain in this state of neutral nihilism until someone bigger than them sneaks into their ranks and casually introduces them to evil with some undeniable evidence. Their minds quickly pass through the denial-anger-acceptance stages and, faithful to their rationalist doctrine, they update their beliefs with what they now know. From that point on they are a cult. That's the story of Scientology, which has too many parallels with Rationalism.

skrebbel•7h ago
This article is beautifully written, and it's full of proper original research. I'm sad that most comments so far are knee-jerk "lol rationalists" type responses. I haven't seen any comment yet that isn't already addressed in much more colour and nuance in the article itself.
mm263•6h ago
[flagged]
teh_klev•2h ago
I have a link for you:

https://news.ycombinator.com/newsguidelines.html

Scroll to the bottom of the page.

knallfrosch•4h ago
I think it's perfectly fine to read these articles, think "definitely a cult" and ignore whether they believe in spaceships, or demons, or AGI.

The key takeaway from the article is that if you have a group leader who cuts you off from other people, that's a red flag – not really a novel, or unique, or situational insight.

andrewflnr•54m ago
That's a side point of the article, acknowledged as an old idea. The central points of this article are actually quite a bit more interesting than that. He even summarized his conclusions concisely at the end, so I don't know what your excuse is for trivializing it.
meowface•2h ago
Asterisk is basically "rationalist magazine" and the author is a well-known rationalist blogger, so it's not a surprise that this is basically the only fair look into this phenomenon - compared to the typical outside view that rationalism itself is a cult and Eliezer Yudkowsky is a cult leader, both of which I consider absurd notions.
lyu07282•54m ago
> I haven't seen any comment yet that isn't already addressed in much more colour and nuance in the article itself.

I once called rationalists infantile, impotent liberal escapism, perhaps that's the novel take you are looking for.

Essentially my view is that the fundamental problem with rationalists and the effective altruist movement is that they are talking about profound social and political issues, with any and all politics completely and totally removed from it. It is liberal depoliticisation[1] driven to its ultimate conclusion. That's just why they are ineffective and wrong about everything, but that's also why they are popular among the tech elites that are giving millions to associated groups like MIRI[2]. They aren't going away, they are politically useful and convenient to very powerful people.

[1] https://en.wikipedia.org/wiki/Post-politics

[2] https://intelligence.org/transparency/

aidenn0•29m ago
https://en.wikipedia.org/wiki/They_Saved_Lisa%27s_Brain
SpaceManNabs•7h ago
Cause they all read gwern and all eugenics leads into cults because conspiracy adjacent garbo always does.
dav_Oz•7h ago
For me it was largely shaped by the westering old Europe, creaking and breaking (after two world wars) under its heavy load of philosophical/metaphysical inheritance (which at this point in time can be considered effectively americanized).

It is still fascinating to trace back the divergent developments like american-flavoured christian sects or philosophical schools of "pragmatism", "rationalism" etc. which get super-charged by technological disruptions.

In my youth I was heavily influenced by the so-called Bildung which can be functionally thought of as a form of ersatz religion and is maybe better exemplified in the literary tradition of the Bildungsroman.

I grappled with and wildly fantasized about all sorts of things, and experimented mindlessly with all kinds of modes of thinking and consciousness amidst my coming-of-age; in hindsight, without this particular frame of Bildung, left by myself I would have been utterly confused and might at some point have acted out on it. By engaging with books like Der Zauberberg by Thomas Mann or Der Mann ohne Eigenschaften by Robert Musil, my apparent madness was calmed down; instead of the forming social front of myself breaking like a dam before the vastness of the unconscious, over time I was guided to develop my own way of slowly operating it appropriately, without completely blowing myself up into a messiah or finding myself eternally trapped in the futility and hopelessness of existence.

Borrowing from my background, one effective vaccination against the rationalist sects described here, which spontaneously came to mind, is Schopenhauer's Die Welt als Wille und Vorstellung, which can be read as a radical continuation of Kant's Critique of Pure Reason, itself an attempt to stress-test the ratio. [To demonstrate the breadth of Bildung even in the physical sciences: Einstein was familiar with Kant's a priori framework of space and time, and Heisenberg's autobiographical book Der Teil und das Ganze was motivated by: "I wanted to show that science is done by people, and the most wonderful ideas come from dialog".]

Schopenhauer arrives at the realization because of the groundwork done by Kant (which he heavily acknowledges): that there can't even exist a rational basis for rationality itself, that it is simply an exquisitely disguised tool in the service of the more fundamental will i.e. by its definition an irrational force.

A funny little thought experiment, but what consequences does this have? Well, if you declare the ratio your ultima ratio, you are just fooling yourself in order to be able to rationalize anything you want. Once internalized, Schopenhauer's insight overwhelms you with Mitleid (compassion) for every conscious being, inoculating you against the excesses of your own ratio. It hit me instantly with the same force as MDMA, but several years earlier.

jameslk•7h ago
Over-rationalizing is paperclip maximizing
idontwantthis•7h ago
Does anyone else feel that “rationality” is the same as clinical anxiety?

I’m hyper rational when I don’t take my meds. I’m also insane. But all of my thoughts and actions follow a carefully thought out sequence.

rogerkirkness•7h ago
Because they have serious emotional-maturity issues that lead them to lobotomize the normal human emotional side of their identity and experience of life.
AndrewKemendo•6h ago
I was on LW when it emerged from the OB blog, and back then it was an interesting and engaging group, though even then there were like 5 "major" contributors, most of whom had no coherent academic or commercial success.

As soon as those “sequences” were being developed it was clearly turning into a cult around EY, that I never understood and still don’t.

This article did a good job of covering the history since and was really well written.

Water finds its own level

thedudeabides5•6h ago
Purity Spirals + Cheap Talk = irrational rationalists
thedudeabides5•6h ago
> Eliezer Yudkowsky shows little interest in running one. He has consistently been distant from and uninvolved in rationalist community-building efforts, from Benton House (the first rationalist group house) to today's Lightcone Infrastructure (which hosts LessWrong, an online forum, and Lighthaven, a conference center). He surrounds himself with people who disagree with him, discourages social isolation.

Ummm, EY literally has (or at least until recently had) a semi-permanent office in Lighthaven, and routinely blocks people on Twitter as a matter of course.

throw0101a•6h ago
> Purity Spirals

This is an interesting idea (phenomenon?):

> A purity spiral is a theory which argues for the existence of a form of groupthink in which it becomes more beneficial to hold certain views than to not hold them, and more extreme views are rewarded while expressing doubt, nuance, or moderation is punished (a process sometimes called "moral outbidding").[1] It is argued that this feedback loop leads to members competing to demonstrate the zealotry or purity of their views.[2][3]

* https://en.wikipedia.org/wiki/Purity_spiral

gwbas1c•6h ago
Many years ago I met Eliezer Yudkowsky. He handed me a pamphlet extolling the virtues of rationality. The whole thing came across as a joke, as a parody of evangelizing. We both laughed.

I glanced at it once or twice and shoved it into a bookshelf. I wish I kept it, because I never thought so much would happen around him.

yubblegum•3h ago
IMO these people are promoted. You look at their backgrounds and there is nothing that justifies their perches. Eliezer Yudkowsky is (IIRC) a Thiel baby, isn't he?
quickthrowman•3h ago
I only know Eliezer Yudkowsky from his Harry Potter fanfiction, most notably Harry Potter and the Methods of Rationality.

Is he known publicly for some other reason?

meowface•2h ago
He's considered the father of rationalism and the father of AI doomerism. He wrote this famous article in Time magazine a few years ago: https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-no...

His book If Anyone Builds It, Everyone Dies comes out in a month: https://www.amazon.com/Anyone-Builds-Everyone-Dies-Superhuma...

You can find more info here: https://en.wikipedia.org/wiki/Eliezer_Yudkowsky

wredcoll•2h ago
> He's considered the father of rationalism

[citation needed]

Even for this weird cult that is trying to appropriate the word, would they really consider him the father of redefining the word?

meowface•1h ago
The article you're reading is from the unofficial rationalist magazine and the author is a prominent rationalist blogger, so they (and I) obviously don't consider it a cult. But, yes, Yudkowsky is absolutely considered the founder of the modern rationalism movement. (No relation to the philosophical tradition also called "rationalism". Modern rationalism is mostly actually just empiricism.)
skybrian•1h ago
Less metaphorically, he was a prolific, influential blogger. His early blog posts are collectively known as "the Sequences" and when people asked what rationalism is about, they were told to read those.

So the community itself gives him a lot of credit.

jefftk•1h ago
I think that claim would be pretty uncontroversial among people who consider themselves rationalists. He was extremely influential initially, and his writing kicked off the community.
Liftyee•6h ago
Finally, something that properly articulates my unease when encountering so-called "rationalists" (especially the ones that talk about being "agentic", etc.). For some reason, even though I like logical reasoning, they always rubbed me the wrong way - probably just a clash between their behavior and my personal values (mainly humility).
psunavy03•6h ago
A problem with this whole mindset is that humans, all of us, are only quasi-rational beings. We all use System 1 ("The Elephant") and System 2 ("The Rider") thinking instinctively. So if you end up in deep denial about your own capacity for irrationality, I guess it stands to reason you could end up getting led down some deep dark rabbit holes.
Muromec•6h ago
Wasn't the "fast&slow" thingy debunked as another piece of popscience?
psunavy03•5h ago
The point remains. People are not 100 percent rational beings, never have been, never will be, and it's dangerous to assume that this could ever be the case. Just like any number of failed utopian political movements in history that assumed people could ultimately be molded and perfected.
simpaticoder•4h ago
Those of us who accept this limitation can often fail to grasp how much others perceive it as a profound attack on the self. To me, it is a basic humility: no matter how much I learn, I cannot really transcend the time and place of my birth, the biology of my body, or the quirks of my culture. Rationality, though, promises that transcendence, at least to some people. And look at all the trouble such delusion has caused, for example "presentism". Science fiction often introduces a hidden coordinate system, one of language and predicate, upon which reason can operate; but the system itself did not come from reason, but rather from a storyteller's aesthetic.
navane•4h ago
I think duality gets debunked every couple of hundred years
lmm•1h ago
No?
aaronbaugher•5h ago
Some of the most irrational people I've met were those who claimed to make all their decisions rationally, based on facts and logic. They're just very good at rationalizing, and since they've pre-defined their beliefs as rational, they never have to examine where else they might come from. The rest of us at least have a chance of thinking, "Wait, am I fooling myself here?"
lupusreal•55m ago
Yup. It's fundamentally irrational for anybody to believe themselves sufficiently rational to pull off the feats of supposed rational deduction that the so-called Rationalists regularly perform. Predicting the future of humanity decades or even centuries out is absurd, but the Rationalists irrationally believe they can.

So to the point of the article, rationalist cults are common because Rationalists are irrational people (like all people) who (unlike most people) are blinded to their own irrationality by their overinflated egos. They can "reason" themselves into all manner of convoluted pretzels and lack the humility to admit they went off the deep end.

vehemenz•6h ago
I get the impression that these people desperately want to study philosophy but for some reason can't be bothered to get formal training because it would be too humbling for them. I call it "small fishbowl syndrome," but maybe there's a better term for it.
1attice•4h ago
My thoughts exactly! I'm a survivor of ten years in the academic philosophy trenches and it just sounds to me like what would happen if you left a planeload of undergraduates on a _Survivor_ island with an infinite supply of pizza pockets and adderall
joeblubaugh•1h ago
Funny that this also describes these cult rationalist groups very well.
username332211•4h ago
The reason why people can't be bothered to get formal training is that modern philosophy doesn't seem that useful.

It was a while ago, but take the infamous story of the 2006 rape case at Duke University. If you check out coverage of that case, you get the impression that every faculty member who joined in the hysteria was from some humanities department, including philosophy. And quite a few of them refused to change their minds even as the prosecuting attorney was being charged with misconduct. Compare that to Socrates' behavior during the trial of the admirals in 406 BC.

Meanwhile, whatever meager resistance that group faced seems to have come from economists, natural scientists or legal scholars.

I wouldn't blame people for refusing to study in a humanities department where they can't tell right from wrong.

freejazz•4h ago
>The reason why people can't be bothered to get formal training is that modern philosophy doesn't seem that useful.

But rationalism is?

NoMoreNicksLeft•4h ago
Yeh, probably.

Imagine that you're living in a big scary world, and there's someone there telling you that being scared isn't particularly useful, that if you slow down and think about the things happening to you, most of your worries will become tractable and some will even disappear. It probably works at first. Then they sic Roko's Basilisk on you, and you're a gibbering lunatic 2 weeks later...

freejazz•2h ago
Hah
username332211•4h ago
Nature abhors a vacuum. After the October Revolution, the genuine study of the humanities was extinguished in Russia and replaced with the mindless repetition of rather inane doctrines. But people with awakened and open minds would always ask questions and seek answers.

Those would, of course, be people with no formal training in history or philosophy (as the study of history where you aren't allowed to question Marxist doctrine would be self-evidently useless). Their training would be in the natural sciences or mathematics. And without knowing how to properly reason about history or philosophy, they may reach fairly kooky conclusions.

Hence Rationalism can be thought of as the same class of phenomenon as Fomenko's chronology (or, if you want to be slightly more generous, Shafarevich's philosophical tracts).

lmm•1h ago
Well, maybe. It seems at least adjacent to the stuff that's been making a lot of people rich lately.
samdoesnothing•1h ago
I think the argument is that philosophy hasn't advanced much in the last 1000 years, but it's still 10,000 years ahead of whatever is coming out of the rationalist camp.
fellowniusmonk•3h ago
Philosophy is interesting in how it informs computer science and vice-versa.

Mereological nihilism and weak emergence are interesting and help protect against many forms of obsessive type-level and functional cargo-culting.

But then in some areas philosophy is woefully behind, and you have philosophers poo-pooing intuitionism when any software engineer working on a sufficiently federated or real-world sensor/control system borrows constructivism into their classical language so as not to kill people (Agda is interesting, of course). Intermediate logic is clearly empirically true.
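A minimal sketch of what that borrowing can look like in practice (hypothetical names, with Python's Optional standing in for the richer evidence types you'd get in Agda or similar): the caller must constructively exhibit a value before anything is actuated, instead of classically assuming a reading exists.

    from typing import Optional

    def read_sensor() -> Optional[float]:
        # A federated or real-world sensor may simply have no fresh value;
        # None models "no evidence yet", not "zero".
        return None

    def actuate(reading: float) -> None:
        print(f"actuating with {reading}")

    def control_step() -> None:
        reading = read_sensor()
        if reading is None:
            return  # no constructed evidence, no action: fail safe
        actuate(reading)  # only reachable once a value has been exhibited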

It's interesting that people don't understand the non-physicality of the abstract and you have people serving the abstract instead of the abstract being used to serve people. People confusing the map for the terrain is such a deeply insidious issue.

I mean all the lightcone stuff: you can't predict ex ante which agents will be keystones in beneficial causal chains, so it's such a waste of energy to spin your wheels on.

djeastm•3h ago
Modern philosophy isn't useful because some philosophy faculty at Duke were wrong about a rape case? Is that the argument being made here?
qcnguy•2h ago
Which group of people giving modern training in philosophy should we judge the field by? If they can't use it correctly in such a basic case then who can?
staticman2•2h ago
Did the Duke philosophy teachers claim they were using philosophy to determine if someone was raped?

And did all the philosophers at all the other colleges convene and announce they were also using philosophy to determine if someone was raped?

Dylan16807•8m ago
> Did the Duke philosophy teachers claim they were using philosophy to determine if someone was raped?

I don't think that matters very much. If there's a strong enough correlation between being a reactive idiot and the department you're in, it makes a bad case for enrolling in that realm of study for educational motives. It's especially bad when the realm of study is directly focused on knowledge, ethics, and logic.

Note the "if" though, I haven't evaluated the parent's claims. I'm just saying it doesn't matter if they said they used philosophy. It reflects on philosophy as a study, at least the style they do there.

How much that affects other colleges is iffier, but it's not zero.

username332211•2h ago
No. The fact that they were wrong is almost irrelevant.

The faculty denounced the students without evidence, judged the case through their emotions and their preconceived notions, and refused to change their minds as new evidence emerged. Imagine having an academic discussion on a difficult ethical issue with such a teacher...

And none of that would have changed, even if there somehow had been a rape-focused conspiracy among the students of that university. (Though the problem would have been significantly less obvious.)

wredcoll•2h ago
> Meanwhile, whatever meager resistence was faced by that group seems to have come from economists, natural scientist or legal scholars.

> I wouldn't blame people for refusing to study in a humanities department where they can't tell right from wrong.

Man, if you have to make stuff up to try to convince people... you might not be on the right side here.

username332211•1h ago
I'm not sure what you are talking about. I have to admit, I mostly wrote my comment based on my recollections, and it's a case from 20 years ago that I barely paid attention to until after its bizarre conclusion. But looking through Wikipedia's articles on the case[1], it doesn't seem I'm that far from the truth.

I guess I should have limited my statement about resisting mob justice to the economists at that university as the other departments merely didn't sign on to the public letter of denunciation?

It's weird that Wikipedia doesn't give you the percentage of signatories of the Letter of 88 from the philosophy department, but several of the notable signatories are philosophers.

[1] https://en.m.wikipedia.org/wiki/Reactions_to_the_Duke_lacros...

Edit: Just found some articles claiming that a chemistry professor by the name of Stephen Baldwin was the first to write to the university newspaper condemning the mob.

samdoesnothing•4h ago
Why would they need formal training? Can't they just read Plato, Socrates, etc, and classical lit like Dostoevsky, Camus, Kafka etc? That would be far better than whatever they're doing now.
guerrilla•4h ago
I'm someone who has read all of that and much more, including intense study of SEP and some contemporary papers and textbooks, and I would say that I am absolutely not qualified to produce philosophy of the quality output by analytic philosophy over the last century. I can understand a lot of it, and yes, this is better than being completely ignorant of the last 2500 years of philosophy as most rationalists seem to be, but doing only what I have done would not sufficiently prepare them to work on the projects that they want to work on. They (and I) do not have the proper training in logic or research methods, let alone the experience that comes from guided research in the field as it is today. What we all lack especially is the epistemological reinforcement that comes from being checked by a community of our peers. I'm not saying it can't be done alone, I'm just saying that what you're suggesting isn't enough and I can tell you because I'm quite beyond that and I know that I cannot produce the quality of work that you'll find in SEP today.
samdoesnothing•1h ago
Oh I don't mean to imply reading some classical lit prepares you for a career producing novel works in philosophy, simply that if one wants to understand themselves, others, and the world better they don't need to go to university to do it. They can just read.
giraffe_lady•3h ago
This is like saying someone who wants to build a specialized computer for a novel use should read the Turing paper and get to it. A lot of development has happened in the field in the last couple hundred years.
samdoesnothing•1h ago
I don't think that is similar at all. People want to understand the world better, they don't want to learn how to build it from first principles.
sien•2h ago
Doing a bit of formal philosophy at university is really worth it.

You realise that it's very hard to do well and it's intellectual quicksand.

Reading philosophers and great writers as you suggest is better than joining a cult.

It's just that you also want to write about what you're thinking in response to reading such people and ideally have what you write critiqued by smart people. Perhaps an AI could do some of that these days.

dragonwriter•1h ago
> It's just that you also want to write about what you're thinking in response to reading such people and ideally have what you write critiqued by smart people. Perhaps an AI could do some of that these days.

An AI can neither write about what you are thinking in your place nor substitute for a particularly smart critic, but might still be useful for rubber-ducking philosophical writing if used well.

sien•1h ago
Errrf. That was poor writing on my part.

I meant use the AI to critique what you have written in response to reading the suggested authors.

Yes, a particularly smart critic would be better. But an LLM is easily available.

kayodelycaon•59m ago
I took a few philosophy classes. I found it incredibly valuable in identifying assumptions and testing them.

Being Christian, it helped me understand what I believe and why. It made faith a deliberate, reasoned choice.

And, of course, there are many rational reasons for people to have very different opinions when it comes to religion and deities.

Being bipolar might give me an interesting perspective. Everything I’ve read about rationalists misses the grounding required to isolate emotion as a variable.

caycep•6h ago
Granted, admittedly from what little I've read from the outside, the "rational" part just seems to be mostly the writing style: this sort of dispassionate, eloquently worded prose that makes weird ideas seem more "rational" and logical than they really are.
knallfrosch•4h ago
Yes, they're not rational at all. They're just a San Francisco/Bay Area cult that uses that word.
mordnis•5h ago
I really like your suggestions, even for non-rationalists.
Atlas667•5h ago
Narcissism and Elitism justified by material wealth.

What else?

Rationalism isn't any more the "correct" and "proper" way of thinking than what Christianity and Buddhism claim to espouse.

jmull•5h ago
I think rationalist cults work exactly the same as religious cults. They promise to have all the answers, to attract the vulnerable. The answers are convoluted and inscrutable, so a leader/prophet interprets them. And doom is nigh, providing the motivation and fear that hold things together.

It's the same wolf in another sheep's clothing.

And people who wouldn't join a religious cult -- e.g. because religious cults are too easy to recognize since we're all familiar with them, or because religions hate anything unusual about gender -- can join a rationalist cult instead.

mathattack•5h ago
On a recent Mindscape podcast Sean Carroll mentioned that rationalists are rational about everything except accusations that they're not being rational.
doubleunplussed•3h ago
I mean you have to admit that that's a bit of a kafkatrap
Jtsummers•5h ago
> Many of them also expect that, without heroic effort, AGI development will lead to human extinction.

> These beliefs can make it difficult to care about much of anything else: what good is it to be a nurse or a notary or a novelist, if humanity is about to go extinct?

Replace AGI causing extinction with the Rapture and you get a lot of US Christian fundamentalists. They often reject addressing problems in the environment, economy, society, etc. because the Rapture will happen any moment now. Some people just end up stuck in a belief about something catastrophic (in the case of the Rapture, catastrophic for those left behind but not those raptured) and they can't get it out of their head. For individuals who've dealt with anxiety disorder, catastrophizing is something you learn to deal with (and hopefully stop doing), but these folks find a community that reinforces the belief about the pending catastrophe(s) and so they never get out of the doom loop.

tines•5h ago
The Rapture isn't doom for the people who believe in it though (except in the lost sense of the word), whereas the AI Apocalypse is, so I'd put it in a different category. And even in that category, I'd say that's a pretty small number of Christians, fundamentalist or no, who abandon earthly occupations for that reason.
Jtsummers•4h ago
Yes, I removed a parenthetical "(or euphoria loop for the Rapture believers who know they'll be saved)". But I removed it because not all who believe in the Rapture believe they will be saved (or have such high confidence) and, for them, it is a doom loop.

Both communities, though, end up reinforcing the belief amongst their members and tend towards increasing isolation from the rest of the world (leading to cultish behavior, if not forming a cult in the conventional sense), and a disregard for the here and now in favor of focusing on this impending world changing (destroying or saving) event.

JohnMakin•4h ago
I don't mean to well ackshually you here, but there are several different theological beliefs around the Rapture, some of which believe Christians will remain during the theoretical "end times." The megachurch/cinema version of this very much believes they won't, but, this is not the only view, either in modern times or historically. Some believe it's already happened, even. It's a very good analogy.
taberiand•4h ago
Replace AGI with Climate Change and you've got an entirely reasonable set of beliefs.
NoMoreNicksLeft•4h ago
You have a very popular set of beliefs.
ImaCake•4h ago
You can treat climate change as your personal Ragnarok, but it's also possible to take the more sober view that climate change is just bad without being apocalyptic.
psunavy03•3h ago
You can believe climate change is a serious problem without believing it is necessarily an extinction-level event. It is entirely possible that in the worst case, the human race will just continue into a world which sucks more than it necessarily has to, with less quality of life and maybe lifespan.
taberiand•2h ago
I never said I held the belief, just that it's reasonable
lupusreal•1h ago
A set of beliefs which causes somebody to waste their life in misery, because they think doom is imminent and everything is therefore pointless, is never a reasonable set of beliefs to hold. Whatever the weight of the empirical evidence behind the belief, it would be plainly unreasonable to accept that belief if accepting it condemns you to a wasted life.
taurath•3h ago
Raised to huddle close and expect the imminent utter demise of the earth and being dragged to the depths of hell if I so much as said a bad word I heard on TV, I have to keep an extremely tight handle on my anxiety in this day and age.

It’s not from a rational basis, but from being bombarded with fear from every rectangle in my house, and the houses of my entire community

joe_the_user•2h ago
A lot of people also believe that global warming will cause terrible problems. I think that's a plausible belief, and if you combine the people who believe one or another of these things, you've got a lot of the US.

Which is to say, I don't think mere dooming is what's going on. The belief in AGI doom, especially, has a lot of plausible arguments in its favor. I happen not to believe it, but as a belief system it is more similar to a belief in global warming than to a belief in the Rapture.

pavlov•4h ago
A very interesting read.

My idea of these self-proclaimed rationalists was fifteen years out of date. I thought they were people who write wordy fan fiction, but it turns out they've reached the point of having subgroups that kill people and exorcise demons.

This must be how people who had read one Hubbard pulp novel in the 1950s felt decades later when they found out he was running a full-blown religion.

The article seems to try very hard to find something positive to say about these groups, and comes up with:

“Rationalists came to correct views about the COVID-19 pandemic while many others were saying masks didn’t work and only hypochondriacs worried about covid; rationalists were some of the first people to warn about the threat of artificial intelligence.”

There’s nothing very unique about agreeing with the WHO, or thinking that building Skynet might be bad… (The rationalist Moses/Hubbard was 12 when that movie came out — the most impressionable age.) In the wider picture painted by the article, these presumed successes sound more like a case of a stopped clock being right twice a day.

qcnguy•2h ago
Yeah, that paragraph was really sad; it's where I stopped reading, even. Both of those beliefs are dead wrong, and they're the best examples the author could find to defend this delusional and dangerous belief system.

The "threat of AI" they're claiming validates rationalism doesn't exist. These loons were the reason Google sat on their LLMs and made their image models only draw pictures of robots, because of the supposed "threat" of AI. Now everyone can run models way better on their own laptops and the sky hasn't fallen, there hasn't even been mass unemployment or anything. Not even the weakest version of this belief has proven true. AI is very friendly, even.

And masks? How many graphs of cases per day with mask-mandate transitions overlaid are required before people realize masks did nothing? Whole countries went from nearly nobody wearing them to everyone wearing them overnight, and COVID cases per day didn't even notice. You can't look at a case graph and see where the rules changed. Which makes sense, because SARS-CoV-2 is aerosolized and can enter through the masks, around the masks, when masks are removed, and even through the eyeballs.

Seems like rationalists in the end have managed to be correct about nothing. What a disappointment.

skybrian•1h ago
It was genuinely difficult to persuade people to wear masks before everyone started doing it and it became normal.
lexandstuff•1h ago
The point of wearing a mask is to protect other people from your respiratory droplets. Please wear a mask when you're sick.
skybrian•1h ago
The WHO didn't declare a global pandemic until March 11, 2020 [1]. That's a little slow and some rationalists were earlier than that. (Other people too.)

After reading a warning from a rationalist blog, I posted a lot about COVID news to another forum and others there gave me credit for giving the heads-up that it was a Big Deal and not just another thing in the news. (Not sure it made all that much difference, though?)

[1] https://pmc.ncbi.nlm.nih.gov/articles/PMC7569573/

xpe•4h ago
> The Sequences make certain implicit promises. ...

Some meta-commentary first... How would one go about testing if this is true? If true, then such "promises" are not written down -- they are implied. So one would need to ask at least two questions: 1. Did the author intend to make these implicit promises? 2. What portion of readers perceive them as such?

> ... There is an art of thinking better ...

First, this isn't _implicit_ in the Sequences; it is stated directly. In any case, the quote does not constitute a promise: so far, it is a claim. And yes, rationalists do think there are better and worse ways of thinking, in the sense of "what are more effective ways of thinking that will help me accomplish my goals?"

> ..., and we’ve figured it out.

Codswallop. This is not a message of the rationality movement -- quite the opposite. We share what we've learned and why we believe it to be true, but we don't claim we've figured it all out. It is better to remain curious.

> If you learn it, you can solve all your problems...

Bollocks. This is not claimed implicitly or explicitly. Besides, some problems are intractable.

> ... become brilliant and hardworking and successful and happy ...

Rubbish.

> ..., and be one of the small elite shaping not only society but the entire future of humanity.

Bunk.

For those who haven't read it, I'll offer a relevant extended quote from Yudkowsky's 2009 "Go Forth and Create the Art!" [1], the last post of the Sequences:

## Excerpt from Go Forth and Create the Art

But those small pieces of rationality that I've set out... I hope... just maybe...

I suspect—you could even call it a guess—that there is a barrier to getting started, in this matter of rationality. Where by default, in the beginning, you don't have enough to build on. Indeed so little that you don't have a clue that more exists, that there is an Art to be found. And if you do begin to sense that more is possible—then you may just instantaneously go wrong. As David Stove observes—I'm not going to link it, because it deserves its own post—most "great thinkers" in philosophy, e.g. Hegel, are properly objects of pity. That's what happens by default to anyone who sets out to develop the art of thinking; they develop fake answers.

When you try to develop part of the human art of thinking... then you are doing something not too dissimilar to what I was doing over in Artificial Intelligence. You will be tempted by fake explanations of the mind, fake accounts of causality, mysterious holy words, and the amazing idea that solves everything.

It's not that the particular, epistemic, fake-detecting methods that I use, are so good for every particular problem; but they seem like they might be helpful for discriminating good and bad systems of thinking.

I hope that someone who learns the part of the Art that I've set down here, will not instantaneously and automatically go wrong, if they start asking themselves, "How should people think, in order to solve new problem X that I'm working on?" They will not immediately run away; they will not just make stuff up at random; they may be moved to consult the literature in experimental psychology; they will not automatically go into an affective death spiral around their Brilliant Idea; they will have some idea of what distinguishes a fake explanation from a real one. They will get a saving throw.

It's this sort of barrier, perhaps, which prevents people from beginning to develop an art of rationality, if they are not already rational.

And so instead they... go off and invent Freudian psychoanalysis. Or a new religion. Or something. That's what happens by default, when people start thinking about thinking.

I hope that the part of the Art I have set down, as incomplete as it may be, can surpass that preliminary barrier—give people a base to build on; give them an idea that an Art exists, and somewhat of how it ought to be developed; and give them at least a saving throw before they instantaneously go astray.

That's my dream—that this highly specialized-seeming art of answering confused questions, may be some of what is needed, in the very beginning, to go and complete the rest.

[1]: https://www.lesswrong.com/posts/aFEsqd6ofwnkNqaXo/go-forth-a...

IX-103•4h ago
Is it really that surprising that groups of humans who think they have some special understanding of reality compared to others tend to separate and isolate themselves until they fall into an unguided, self-reinforcing cycle?

I'd have thought that would be obvious since it's the history of many religions (which seem to just be cults that survived the bottleneck effect to grow until they reached a sustainable population).

In other words, humans are wired for tribalism, so don't be surprised when they start forming tribes...

guerrilla•4h ago
> One way that thinking for yourself goes wrong is that you realize your society is wrong about something, don’t realize that you can’t outperform it, and wind up even wronger.

I've been there myself.

> And without the steadying influence of some kind of external goal you either achieve or don’t achieve, your beliefs can get arbitrarily disconnected from reality — which is very dangerous if you’re going to act on them.

I think this and the entire previous two paragraphs preceding it are excellent arguments for philosophical pragmatism and empiricism. It's strange to me that the community would not have already converged on that after all their obsessions with decision theory.

> The Zizians and researchers at Leverage Research both felt like heroes, like some of the most important people who had ever lived. Of course, these groups couldn’t conjure up a literal Dark Lord to fight. But they could imbue everything with a profound sense of meaning. All the minor details of their lives felt like they had the fate of humanity or all sentient life as the stakes. Even the guilt and martyrdom could be perversely appealing: you could know that you’re the kind of person who would sacrifice everything for your beliefs.

This helps me understand what people mean by "meaning": a sense that their life and actions actually matter. I've always struggled to understand this issue, but this helps make it concrete, the kind of thing people must be looking for.

> One of my interviewees speculated that rationalists aren’t actually any more dysfunctional than anywhere else; we’re just more interestingly dysfunctional.

"We're"? The author is a rationalist too? That would definitely explain why this article is so damned long. Why are rationalists not able to write less? It sounds like a joke but this is seriously a thing. [EDIT: Various people further down in the comments are saying it's amphetamines and yes, I should have known that from my own experience. That's exactly what it is.]

> Consider talking about “ethical injunctions:” things you shouldn’t do even if you have a really good argument that you should do them. (Like murder.)

This kind of defeats the purpose, doesn't it? Also, this is nowhere justified in the article, just added on as the very last sentence.

duckmysick•4h ago
> And yet, the rationalist community has hosted perhaps half a dozen small groups with very strange beliefs (including two separate groups that wound up interacting with demons). Some — which I won’t name in this article for privacy reasons — seem to have caused no harm but bad takes.

So there are six questionable (but harmless) groups, and then later the article names three of them as more serious. Doesn't seem like "many" to me.

I wonder what percentage of all cults are rationalist ones.

hax0ron3•4h ago
The premise of the article might just be nonsense.

How many rationalists are there in the world? Of course it depends on what you mean by rationalist, but I'd guess that there are, at the very least, several tens of thousands of people in the world who either consider themselves rationalists or are involved with the rationalist community.

With such numbers, is it surprising that there would be half a dozen or so small cults?

There are certainly some cult-like aspects to certain parts of the rationalist community, and I think that those are interesting to explore, but come on, this article doesn't even bother to establish that its title is justified.

To the extent that rationalism does have some cult-like aspects, I think a lot of it is because it tends to attract smart people who are deficient in the ability to use avenues other than abstract thinking to comprehend reality and who enjoy making loosely justified imaginative leaps of thought while overestimating their own abilities to model reality. The fact that a huge fraction of rationalists are sci-fi fans is not a coincidence.

But again, one should first establish that there is anything actually unusual about the number of cults in the rationalist community. Otherwise this is rather silly.

pizzadog•4h ago
I have a lot of experience with rationalists. What I will say is:

1) If you have a criticism about them, or their stupid name, or how "'all I know is that I know nothing' is just a smug way of saying they're truly wise," rest assured they have been self-flagellating over these criticisms 100x longer than you've been aware of their group. That doesn't mean they succeeded at addressing the criticisms, of course, but I can tell you that they are self-aware. Especially about the stupid name.

2) They are actually well-read. They are not sheltered and confused. They are out there doing weird shit together all the time. The kind of off-the-wall life experiences you find in this community will leave you wide-eyed.

3) They are genuinely concerned with doing good. You might know about some of the weird, scary, or cringe rationalist groups. You probably haven't heard about the ones that are succeeding at doing cool stuff because people don't gossip about charitable successes.

In my experience, where they go astray is when they trick themselves into working beyond their means. The basic underlying idea behind most rationalist projects is something like: "Think about the way people suffer every day. How can we think about these problems in a new way? How can we find an answer that actually leaves everyone happy?" A cynic (or a realist, depending on your perspective) might say that there are many problems that will fundamentally leave some group unhappy. The overconfident rationalist will challenge that cynical/realist perspective until they burn themselves out, and in many cases they will attract a whole group of people who burn out alongside them. To consider an extreme case, the Zizians squared this circle by deciding that the majority of human beings didn't have souls, so "leaving everyone happy" was as simple as ignoring the unsouled masses. In less extreme cases this presents itself as hopeless idealism, or a chain of logic that becomes so divorced from normal socialization that it appears opaque. "This thought experiment could hypothetically cause 9 quintillion cubic units of Pain to exist, so I need to devote my entire existence to preventing it, because even a 1% chance of that happening is horrible. If you aren't doing the same thing then you are now morally culpable for 9 quintillion cubic units of Pain. You are evil."

Most rationalists are weird but settle into a happy place far from those fringes where they have a diet of "plants and specifically animals without brains that cannot experience pain" and they make $300k annually and donate $200k of it to charitable causes. The super weird ones are annoying to talk to and nobody really likes them.

wredcoll•2h ago
But are they Scotsmen?
gwbas1c•4h ago
One thing I'm having trouble with: The article assumes the reader knows some history about the rationalists.

I listened to a podcast that covered some of these topics, so I'm not lost; but I think someone who's new to this topic will be very, very confused.

clueless•3h ago
I'm curious, what was the podcast episode?
guerrilla•52m ago
Here you go. It has like 10 chapters, so keep going once you reach the end.

https://aiascendant.substack.com/p/extropias-children-chapte...

kanzure•4h ago
Here are some other anti-LessWrong materials to consider:

https://aiascendant.com/p/extropias-children-chapter-1-the-w...

https://davidgerard.co.uk/blockchain/2023/02/06/ineffective-...

https://www.bloomberg.com/news/features/2023-03-07/effective...

https://www.vox.com/future-perfect/23458282/effective-altrui...

https://qchu.substack.com/p/eliezer

https://x.com/kanzure/status/1726251316513841539

meowface•2h ago
Note that Asterisk magazine is basically the unofficial magazine for the rationalism community and the author is a rationalist blogger who is naturally very pro-LessWrong. This piece is not anti-Yudkowsky or anti-LessWrong.

Here's a counter-piece on David Gerard and his portrayal of LessWrong and Effective Altruism: https://www.tracingwoodgrains.com/p/reliable-sources-how-wik...

antithesizer•3h ago
Little on offer but cults these days. Take your pick. You probably already did long ago and now your own cult is the only one you'll never clock as such.
noqc•1h ago
Same as it ever was, though with more of them around, people are a little warier about their own, I think.
crazydoggers•2h ago
Trying to find life's answers by handing over your self-authority to another individual or group's philosophy is not rational. Submitting oneself to an authority whose role is telling people what's best in life will always attract the type of people looking to control, take advantage of, and traumatize others.
Lerc•1h ago
Reading the other comments makes me wonder if they just misread the sign and they were looking for the rationalizationist meeting.
fuzztester•1h ago
Has anyone here ever been a part of a cult?

If so, got anything to share - anecdotes, learnings, cautions, etc.?

I am never planning to be part of one; just interested to know, partly because I have lived adjacent to what might be one, at times.

lyu07282•1h ago
There was an interview with Diane Benscoter that I found very insightful, in which she talks about her experience and her reasons for joining a cult: https://www.youtube.com/watch?v=6Ibk5vJ-4-o

The main point is that it isn't so much the cult (or its leader) as the victims being in a vulnerable mental state and getting exploited.

gverrilla•1h ago
lol
marstall•1h ago
Harper's did an amazing cover story on these freaks in 2015: https://harpers.org/archive/2015/01/come-with-us-if-you-want...
homeonthemtn•55m ago
This just sounds like any other community based around a niche interest.

From kink to rock hounding, there are always people who base their identity on being a broker of status or power, because they themselves are powerless outsiders once removed from the community.

gchamonlive•49m ago
> base their identity on being a broker of status or power, because they themselves are powerless outsiders once removed from the community

Who would ever maintain power when removed from their community? You mean to say they base their identity on the awareness of the power they possess within a certain group?

duxup•49m ago
I remember going to college and some graduate student, himself a philosophy major, telling me that nobody is as big a jerk as philosophy majors.

I don't know if it's really true, but it certainly felt true: folks looking for deeper answers about a better way to think about things end up finding what they believe is the "right" way, and that tends to lead to branding other options as "wrong".

A search for certainty always seems to be defined or guided by people dealing with their own issues and experiences that they can't explain. It gets tribal and very personal, and those kinds of things become dark rabbit holes.

----

>Jessica Taylor, an AI researcher who knew both Zizians and participants in Leverage Research, put it bluntly. “There’s this belief [among rationalists],” she said, “that society has these really bad behaviors, like developing self-improving AI, or that mainstream epistemology is really bad–not just religion, but also normal ‘trust-the-experts’ science. That can lead to the idea that we should figure it out ourselves. And what can show up is that some people aren't actually smart enough to form very good conclusions once they start thinking for themselves.”

Reminds me of some members of our government and conspiracy theorists who "research" and encourage people to figure it out themselves ...

jmoggr•32m ago
I think the comments here have been overly harsh. I have friends in the community and have visited the LessWrong "campus" several times. They seemed very welcoming, sincere, and were kind and patient even when I was basically asserting that several of their beliefs were dumb (in a hopefully somewhat respectful manner).

As for the AI doomerism, many in the community have more immediate and practical concerns about AI; however, the most extreme voices are often the most prominent. I also know that there has been internal disagreement about the kind of messaging they should use to raise concern.

I think rationalists get plenty of things wrong, but I suspect that many people would benefit from understanding their perspective and reasoning.

a_bonobo•23m ago
This is a great article.

There's so much in these group dynamics that repeats the group dynamics of 1970s communist extremists.

Compare this part from OP:

>Here is a sampling of answers from people in and close to dysfunctional groups: “We spent all our time talking about philosophy and psychology and human social dynamics, often within the group.” “Really tense ten-hour conversations about whether, when you ate the last chip, that was a signal that you were intending to let down your comrades in selfish ways in the future.”

This reeks of Marxist-Leninist self-criticism, where everybody tried to one-up each other in how ideologically pure they were. The most extreme outgrowth of self-criticism was when the Japanese United Red Army beat its own members to death as part of self-criticism sessions.

>'These violent beatings ultimately saw the death of 12 members of the URA who had been deemed not sufficiently revolutionary.' https://en.wikipedia.org/wiki/United_Red_Army

History doesn't repeat, but it rhymes.