Although if they were banned at the start of COVID, during the Trump administration, then we're talking about five years.
No, it was not. It’s particularly silly to suggest this when we have a live example of such orders right now.
The companies were nudged. (And they were wrong to respond to public pressure.) The President, after all, has a “bully pulpit.” But there were no orders, no credible threats, and plenty of companies didn’t deplatform these folks.
It was not. No threats were made, and Twitter didn’t blindly follow the FBI’s guidance.
The simple truth is the leftist elements that wanted to control the debate were there in the White House and in Twitter’s San Francisco offices. Nobody had to be coerced, they were coördinating.
I’m not disputing that they coördinated. I’m challenging that they were coerced.
We wouldn’t describe Fox News altering a script on account of a friendly call from Miller and friends as “the government ordering private companies around.” (Or, say, call Florida opening its criminal justice records to ICE the federal government ordering states around.) Twitter’s leadership and the Biden administration saw eye to eye. This is a story about a media monoculture and private censorship, not government censorship.
https://x.com/cidrap/status/1420482621696618496 ("Our Osterholm Update podcast episode (Jul 22) was removed for “medical misinformation.”" (2021))
Most ironic thing I've ever seen. I still recall it perfectly, though it's been four years. Never, ever trust censorship algorithms or the people who control them: they are just dumb parrots that suppress all discussion of an unwanted topic, without thought or reason.
https://blog.youtube/news-and-events/managing-harmful-vaccin...
From the two links in the post, Google fleshes it out in great detail, with many examples of forbidden thought.
I know that some services do this in addition to an account ban.
Even if one does argue that CSAM should result in hardware and IP bans, there's no reason that can't be a sole exception to a wider prohibition on such bans.
We don’t have the resources for this, even when the FBI isn’t being purged and sent to Home Depots. Unrestricting IPs means a boom for CSAM production and distribution.
This actually surprised me because I thought (and maybe still think) that it was Google employees that led the charge on this one.
For Google now to pretend Biden twisted their arm is pretty rich. They'd better have a verifiable paper trail to prove that, if they expect anyone with a memory five years long to believe it.
The Twitter files showed direct communications from the administration asking them to ban specific users like Alex Berenson, Dr. Martin Kulldorff, and Dr. Andrew Bostom: https://cbsaustin.com/news/nation-world/twitter-files-10th-i...
Meta submitted direct communications from the administration pressuring them to ban people as part of a congressional investigation: https://www.aljazeera.com/news/2024/8/27/did-bidens-white-ho...
It would be more surprising if they left Google alone.
We also tried letting the propaganda machine full-blast those lies on the telly for the past 5 years.
For some reason, that didn't work either.
What is going to work? And what is your plan for getting us to that point?
People can post all sorts of crazy stuff, but the algorithms do not need to promote it.
Countries can require Algorithmic Impact Assessments and set standards of compliance with recommended guidelines.
If 'silencing people' doesn't work, does that mean online platforms aren't allowed to remove anything? Is there any limit to this philosophy? So you think platforms can't remove:
- Holocaust denial?
- Clothed underage content? Reddit banned r/jailbait, but you think that's impermissible? How about clothed pictures of toddlers but presented in a sexual context? It would be 'silencing' if a platform wanted to remove that from their private property?
- Bomb or weapons-making tutorials?
- Dangerous fads that idiotic kids pass around on TikTok, like the blackout game? You're saying it's not permissible for a platform to remove dangerous instructionals specifically targeted at children?
- How about spam? Commercial advertising is legally speech in the US. Platforms can't remove the gigantic quantities of spam they suffer from every day?
Where's the limiting principle here? Why don't we just allow companies to set their own rules on their own private property, wouldn't that be a lot simpler?
In the open, it becomes normalized and it draws in more people. Would you rather have some crazies in the corner, or 50% of the population believing something false because it became normalized?
The only people benefiting from those dark concepts are those with a financial interest. They make money from it, and push the negatives to sell their products and cures. Those who fight against it do not gain from it, and it costs them time and money. That is why it is a losing battle.
How about "If the content isn't illegal, then the government shouldn't pressure private companies to censor/filter/ban ideas/speech"?
And yes, this should apply to everything from criticizing vaccines, denying election results, being woke, or not being woke, to making fun of the President on a talk show.
Not saying every platform needs to become like 4chan, but if one wants to be, the feds shouldn't interfere.
In this case it wasn't a purely private decision.
1) They are public corporations and are legal creations of the state, and they benefit from certain protections of the state. They also have privileged access to some public infrastructure that other private companies do not have.
2) By acting at the behest of the government, they were agents of the government for free speech and censorship purposes.
3) Being monopolies in their respective markets, they must respect certain obligations, the same way public utilities do.
That "and/or" is doing a lot of work here. There's a huge difference between government censorship and forcing private companies to host content they don't want to host on servers they own.
Then again, Alphabet is now claiming they did want to host it and mean old Biden pressured them into pulling it so if we buy that, maybe it doesn't matter.
> What if they started banning tylenol-autism sceptical accounts?
What if it's pro-cannibalism or pedophilia content? Everyone has a line, we're all just arguing about where exactly we think that line should be.
If I choose to put a Kamala sign in my yard and not a Trump sign, that’s an expression of free speech.
If the marketing company I own decides to not work for causes I don’t personally support, that’s free speech.
If the video hosting platform I’m CEO of doesn’t host unfounded anti-vax content because I think it’s a bad business move, is that not also free speech?
The legal process already did all the hard work of reaching consensus/compromise on where that line is, so just use that. At least with the legal system, there's some degree of visibility and influence possible by everyone. It's not some ethics department silently banning users they don't agree with.
This throws out spam and fraud filters, both of which are content-based moderation.
“Nobody moderates anything” unfortunately isn’t a functional option. Particularly if the company has to sell ads.
The reason we ban government censorship is so that a private actor can always create their own conspiracy theory + snuff film site if they want, and other platforms are not obligated to carry content they find objectionable. Get really into Rumble or Truth Social or X if you would like a very different perspective from Youtube's.
What you are arguing for is a dissolution of HN and sites like it.
As a book publisher, should I be required to publish your furry smut short stories? Of course not. Is that infringing on your freedom of speech? Of course not.
I think if public health bodies just laid out the data they had honestly (good and bad) and said that they think most people should probably take it, but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.
Nah, the same grifters who stand to make a political profit by turning everything into a wedge issue would still have hammered right into it. They've completely taken over public discourse on a wide range of subjects that go well beyond COVID vaccines.
As long as you can make a dollar by telling people that their (and your) ignorance is worth just as much as - or more than - someone else's knowledge, you'll find no shortage of listeners for your sermon. And that popularity will build its own social proof. (Millions of fools can't all be wrong, after all.)
There are always going to be people pushing out bad ideas, for all kinds of reasons. That's part of the trade-off of living in a free society where there is no universal "right" opinion the public must hold.
> They've completely taken over public discourse on a wide range of subjects
Most people are not anti-vax. If "they've" "taken over public discourse" in other subjects to the point that you are now holding a minority opinion, you should consider whether "they" are right or wrong and why so many people believe what they do.
If you can't understand their position and disagree, you should reach out to people in a non-confrontational way, understand their position, then explain why you disagree (if you still do at that point). If we all do a better job at this we'll converge towards truth. If you think talking and debate aren't the solution to disagreements, I'd argue you don't really believe in our democratic system (which isn't a judgement).
It's often a lot better to just let kooks speak freely.
There is nobody more confident in themselves than the middle-class.
Add in a healthy dose of subconsciously racist beliefs about how advanced Western society is (plus ideas of how this means they must be smart too) and how catching diseases preventable by vaccines is only a brown people thing.
Basically, it's easy to be anti-vax when the disease isn't in your face and you have an out-group to blame even if it does end up in your face (a common excuse by anti-vaxxers I see when measles is in the news is that the immigrants are bringing it in and should be blamed instead of anti-vaxxers)
What does the West have to do with it? Non-westerners are even more into folk medicine and witch doctors.
Honest question: is this true? What’s the data around this? If it is true, why are there so many people from SEA in American universities? Wouldn’t they stay in their home country or another in the area?
I’m truly trying to learn here and square this statement with what I’ve come to understand so far.
Meanwhile the 'educated' Westerner, to whom Polio is a third-world disease, will convince themselves that the doctor is lying for some reason, will choose to take the 75% chance of an asymptomatic infection because they don't truly appreciate how bad it can otherwise be, or will use their access to a vast collection of humanity's information to cherry pick data that supports their position (most likely while also claiming to seek debate despite not intending to seriously consider opposing evidence).
Anthony R. Mawson, et al., “Pilot Comparative Study on the Health of Vaccinated and Unvaccinated 6 to 12-year-old U.S. Children,” Journal of Translational Science 3, no. 3 (2017): 1-12, doi: 10.15761/JTS.1000186
Anthony R. Mawson et al., “Preterm Birth, Vaccination and Neurodevelopmental Disorders: A Cross-Sectional Study of 6- to 12-Year-Old Vaccinated and Unvaccinated Children,” Journal of Translational Science 3, no. 3 (2017): 1-8, doi:10.15761/JTS.1000187.
Brian Hooker and Neil Z. Miller, “Analysis of Health Outcomes in Vaccinated and Unvaccinated Children: Developmental Delays, Asthma, Ear Infections and Gastrointestinal Disorders,” SAGE Open Medicine 8, (2020): 2050312120925344, doi:10.1177/2050312120925344.
Brian Hooker and Neil Z. Miller, “Health Effects in Vaccinated versus Unvaccinated Children,” Journal of Translational Science 7, (2021): 1-11, doi:10.15761/JTS.1000459.
James Lyons-Weiler and Paul Thomas, “Relative Incidence of Office Visits and Cumulative Rates of Billed Diagnoses along the Axis of Vaccination,” International Journal of Environmental Research and Public Health 17, no. 22 (2020): 8674, doi:10.3390/ijerph17228674.
James Lyons-Weiler, “Revisiting Excess Diagnoses of Illnesses and Conditions in Children Whose Parents Provided Informed Permission to Vaccinate Them,” International Journal of Vaccine Theory, Practice and Research 2, no. 2 (2022): 603-618, doi:10.56098/ijvtpr.v2i2.59.
NVKP, “Diseases and Vaccines: NVKP Survey Results,” Nederlandse Vereniging Kritisch Prikken, 2006, accessed July 1, 2022.
Joy Garner, “Statistical Evaluation of Health Outcomes in the Unvaccinated: Full Report,” The Control Group: Pilot Survey of Unvaccinated Americans, November 19, 2020.
Joy Garner, “Health versus Disorder, Disease, and Death: Unvaccinated Persons Are Incommensurably Healthier than Vaccinated,” International Journal of Vaccine Theory, Practice and Research 2, no. 2, (2022): 670-686, doi: 10.56098/ijvtpr.v2i2.40.
Rachel Enriquez et al., “The Relationship Between Vaccine Refusal and Self-Report of Atopic Disease in Children,” The Journal of Allergy and Clinical Immunology 115, no. 4 (2005): 737-744, doi:10.1016/j.jaci.2004.12.1128.
Hooker & Miller 2020/2021 – analysis of “control group” data also from self-selected surveys; same methodological problems.
Lyons-Weiler & Thomas 2020, 2022 – data from a single pediatric practice run by one of the authors; serious selection bias.
Joy Garner / NVKP surveys – activist-run online surveys with no verification.
Enriquez et al. 2005 – a small cross-sectional study about allergy self-reports, not about overall neurodevelopment.
Large, well-controlled population studies (Denmark, Finland, the U.S. Vaccine Safety Datalink, etc.) comparing vaccinated vs. unvaccinated children show no increase in autism, neurodevelopmental disorders, or overall morbidity attributable to recommended vaccines.
"We fully realize that a survey like this, even on purely scientific grounds, is flawed on all counts. The sample of children studied is far too small and unrepresentative, we didn't use control groups, and so on."
Turns out the NVKP roughly translates to "Dutch Organization for those critical towards vaccines."
I understand being skeptical about vaccines, but the skepticism needs to go both ways
Retracted: https://retractionwatch.com/2017/05/08/retracted-vaccine-aut...
If you edit down your list to journal articles that you know to be valid and unretracted, I will reconsider looking through it. However, journal access in general is too expensive for me to bother reading retracted articles.
Even if I granted every single paper's premise here, I'd still much rather have a living child with a slightly higher chance of allergies or asthma or <insert survivable condition here> than a dead child. How quickly we forget how bad things once were. Do you dispute that vaccines also accounted for 40% of the decline in infant mortality over the last 50 years? And before that, TB, flu, and smallpox killed uncountably many people. Vaccines are a public good and one of the best things we've ever created as a species.
Do you also have theories about autism you'd like to share with the class?
So the important question is: Are you now going to say "well, I guess I got some bad data and I have to go back and review my beliefs" or dig in?
To the contrary, high quality studies consistently show that vaccines are not linked to developmental disability or worse health outcomes.
We have the same thing going on with racism in the West where people are convinced racism is a much bigger problem than it actually is.
And whether it's anti-vax or racist beliefs, when you start attacking people for holding these views you always end up inadvertently encouraging people to start asking why that is and they end up down rabbit holes.
No one believes peas cause cancer, for example, but I guarantee one of the best ways to make people start believing peas cause cancer is for the media to start talking about how some people believe that peas do cause cancer, and then for sites like YouTube and Facebook to start banning people who talk about it. Because if they allow people to talk about UFOs and flat Earth conspiracies, why are they banning people for suggesting that peas cause cancer? Is there some kind of conspiracy going on funded by big agriculture? You can see how this type of thinking happens.
It also isn't convincing to claim that racism isn't that big a problem in the West, given all the discourse around H1Bs and Indians (the Trump base has been pretty open on this one, with comments on JD Vance's wife, the flood of anti-Indian racism on social media, and recently the joy taken in attempting to interfere with Indians forced to fly back to the US in a hurry due to the lack of clarity on the H1B thing), how ICE is identifying illegals, a senator openly questioning the citizenship of a brown mayoral candidate, and so on.
I agree that denying something is the easiest way to convince people of the opposite, but it's also understandable when social media companies decide to censor advice from well known individuals that people should do potentially harmful things like consume horse dewormer to deal with Covid. Basically, it's complicated, though I would prefer to lean towards not censoring such opinions.
You hate one thing in a message from a side, and you switch sides.. without accounting for all the things the other side stands for. It's a complicated inversion of single-issue purity tests.
The only reason you believe that is because all information to the contrary was systematically censored and removed from the media you consume. The actual data doesn't support that, there are even cases where it increased mortality, like https://pmc.ncbi.nlm.nih.gov/articles/PMC11278956/ and increased the chance of future covid infections, like https://pubmed.ncbi.nlm.nih.gov/39803093/ .
People have become more anti-vax because the Covid vaccines were at best ineffective and, as you said, anything contra-narrative is buried or ignored.
If you push a shitty product and force people to take it to keep their jobs it’s going to turn them into skeptics of all vaccines, even the very effective ones.
More harm than good was done there. The government should have approved them for voluntary use so the fallout would not have been so bad.
The first paper seems to claim a very standard cohort study is subject to "immortal time bias", an effect whereby measuring outcomes can seem to change them.
The typical example of sampling-time bias is that slow-growing cancers are more survivable than fast-growing ones, but also more likely to be caught by a screening, giving a correlation between screening and survivability. So you get a time effect where more of the fast-acting cancers never end up in the measurement. But when neither outcome is more likely than the other to be overlooked in this fashion, there is no measurement-time bias. The authors do not explain why measurement time would have anything to do with detecting or not detecting Covid death rates in the abstract, or anywhere else in the paper, because they are idiot quacks who want to adjust statistics to give the answer they want for no justifiable reason.
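To make the distinction concrete, here's a toy simulation (my own sketch, not from either paper) of what genuine immortal time bias looks like: everyone has the same daily death risk and the shot does nothing, but anyone who dies before their scheduled shot gets counted as "unvaccinated", so the unvaccinated group looks far deadlier purely as a classification artifact.

    # Toy sketch of immortal time bias (assumed numbers, no real data).
    import random

    random.seed(0)

    N = 100_000
    FOLLOW_UP_DAYS = 365
    DAILY_DEATH_RISK = 0.0005  # identical for everyone; vaccination has no effect here

    vacc_deaths = vacc_total = unvacc_deaths = unvacc_total = 0

    for _ in range(N):
        scheduled_shot_day = random.randint(1, FOLLOW_UP_DAYS)
        death_day = None
        for day in range(1, FOLLOW_UP_DAYS + 1):
            if random.random() < DAILY_DEATH_RISK:
                death_day = day
                break

        if death_day is not None and death_day < scheduled_shot_day:
            # died before the shot -> naively classified as "unvaccinated"
            unvacc_total += 1
            unvacc_deaths += 1
        else:
            vacc_total += 1
            if death_day is not None:
                vacc_deaths += 1

    print("'vaccinated' mortality:  ", round(vacc_deaths / vacc_total, 3))
    print("'unvaccinated' mortality:", round(unvacc_deaths / unvacc_total, 3))
    # The huge gap is pure misclassification: the question is whether anything
    # like this classification actually happens in the study being criticized.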
I did not read the second paper.
Trump thought so too.
Now vaccines are getting banned and the GOP is gerrymandering the hell out of the country to ensure the end of the democratic process. Sure, let's do nothing and see where that brings us. Maybe people will magically come to their senses.
The same thing that has happened since time immemorial will continue to occur: the educated and able will physically move themselves away from risk, and others will suffer either by their own volition, by association, or by lot.
Is it? How does that work at scale?
Speech generally hasn’t been restricted broadly. The same concepts and ideas removed from YouTube still were available on many places (including here).
Yet we still have so many people believing falsehoods and outright lies. Even on this very topic of COVID, both sides present their “evidence” and truly believe they are right, no matter what the other person says.
It massively amplified the nuts. It brought it to the mainstream.
I'm a bit amazed seeing people still justifying it after all we've learned.
COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.
And to state my position like the root guy, I'm a progressive, pro-vaccine, medical science believer. I listen to my doctor and am skeptical if not dismissive of the YouTube "wellness" grifters selling scam supplements. I believe in science and research. I thought the worm pill people were sad if not pathetic. Anyone who gets triggered by someone wearing a mask needs to reassess their entire life.
But lockdowns went on way too long. Limits on behaviour went on way too long. Vaccine compliance measures were destructive the moment we knew vaccination had a negligible effect on spread. When well-intentioned people running platforms started silencing the imbeciles, it handed them a megaphone and made the problem much worse.
And now we're living with the consequences. Where we have a worm-addled halfwit directing medicine for his child-rapist pal.
Look at twitter before and after Musk, and tell me again that deplatforming doesn't work.
There's a reason you don't fan the flames of disinformation. Groups of people cannot be reasoned with like you can reason with an individual.
[1] https://systemicjustice.org/article/facebook-and-genocide-ho...
I agree, but how do you combat propaganda from Putin? Do you match him dollar for dollar? I am sure YouTube would like that, but who has deep enough pockets to counter the disinformation campaigns?
Similar issue with Covid... when you are in the middle of a pandemic, and dead bodies are piling up, and hospitals are running out of room, how do you handle misinformation spreading on social media?
This is the conundrum social media has created. In the past only the press, who were at least semi-responsible, had the ability to spread information on a massive scale. Social media changed that. Now anyone can spread information instantly on a massive scale, and often it is the conspiracy theories and incorrect information that people seek out.
"We were a bit naive: we thought the internet, with the availability of information, would make us all a lot more factual. The fact that people would seek out—kind of a niche of misinformation—we were a bit naive." -- Bill Gates to Oprah, on "AI and the Future of us".
Yes. An unfortunate conclusion I’m approaching (but have not reached, and frankly don’t want to reach) is the First Amendment doesn’t work in a country that’s increasingly illiterate and addicted to ad-powered algorithmic social media.
What are your suggestions for accomplishing this while also being compatible with the idea that government and big tech should not control ideas and speech?
Time delay. No content based restrictions. Just, like, a 2- to 24-hour delay between when a post or comment is submitted and when it becomes visible, with the user free to delete or change (in this case, the timer resets) their content.
I’d also argue for demonetising political content, but idk if that would fly.
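Roughly what I have in mind, as a toy sketch (hypothetical names, not any real platform's API):

    # Content-neutral publication delay: nothing is visible until the delay
    # elapses, and editing a pending post restarts the clock.
    import time

    PUBLISH_DELAY_SECONDS = 2 * 60 * 60  # e.g. 2 hours; could be anything up to 24

    class PendingPost:
        def __init__(self, text):
            self.text = text
            self.submitted_at = time.time()
            self.deleted = False

        def edit(self, new_text):
            # any change resets the timer, so edits also wait out the full delay
            self.text = new_text
            self.submitted_at = time.time()

        def delete(self):
            self.deleted = True

        def is_visible(self, now=None):
            now = time.time() if now is None else now
            return not self.deleted and (now - self.submitted_at) >= PUBLISH_DELAY_SECONDS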
Who decides what falls in this bucket? The government? That seems to go against the idea that government shouldn't restrict speech and ideas.
Congress for the first. Either the FCC or, my preference, private litigants for the second. (Treble damages for stupid suits, though.)
> For all content or just “political”?
The courts can already distinguish political speech from non-political speech. But I don’t trust a regulator to.
I’d borrow from the French. All content within N weeks of an election in the jurisdiction. (I was going to also say any content that mentions an elected official by name, but then we’ll just get meme names and nobody needs that.)
Bonus: electeds get constituent pressure to consolidate elections.
Alternative: these platforms already track trending topics. So an easy fix is to slow down trending topics. It doesn’t even need to be by that much, what we want is for people to stop and think and have a chance to reflect on what they do, maybe take a step away from their device while at it.
It's taking a sword to the surgery room where no scalpel has been invented yet.
We need better tools to combat dis/mis-information.
I wish I knew what that tool was.
Maybe 'inoculating information' that's specifically stickier than the dis/mis-info?
Neo-nazis and white nationalists went from their 3-4 replies per thread forums, 4chan posts, and Telegram channels, to now regularly reaching millions of people and getting tens of thousands of likes.
As a Danish person I remember how American media in the 2010s and early 2020s used to shame Denmark for being very right-wing on immigration. The average US immigration politics thread on X is worse than anything I have ever seen in Danish political discussions.
The reality is, I have personally seen what this type of uncontrolled anti-vax stuff does. The issue is that it's harder to disprove a negative with a positive than people realize.
The moment you are in the YouTube, TikTok, or whatever platform algorithm, you are fed a steady diet of this misinformation. When you then try to argue with actual factual studies, you get the typical response of "they already said that those studies are made up"... How do you fight that? Propaganda works by flooding the news, and over time people believe it.
That is the result of uncensored access, because most people do not have the time to really look up a scientific study. The number of negative channels massively outweighs the positive, fact-based channels, because the latter are "boring". It's the same reason why your evening news is 80% deaths, corruption, theft, politicians and taxes, or other negative world news. Because it has been proven that people take in negative news much more. Clickbait titles that are negative draw in people.
There is a reason why Holocaust denial is illegal in some countries. Because the longer some people can spew that, the more people actually start to believe it.
Yes, I am going to get roasted for this, but people are easily influenced and they are not as smart as they think they are. We have platforms that cater to people's short attention span with barely 1-3 minute clips. YouTube videos longer than 30 minutes are horrible for a youtuber's income, as people simply do not have the attention span, and that means lost income.
Why do we have laws like seatbelts, speed limits, and other "controls" over people? Because people left to their own devices can be extremely uncaring about their own family, others, even themselves.
Do I like the idea of censorship for the greater good? No. But there are so many who spew nonsense just to sell their powders and their homemade vitamin C solutions (made in China)... telling people things that may hurt or kill themselves, their family, or others.
Where is the line of that unbridled free speech? Silencing people works in the sense that you're delaying the flow of shit running down a creek. Will it stop completely? No, but the delay helps people downstream. Letting it run uninterrupted, hoping that a few people downstream with a mop will do all the work, yeah...
We only need to look at platforms like X, when "censorship" (moderation) got removed. Full of free speech, no limits, and it turned into a soak pit extremely fast (well, a bigger soak pit).
Not sure why I am writing this, because this is a heated topic, but all I can say is: I have seen the damage that anti-vax did to my family. And even to this day, that damage is still present. How a person who never had an issue with vaccinations, never had a bad reaction beyond a sore arm for a day, turned so skeptical of everything vaccination. All because those anti-vax channels got to her.
The anti-vax movement killed people. There is scientific study upon study showing how red states in the US ended up with more deaths over the same time periods. And yet, not a single person was ever charged for this... everyone simply accepted it and never looked back. Like it was a natural thing that people's grandparents and family members died who did not need to die.
The fact is that people have given up, and now accept letting those who often have financial interests spew nonsense as much as they like. Well, it's "normal".
I weep for the human race because we are not going to make it.
My biggest problem with YouTube isn't what it does/doesn't allow on its platform, it's that it will happily feed you a psychotic worldview if it keeps you on the site. I've had several family members go full conspiracy nut-job after engaging heavily with YouTube content.
I don't know what the answer is. I think many people would rightly argue that removing misinformation from the recommendation engine is synonymous with banning it. FWIW I'd be happy if recommendation engines generally were banned for being a societal cancer, but I'm probably in the minority here.
It actually does work. You need to remove ways for misinformation to spread, and suppressing a couple of big agents works very well.
- https://www.nature.com/articles/s41586-024-07524-8 - https://www.tandfonline.com/doi/full/10.1080/1369118X.2021.1... - https://dl.acm.org/doi/abs/10.1145/3479525 - https://arxiv.org/pdf/2212.11864
Obviously, the best solution would be prevention, by having good education systems and arming common people with the weapons to assess and criticize information, but we are kinda weak on that front.
The first amendment was written in the 1700s...
The US military also promoted anti-vax propaganda in the Philippines [0].
A lot of the comments here raise good points about silencing well meaning people expressing their opinion.
But information warfare is a fundamental part of modern warfare. And it's effective.
An American company or individual committing fraud can be dealt with in the court system. But we don't yet have a good remedy for how to deal with a military power flooding social media with information intentionally designed to mislead and cause harm for people who take it seriously.
So
> I think we have to realize silencing people doesn't work
it seems to have been reasonably effective at combating disinformation networks
> It just causes the ideas to metastasize
I don't think this is generally true. If you look at old disinformation campaigns like the idea that the US faked the moon landings, it's mostly confined to a small group of people who are prone to conspiracy thinking. The idea of a disinformation campaign is you make it appear like a crazy idea has broad support. Making it appear that way requires fake accounts or at least boosters who are in on the scheme. Taking that away means the ideas compete on their own merit, and the ideas are typically real stinkers.
[0] https://www.btimesonline.com/articles/167919/20240727/u-s-ad...
The problem is that the recommendation algorithms push their viewers into these echo chambers that are divorced from reality where all they see are these videos promoting misinformation. Google's approach to combating that problem was to remove the channels, but the right solution was, and still is today, to fix the algorithms to prevent people from falling into echo chambers.
I like the term disinformation better, since it can expand to the unfortunately more relevant dissenting information.
For whatever reason I guess I'm in that very rare group that genuinely watches everything from far-right racists, to communists, to mainstream media content, to science educational content, to conspiracy content, etc.
My YT feed is all over the place. The algorithms will serve you a very wide range of content if you want that, the issue is that most people don't. They want to hear what they already think.
So while I 100% support changing algorithms to encourage more diversity of views, I also think as a society we need to question why people don't naturally want to listen to more perspectives. Personally I get so bored hearing people basically echo what I think. I want to listen to people who say stuff I don't expect or haven't thought about before. But I'm in a very significant minority.
First, there's YouTube's decision of whether or not to allow potentially dangerous misinformation to remain on their site, and whether the government can or did require them to remove it.
Second, though, there's YouTube's much stronger editorial power: whether or not to recommend, advertise, or otherwise help people discover that content. Here I think YouTube most fairly deserves criticism or accolades, and it's also where YouTube pretends that the algorithm is magic and neutral and they cannot be blamed for actively pushing videos full of dangerous medical lies.
This is why perma bans are bad. I'd rather have a strike system before a temp ban, to give people some breathing room to navigate the inevitable incorrect automation. Even then, if the copyright system is anything to go by, this is going to hurt more than help.
The next Drain-o chug challenge "accident" is inevitable, at this rate.
Future tense?
On this site, we're trying for discussion in which people don't just bash each other with pre-existing talking points (and unprocessed rage). Such comments quickly flood the thread on a divisive topic like this one, so flagging them is essential to having HN operate as intended. To the extent possible at least.
(oh and btw, though it ought to go without saying, this has to do with the type of comment, not the view it's expressing. People should be able to make their substantive points thoughtfully, without getting flagged.)
Placing absolute trust in these organizations and restricting freedom of speech based on that is a very bootlicking, anti-freedom stance
Did they? I remember them revising their guidance, which seems like something one would expect during an emerging crisis, but I don't remember them directly contradicting themselves.
...but I'm not a YouTuber.
First of all, you can't separate a thing's content from the platform it's hosted on? Really?
Second of all, this is why
I'll just go do this again, and if you flag me it's on you; you have no standing to do it (the internet is supposed to be democratic, remember?)
https://rumble.com/v28x6zk-sasha-latypova-msc.-nsa-team-enig...
https://rumble.com/v3zh3fh-staggering-17m-deaths-after-covid...
https://rumble.com/vt62y6-covid-19-a-second-opinion.html
https://rumble.com/v2nxfvq-international-covid-summit-iii-pa...
I could go on. Feel free if you want to see more. :)
(Was it misinformation when Fauci said you shouldn't rush a vaccine or all hell breaks loose years later? Or when he intimated that masks wouldn't work for covid?)
The issue is that there is no mechanism for punishing people who spread dangerous misinformation. It's strange that it doesn't exist though, because you're allowed to sue for libel and slander. We know that it's harmful, because people will believe lies about a person, damaging their reputation. It's not clear why it can't be generalized to things that we have a high confidence of truth in and where lying is actively harmful.
The silver lining in this is that the conversation continued and will continue. I can see governments needing to try to get accurate and helpful information out in a crisis - and needing to pressure or ask more of private companies to do that. But I also like that we can reflect back and say: maybe that didn't work the way we wanted, or maybe it was heavy-handed.
In many countries, the government can do no wrong. There are no checks and balances.
The question is - should we still trust YouTube/Google? Is YouTube really some kind of champion of free speech? No. Is our current White House administration a champion of free speech? Hardly.
But hopefully we will still have a system that can have room for critique in the years to come.
Actual letter: https://judiciary.house.gov/sites/evo-subsites/republicans-j...
Good editorial: https://www.businessinsider.com/google-meta-congress-letter-...
- https://www.engadget.com/big-tech/youtube-may-reinstate-chan...
- https://arstechnica.com/gadgets/2025/09/youtube-will-restore...
Yes, I know about the Charlie Kirk firings etc.