
YouTube says it'll bring back creators banned for Covid and election content

https://www.businessinsider.com/youtube-reinstate-channels-banned-over-covid-content-policies-2025-9
160•delichon•3h ago

Comments

pessimizer•2h ago
Better article: https://www.businessinsider.com/youtube-reinstate-channels-b...

Actual letter: https://judiciary.house.gov/sites/evo-subsites/republicans-j...

Good editorial: https://www.businessinsider.com/google-meta-congress-letter-...

murphyslab•2h ago
Two articles that I found offered a well-rounded analysis:

- https://www.engadget.com/big-tech/youtube-may-reinstate-chan...

- https://arstechnica.com/gadgets/2025/09/youtube-will-restore...

topspin•2h ago
All those words, and no mention of Section 230, which is what this is really all about. Google can see which way the wind is blowing and they know POTUS will -- for better or worse -- happily sign any anti-"Big Tech censorship" bill that gets to his desk. They hope to preempt this.

Yes, I know about the Charlie Kirk firings etc.

dang•55m ago
Ok, we've changed the URL above to that first link from https://www.offthepress.com/youtube-will-let-users-booted-fo.... Thanks!
lesuorac•2h ago
Two years is a pretty long ban for conduct that isn't even illegal.

Although if they got banned at the start of Covid, during the Trump administration, then we're talking about 5 years.

Simulacra•1h ago
They went against a government narrative. This wasn't Google/Youtube banning so much as government ordering private companies to do so.
LeafItAlone•57m ago
And do you think the impetus behind this action happening now is any different? In both cases YouTube is just doing what the government wants.
JumpCrisscross•50m ago
> wasn't Google/Youtube banning so much as government ordering private companies to do so

No, it was not. It’s particularly silly to suggest this when we have live examples of such orders right now.

The companies were nudged. (And they were wrong to respond to public pressure.) The President, after all, has a “bully pulpit.” But there were no orders, no credible threats, and plenty of companies didn’t deplatform these folks.

starik36•46m ago
That was certainly the case with Twitter. It came out during the congressional hearings. FBI had a direct line to the decision makers.
JumpCrisscross•43m ago
> was certainly the case with Twitter

It was not. No threats were made, and Twitter didn’t blindly follow the FBI’s guidance.

The simple truth is the leftist elements that wanted to control the debate were there in the White House and in Twitter’s San Francisco offices. Nobody had to be coerced, they were coördinating.

brokencode•43m ago
A direct line to threaten decision makers? Or to point out possible misinformation spreaders?
spullara•44m ago
They literally had access to JIRA at Twitter so they could file tickets against accounts.
JumpCrisscross•40m ago
> literally had access to JIRA at Twitter so they could file tickets against accounts

I’m not disputing that they coördinated. I’m challenging that they were coerced.

We wouldn’t describe Fox News altering a script on account of a friendly call from Miller and friends as the “government ordering private companies” around. (Or, say, call Florida opening its criminal justice records to ICE the federal government ordering states around.) Twitter’s leadership and the Biden administration saw eye to eye. This is a story about a media monoculture and private censorship, not government censorship.

unethical_ban•13m ago
Do you think no nefarious nation state actors are on social media spinning disinformation?
whycome•2h ago
What exactly constituted a violation of a COVID policy?
perihelions•2h ago
According to Google's censorship algorithm: Michael Osterholm's podcast (he's a famous epidemiologist and was, at the time, a member of President Biden's own gold-star Covid-19 advisory panel).

https://x.com/cidrap/status/1420482621696618496 ("Our Osterholm Update podcast episode (Jul 22) was removed for “medical misinformation.”" (2021))

Most ironic thing I've ever seen. I still recall it perfectly, though it's been four years. Never, ever trust censorship algorithms or the people who control them: they are just dumb parrots that suppress all discussion of an unwanted topic, without thought or reason.

delichon•2h ago
My wake up moment was when they not only took down a Covid debate with a very well qualified virologist, but also removed references to it in the Google search index, not just for the YouTube link.
barbacoa•1h ago
Google went so far as to scan people's private Google Drives for copies of the documentary 'Plandemic' and delete them.
potsandpans•50m ago
Can you please provide evidence? I'm not saying I don't believe you. It's just... extraordinary claims etc
barbacoa•41m ago
https://reclaimthenet.org/google-drive-takes-down-user-file-...
carlosjobim•2h ago
Every opinion different from the opinion of "authorities". They documented it here:

https://blog.youtube/news-and-events/managing-harmful-vaccin...

From the two links in the post, Google fleshes it out in great detail, with many examples of forbidden thought.

PaulKeeble•1h ago
A lot of channels had to avoid even saying the word Covid. I only saw it return to use recently, at the end of last year. There were a variety of channels banned that shouldn't have been, such as some talking about Long Covid.
potsandpans•51m ago
Saying lab leak was true
moomoo11•2h ago
I think hardware and IP-level bans... should be banned.

I know that some services do this in addition to account ban.

ocdtrekkie•1h ago
Any service which allows user generated content and allows arbitrary IP addresses to create infinite accounts is guaranteed to be overrun with CSAM. It's practically a law of physics.
jjk166•52m ago
If you actually cared about CSAM you would want those posting it to self incriminate and then face consequences in real life at the hands of actual authorities. Websites banning such posters only serves to alert them that they need to improve their tactics and give them the opportunity to hide. Removing only the offending content and alerting authorities is the appropriate thing a website like Youtube should be doing.

Even if one does argue that CSAM should result in hardware and IP bans, there's no reason that can't be a sole exception to a wider prohibition on such bans.

ocdtrekkie•47m ago
Yes, we should let people "self-incriminate" with Tor and disposable email services...
JumpCrisscross•32m ago
> If you actually cared about CSAM you would want those posting it to self incriminate and then face consequences in real life

We don’t have the resources for this, even when the FBI isn’t being purged and sent to Home Depots. Unrestricting IPs means a boom for CSAM production and distribution.

cactusplant7374•1h ago
> From President Biden on down, administration officials “created a political atmosphere that sought to influence the actions of platforms based on their concerns regarding misinformation,” Alphabet said, claiming it “has consistently fought against those efforts on First Amendment grounds.”

This actually surprised me because I thought (and maybe still think) that it was Google employees that led the charge on this one.

softwaredoug•1h ago
It's in their interests now to throw Biden under the bus. There may be truth to this, but I'm sure it's exaggerated for effect.
HankStallone•1h ago
It was. At the time, they felt like they were doing the right thing -- the heroic thing, even -- in keeping dangerous disinformation away from the public view. They weren't shy about their position that censorship in that case was good and necessary. Not the ones who said it on TV, and not the ones who said it to me across the dinner table.

For Google now to pretend Biden twisted their arm is pretty rich. They'd better have a verifiable paper trail to prove that, if they expect anyone with a memory five years long to believe it.

dotnet00•1h ago
To be fair, even if they were being honest about Biden twisting their arm (I don't buy it), the timing makes it impossible to believe their claim.
CSMastermind•12m ago
Why wouldn't you buy it?

The Twitter files showed direct communications from the administration asking them to ban specific users like Alex Berenson, Dr. Martin Kulldorff, and Dr. Andrew Bostom: https://cbsaustin.com/news/nation-world/twitter-files-10th-i...

Meta submitted direct communications from the administration pressuring them to ban people as part of a congressional investigation: https://www.aljazeera.com/news/2024/8/27/did-bidens-white-ho...

It would be more surprising if they left Google alone.

softwaredoug•1h ago
I'm very pro-vaccines, I don't think the 2020 election was stolen. But I think we have to realize silencing people doesn't work. It just causes the ideas to metastasize. A lot of people will say all kinds of craziness, and you just have to let it ride so most of us can roll our eyes at it.
vkou•1h ago
> But I think we have to realize silencing people doesn't work.

We also tried letting the propaganda machine full-blast those lies on the telly for the past 5 years.

For some reason, that didn't work either.

What is going to work? And what is your plan for getting us to that point?

_spduchamp•6m ago
Algorithmic Accountability.

People can post all sorts of crazy stuff, but the algorithms do not need to promote it.

Countries can require Algorithmic Impact Assessments and set standards of compliance to recommended guidelines.

hash872•1h ago
It's their private property, they can ban or promote any ideas that they want to. You're free to not use their property if you disagree with that.

If 'silencing people' doesn't work- so online platforms aren't allowed to remove anything? Is there any limit to this philosophy? So you think platforms can't remove:

Holocaust denial? Clothed underage content? Reddit banned r/jailbait, but you think that's impermissible? How about clothed pictures of toddlers but presented in a sexual context? It would be 'silencing' if a platform wanted to remove that from their private property? Bomb or weapons-making tutorials? Dangerous fads that idiotic kids pass around on TikTok, like the blackout game? You're saying it's not permissible for a platform to remove dangerous instructionals specifically targeted at children? How about spam? Commercial advertising is legally speech in the US. Platforms can't remove the gigantic quantities of spam they suffer from every day?

Where's the limiting principle here? Why don't we just allow companies to set their own rules on their own private property, wouldn't that be a lot simpler?

softwaredoug•1h ago
I used to believe this. But I feel more and more we need to promote a culture of free speech that goes beyond the literal first amendment. We have to tolerate weird and dangerous ideas.
andrewmcwatters•1h ago
Better out in the open with refutations or warnings than in the dark where concepts become physical dangers.
benjiro•24m ago
Refuting does not work... You can throw scientific study upon study, doctor upon doctor... negatives run deeper than positives.

In the open, it becomes normalized, and it draws in more people. Would you rather have some crazies in the corner, or 50% of a population that believes something false because it became normalized?

The only people benefiting from those dark concepts are those with financial gains. They make money from it, and push the negatives to sell their products and cures. Those that fight against it do not gain from it, and it costs them time/money. That is why it's a losing battle.

rahidz•1h ago
"Where's the limiting principle here?"

How about "If the content isn't illegal, then the government shouldn't pressure private companies to censor/filter/ban ideas/speech"?

And yes, this should apply to everything from criticizing vaccines, denying election results, being woke, being not woke, or making fun of the President on a talk show.

Not saying every platform needs to become like 4chan, but if one wants to be, the feds shouldn't interfere.

drak0n1c•53m ago
Read the article, along with this one https://reclaimthenet.org/google-admits-biden-white-house-pr...

In this case it wasn't a purely private decision.

TeeMassive•29m ago
> It's their private property, they can ban or promote any ideas that they want to. You're free to not use their property if you disagree with that.

1) They are public corporations, legal creations of the state, and benefit from certain protections of the state. They also have privileged access to some public infrastructure that other private companies do not have.

2) By acting at the behest of the government, they were agents of the government for free speech and censorship purposes.

3) Being monopolies in their respective markets, they must respect certain obligations, the same way public utilities do.

andy99•1h ago
The more important point (and this is really like a high school civics debate) is that the government and/or a big tech company shouldn't decide what people are "allowed" to say. There's tons of dumb stuff online, the only thing dumber is the state dictating how I'm supposed to think. People seem to forget that sometimes someone they don't agree with is in power. What if they started banning tylenol-autism sceptical accounts?
mapontosevenths•1h ago
> the government and/or a big tech company shouldn't decide what people are "allowed" to say.

That "and/or" is doing a lot of work here. There's a huge difference between government censorship and forcing private companies to host content they don't want to host on servers they own.

Then again, Alphabet is now claiming they did want to host it and mean old Biden pressured them into pulling it, so if we buy that, maybe it doesn't matter.

> What if they started banning tylenol-autism sceptical accounts?

What if it's pro-cannibalism or pedophilia content? Everyone has a line, we're all just arguing about where exactly we think that line should be.

MostlyStable•55m ago
It can simultaneously be legal/allowable for them to ban speech, and yet also the case that we should criticize them for doing so. The first amendment only restricts the government, but a culture of free speech will also criticize private entities for taking censorious actions. And a culture of free speech is necessary to make sure that the first amendment is not eventually eroded away to nothing.
lkey•16m ago
Or it might be the case that that 'culture' is eroding the thing it claims to be protecting. https://www.popehat.com/p/how-free-speech-culture-is-killing...
plantwallshoe•13m ago
Isn’t promoting/removing opinions you care about a form of speech?

If I choose to put a Kamala sign in my yard and not a Trump sign, that’s an expression of free speech.

If the marketing company I own decides to not work for causes I don’t personally support, that’s free speech.

If the video hosting platform I’m CEO of doesn’t host unfounded anti-vax content because I think it’s a bad business move, is that not also free speech?

lmz•7m ago
Agreed. If I have a TV network and think these anti-government hosts on my network are bad for business, that is also freedom of speech.
mc32•21m ago
The thing is that people will tell you it wasn’t actually censorship, because for them it was only the government being a busybody, nosey government telling the tech corps about a select number of people violating their terms (nudge nudge, please do something)… so I think the and/or is important.
briHass•18m ago
The line should be what is illegal, which, at least in the US, is fairly permissive.

The legal process already did all the hard work of reaching consensus/compromise on where that line is, so just use that. At least with the legal system, there's some degree of visibility and influence possible by everyone. It's not some ethics department silently banning users they don't agree with.

JumpCrisscross•36m ago
> the government and/or a big tech company shouldn't decide what people are "allowed" to say

This throws out spam and fraud filters, both of which are content-based moderation.

“Nobody moderates anything” unfortunately isn’t a functional option. Particularly if the company has to sell ads.

zetazzed•19m ago
Does Disney have a positive obligation to show animal cruelty snuff films on Disney Plus? Or are they allowed to control what people say on their network? Does Roblox have to allow XXX games showing non-consensual sex acts on their site, or are they allowed to control what people say on their network? Can WebMD decide not to present articles claiming that homeopathy is the ultimate cure-all? Does X have to share a "trending" topic about the refusal to release the Epstein files?

The reason we ban government censorship is so that a private actor can always create their own conspiracy theory + snuff film site if they want, and other platforms are not obligated to carry content they find objectionable. Get really into Rumble or Truth Social or X if you would like a very different perspective from Youtube's.

heavyset_go•12m ago
This is just a reminder that we're both posting on one of the most heavily censored, big tech-sponsored spaces on the internet, and arguably, that's what allows you to have your civics debate in earnest.

What you are arguing for is a dissolution of HN and sites like it.

asadotzler•3m ago
No one in Big Tech decides what you are allowed to say, they can only withhold their distribution of what you say.

As a book publisher, should I be required to publish your furry smut short stories? Of course not. Is that infringing on your freedom of speech? Of course not.

kypro•1h ago
I agree. People today are far more anti-vaccine than they were a few years ago, which is kinda crazy when you consider we went through a global pandemic where one of the only things that actually worked to stop people dying was the rollout of effective vaccines.

I think if public health bodies just laid out the data they had honestly (good and bad) and said that they think most people should probably take it, but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.

vkou•1h ago
> but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.

Nah, the same grifters who stand to make a political profit off turning everything into a wedge issue would have still hammered right into it. They've completely taken over public discourse on a wide range of subjects that go well beyond COVID vaccines.

As long as you can make a dollar by telling people that their (and your) ignorance is worth just as much as - or more than - someone else's knowledge, you'll find no shortage of listeners for your sermon. And that popularity will build its own social proof. (Millions of fools can't all be wrong, after all.)

kypro•1h ago
I agree. Again the vast majority would have gotten the vaccine.

There's always going to be people for all kinds of reasons pushing out bad ideas. That's part of the trade-off of living in a free society where there is no universal "right" opinion the public must hold.

> They've completely taken over public discourse on a wide range of subjects

Most people are not anti-vax. If "they've" "taken over public discourse" in other subjects to the point that you now hold a minority opinion, you should consider whether "they" are right or wrong and why so many people believe what they do.

If you can't understand their position and disagree, you should reach out to people in a non-confrontational way, understand their position, then explain why you disagree (if you still do at that point). If we all do a better job at this, we'll converge towards truth. If you think talking and debate isn't the solution to disagreements, I'd argue you don't really believe in our democratic system (which isn't a judgement).

trollbridge•1h ago
And the attempts at censorship have played a part in people drifting towards being more vaccine-hesitant or anti-vaccine.

It's often a lot better to just let kooks speak freely.

vFunct•43m ago
It's less about censorship and more about more people becoming middle-class and therefore thinking they're smarter than researchers.

There is nobody more confident in themselves than the middle-class.

khazhoux•33m ago
That’s a very confident statement presented without a hint of evidence.
dotnet00•1h ago
I think the anti-vax thing is mostly because the average Western education level is just abysmal.

Add in a healthy dose of subconsciously racist beliefs about how advanced Western society is (plus ideas of how this means they must be smart too) and how catching diseases preventable by vaccines is only a brown people thing.

Basically, it's easy to be anti-vax when the disease isn't in your face and you have an out-group to blame even if it does end up in your face. (A common excuse by anti-vaxxers I see when measles is in the news is that the immigrants are bringing it in and should be blamed instead of anti-vaxxers.)

xdennis•1h ago
> I think the anti-vax thing is mostly because the average Western education level is just abysmal.

What does the West have to do with it? Non-westerners are even more into folk medicine and witch doctors.

andrewmcwatters•58m ago
And yet, SEA and others are still better educated than us.
LeafItAlone•29m ago
>SEA and others are still better educated than us.

Honest question: is this true? What’s the data around this? If it is true, why are there so many people from SEA in American universities? Wouldn’t they stay in their home country or another in the area?

I’m truly trying to learn here and square this statement with what I’ve come to understand so far.

dotnet00•49m ago
They're into folk medicine, but their anti-vax issues generally come from people who don't have any means of knowing better (i.e. never been to school, dropped out at a very early grade, isolated, not even literate). Typically just education and having a doctor or a local elder respectfully explain to them that the Polio shot will help prevent their child from being paralyzed for life is enough to convince them.

Meanwhile the 'educated' Westerner, to whom Polio is a third-world disease, will convince themselves that the doctor is lying for some reason, will choose to take the 75% chance of an asymptomatic infection because they don't truly appreciate how bad it can otherwise be, or will use their access to a vast collection of humanity's information to cherry pick data that supports their position (most likely while also claiming to seek debate despite not intending to seriously consider opposing evidence).

logicchains•1h ago
The anti-vax thing is because every single comparative study of vaccinated and unvaccinated children found a greater rate of developmental disorders in vaccinated children. They're also the only products for which you're not allowed to sue the manufacturers for liability, and the justification given by the manufacturers for requesting this liability protection was literally that they'd be sued out of business otherwise. If they were as safe as other treatments they wouldn't need a blanket liability immunity.

Anthony R. Mawson, et al., “Pilot Comparative Study on the Health of Vaccinated and Unvaccinated 6 to 12-year-old U.S. Children,” Journal of Translational Science 3, no. 3 (2017): 1-12, doi: 10.15761/JTS.1000186

Anthony R. Mawson et al., “Preterm Birth, Vaccination and Neurodevelopmental Disorders: A Cross-Sectional Study of 6- to 12-Year-Old Vaccinated and Unvaccinated Children,” Journal of Translational Science 3, no. 3 (2017): 1-8, doi:10.15761/JTS.1000187.

Brian Hooker and Neil Z. Miller, “Analysis of Health Outcomes in Vaccinated and Unvaccinated Children: Developmental Delays, Asthma, Ear Infections and Gastrointestinal Disorders,” SAGE Open Medicine 8, (2020): 2050312120925344, doi:10.1177/2050312120925344.

Brian Hooker and Neil Z. Miller, “Health Effects in Vaccinated versus Unvaccinated Children,” Journal of Translational Science 7, (2021): 1-11, doi:10.15761/JTS.1000459.

James Lyons-Weiler and Paul Thomas, “Relative Incidence of Office Visits and Cumulative Rates of Billed Diagnoses along the Axis of Vaccination,” International Journal of Environmental Research and Public Health 17, no. 22 (2020): 8674, doi:10.3390/ijerph17228674.

James Lyons-Weiler, “Revisiting Excess Diagnoses of Illnesses and Conditions in Children Whose Parents Provided Informed Permission to Vaccinate Them,” International Journal of Vaccine Theory, Practice and Research 2, no. 2 (2022): 603-618, doi:10.56098/ijvtpr.v2i2.59.

NVKP, “Diseases and Vaccines: NVKP Survey Results,” Nederlandse Vereniging Kritisch Prikken, 2006, accessed July 1, 2022.

Joy Garner, “Statistical Evaluation of Health Outcomes in the Unvaccinated: Full Report,” The Control Group: Pilot Survey of Unvaccinated Americans, November 19, 2020.

Joy Garner, “Health versus Disorder, Disease, and Death: Unvaccinated Persons Are Incommensurably Healthier than Vaccinated,” International Journal of Vaccine Theory, Practice and Research 2, no. 2, (2022): 670-686, doi: 10.56098/ijvtpr.v2i2.40.

Rachel Enriquez et al., “The Relationship Between Vaccine Refusal and Self-Report of Atopic Disease in Children,” The Journal of Allergy and Clinical Immunology 115, no. 4 (2005): 737-744, doi:10.1016/j.jaci.2004.12.1128.

jawarner•46m ago
Mawson et al. 2017 (two papers) – internet survey of homeschoolers recruited from anti-vaccine groups; non-random, self-reported, unverified health outcomes. Retracted by the publisher after criticism.

Hooker & Miller 2020/2021 – analysis of “control group” data also from self-selected surveys; same methodological problems.

Lyons-Weiler & Thomas 2020, 2022 – data from a single pediatric practice run by one of the authors; serious selection bias.

Joy Garner / NVKP surveys – activist-run online surveys with no verification.

Enriquez et al. 2005 – a small cross-sectional study about allergy self-reports, not about overall neurodevelopment.

Large, well-controlled population studies (Denmark, Finland, the U.S. Vaccine Safety Datalink, etc.) comparing vaccinated vs. unvaccinated children show no increase in autism, neurodevelopmental disorders, or overall morbidity attributable to recommended vaccines.

MSM•45m ago
I picked one at random (NVKP, "Diseases and Vaccines: NVKP Survey Results") and, while I needed to translate it to read it, it's clear (and loud!) about not actually being a scientific study.

"We fully realize that a survey like this, even on purely scientific grounds, is flawed on all counts. The sample of children studied is far too small and unrepresentative, we didn't use control groups, and so on."

Turns out the NVKP roughly translates to "Dutch Organization for those critical towards vaccines."

I understand being skeptical about vaccines, but the skepticism needs to go both ways.

TimorousBestie•43m ago
> Anthony R. Mawson, et al., “Pilot Comparative Study on the Health of Vaccinated and Unvaccinated 6 to 12-year-old U.S. Children,” Journal of Translational Science 3, no. 3 (2017): 1-12, doi: 10.15761/JTS.1000186

Retracted: https://retractionwatch.com/2017/05/08/retracted-vaccine-aut...

If you edit down your list to journal articles that you know to be valid and unretracted, I will reconsider looking through it. However, journal access in general is too expensive for me to bother reading retracted articles.

lkey•43m ago
"If they were as safe as other treatments they wouldn't need a blanket liability immunity." Citation very much needed for this inference.

Even if I granted every single paper's premise here. I'd still much rather have a living child with a slightly higher chance of allergies or asthma or <insert survivable condition here> than a dead child. How quickly we forget how bad things once were. Do you dispute that vaccines also accounted for 40% of the decline in infant mortality over the last 50 years? And before that, TB, Flu, and Smallpox killed uncountably many people. Vaccines are a public good and one of the best things we've ever created as a species.

Do you also have theories about autism you'd like to share with the class?

TimorousBestie•32m ago
A very good point. These studies should be comparing QALYs (quality-adjusted life years, a measure of disease burden) instead of relative prevalence of a handful of negative outcomes, the latter of which is much more vulnerable to p-hacking.
conception•25m ago
Here’s where “bad ideas out in the open get corrected” is now being tested. There are four really good refutations of your evidence, outside of the unspoken argument that “perhaps vaccines cause some measurable bad outcomes, but compare them to measles; and without herd immunity, vaccinations aren’t nearly as useful.”

So the important question is: Are you now going to say “well, I guess i got some bad data and i have to go back and review my beliefs” or dig in?

tnias23•19m ago
The studies you cite are the typical ones circulated by antivaxers and are not considered credible by the medical community due to severe methodological flaws, undisclosed biases, retractions, etc.

To the contrary, high quality studies consistently show that vaccines are not linked to developmental disability or worse health outcomes.

kypro•54m ago
Anti-vax has never really been a thing though. I don't know what the data is these days, but it used to be like 1% of the population who were anti-vax.

We have the same thing going on with racism in the West where people are convinced racism is a much bigger problem than it actually is.

And whether it's anti-vax or racist beliefs, when you start attacking people for holding these views you always end up inadvertently encouraging people to start asking why that is and they end up down rabbit holes.

No one believes peas cause cancer, for example, but I guarantee one of the best ways to make people start believing peas cause cancer is for the media to start talking about how some people believe that peas do cause cancer, then for sites like YouTube and Facebook to start banning people who talk about it. Because if they allow people to talk about UFOs and flat Earth conspiracies, why are they banning people for suggesting that peas cause cancer? Is there some kind of conspiracy going on funded by big agriculture? You can see how this type of thinking happens.

dotnet00•28m ago
Anti-vax was enough of an issue that vaccine mandates were necessary for Covid.

It also isn't convincing to claim that racism isn't a big problem in the West, given all the discourse around H1Bs and Indians (the Trump base has been pretty open on this one, with comments on JD Vance's wife, the flood of anti-Indian racism on social media, and recently the joy taken in attempting to interfere with Indians forced to fly back to the US in a hurry due to the lack of clarity on the H1B thing), how ICE is identifying illegals, a senator openly questioning the citizenship of a brown mayoral candidate, and so on.

I agree that denying something is the easiest way to convince people of the opposite, but it's also understandable when social media companies decide to censor advice from well known individuals that people should do potentially harmful things like consume horse dewormer to deal with Covid. Basically, it's complicated, though I would prefer to lean towards not censoring such opinions.

mrcwinn•21m ago
If that were the case, wouldn’t we see vaccine skepticism in poorly educated, racist non-Western nations?
braiamp•5m ago
You don't see it there, because it's in their faces. Or, more accurately, in our faces. I live in such a country, and we would kill to have our kids vaccinated. We live with these diseases, so we aren't so stupid as to fall for misinformation.
browningstreet•6m ago
I have doctor friends in the anti-vaccine, anti mRNA camp. I'm not with them, and I haven't found their evidence convincing, but I'd more readily ascribe a lot of these digressions to conspiracy oriented, argument slicing, social media fueled mis-alignments.

You hate one thing in a message from a side, and you switch sides.. without accounting for all the things the other side stands for. It's a complicated inversion of single-issue purity tests.

logicchains•1h ago
>where one of the only things that actually worked to stop people dying was the roll out of effective vaccines.

The only reason you believe that is that all information to the contrary was systematically censored and removed from the media you consume. The actual data doesn't support it; there are even cases where it increased mortality, like https://pmc.ncbi.nlm.nih.gov/articles/PMC11278956/ , and increased the chance of future Covid infections, like https://pubmed.ncbi.nlm.nih.gov/39803093/ .

rpiguy•55m ago
I appreciate you.

People have become more anti-Vax because the Covid vaccines were at best ineffective and as you said anything contra-narrative is buried or ignored.

If you push a shitty product and force people to take it to keep their jobs it’s going to turn them into skeptics of all vaccines, even the very effective ones.

More harm than good was done there. The government should have approved them for voluntary use so the fallout would not have been so bad.

cynicalkane•11m ago
This is typical of Covid conspiracy theorists, or conspiracy theorists of any sort: one or two papers on one side prove something, but an overwhelming mountain of evidence on the other side does not prove something. The theorist makes no explanation as to how a planetful of scientists missed the obvious truth that some random dudes found; they just assert that it happened, or make some hand-waving explanation about how an inexplicable planet-wide force of censors is silencing the few randos who have the truth.

The first paper seems to claim a very standard cohort study is subject to "immortal time bias", an effect whereby measuring outcomes can seem to change them.

The typical example of sampling-time bias is that slow-growing cancers are more survivable than fast-growing ones, but also more likely to be caught by a screening, giving a correlation between screening and survivability. So you get a time effect where fast-growing cancers are less likely to end up in the measurement. But when neither outcome changes whether a case gets overlooked in this fashion, there's no measurement-time bias. The authors do not explain why measurement time would have anything to do with detecting or not detecting Covid death rates, in the abstract or anywhere else in the paper, because they are idiot quacks who want to adjust statistics to give the answer they want for no justifiable reason.

I did not read the second paper.

tonfreed•1h ago
The best disinfectant is sunlight. I'm similarly appalled by some of the behaviour after a certain political activist was murdered, but I don't want them to get banned or deplatformed. I'm hoping what we're seeing here is a restoration of the ability to disagree with each other
slater-•1h ago
>> The best disinfectant is sunlight.

Trump thought so too.

thrance•1h ago
How's that working out? The worst ideas of the 20th century are resurfacing in plain sunlight because the Dems couldn't pull their heads out of the sand and actually fight them.

Now vaccines are getting banned and the GOP is gerrymandering the hell out of the country to ensure the end of the democratic process. Sure, let's do nothing and see where that brings us. Maybe people will magically come to their senses.

andrewmcwatters•59m ago
Well, people literally died. So, I think we all know how it played out.

The same thing since time eternal will continue to occur: the educated and able will physically move themselves from risk and others will suffer either by their own volition, or by association, or by lot.

LeafItAlone•35m ago
>The best disinfectant is sunlight.

Is it? How does that work at scale?

Speech generally hasn’t been restricted broadly. The same concepts and ideas removed from YouTube still were available on many places (including here).

Yet we still have so many people believing falsehoods and outright lies. Even on this very topic of COVID, both sides present their “evidence” and truly believe they are right, no matter what the other person says.

TeeMassive•28m ago
What's your alternative? The opposite is state dictated censorship and secrecy and those have turned very wrong every single time.
LeafItAlone•21m ago
I honestly don’t know. My libertarian foundation wants me to believe that any and all ideas should be able to be spread. But with the technological and societal changes of the past 10-15 years, we’ve seen how much of a danger this can be too. A lie or mistrust can be spread faster than ever, to a wider audience than previously possible. I don’t have a solution, but what we have now is clearly not working.
Aloha•1h ago
I think it made sense as a tactical choice at the moment, just like censorship during wartime - I dont think it should go on forever, because doing so is incompatible with a free society.
llm_nerd•16m ago
It didn't even make sense at the time. It tainted everything with the suspicion that the official, accepted truth needed to suppress alternatives to win the battle of minds. It was disastrous, and it is astonishing seeing people (not you, but in these comments) still trying to paint it as a good choice.

It massively amplified the nuts. It brought it to the mainstream.

I'm a bit amazed seeing people still justifying it after all we've learned.

COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.

And to state my position like the root guy, I'm a progressive, pro-vaccine, medical science believer. I listen to my doctor and am skeptical if not dismissive of the YouTube "wellness" grifters selling scam supplements. I believe in science and research. I thought the worm pill people were sad if not pathetic. Anyone who gets triggered by someone wearing a mask needs to reassess their entire life.

But lockdowns went on way too long. Limits on behaviour went on way too long. Vaccine compliance measures were destructive the moment we knew vaccination had a negligible effect on spread. When platforms run by "good intentions" people started silencing the imbeciles, it handed them a megaphone and made the problem much worse.

And now we're living in the consequences. Where we have a worm-addled halfwit directing medicine for his child-rapist pal.

thrance•1h ago
Sure, let the right-wing propaganda machine churn lies and misinformation full-blast, maybe people will magically come to their senses and realize that, no, vaccines and paracetamol don't cause autism, or that the 2020 election wasn't stolen.

Look at twitter before and after Musk, and tell me again that deplatforming doesn't work.

heavyset_go•1h ago
When the pogroms[1] start, it will be a luxury to let it ride out so you can roll your eyes at it.

There's a reason you don't fan the flames of disinformation. Groups of people cannot be reasoned with like you can reason with an individual.

[1] https://systemicjustice.org/article/facebook-and-genocide-ho...

breadwinner•50m ago
> silencing people doesn't work

I agree, but how do you combat propaganda from Putin? Do you match him dollar for dollar? I am sure YouTube would like that, but who has deep enough pockets to counter the disinformation campaigns?

Similar issue with Covid... when you are in the middle of a pandemic, and dead bodies are piling up, and hospitals are running out of room, how do you handle misinformation spreading on social media?

JumpCrisscross•50m ago
Slow down our algorithmic hell hole. Particularly around elections.
breadwinner•44m ago
If the government asks private companies to do that, then that's a violation of 1st amendment, isn't it?

This is the conundrum social media has created. In the past only the press, who were at least semi-responsible, had the ability to spread information on a massive scale. Social media changed that. Now anyone can spread information instantly on a massive scale, and often it is the conspiracy theories and incorrect information that people seek out.

"We were a bit naive: we thought the internet, with the availability of information, would make us all a lot more factual. The fact that people would seek out—kind of a niche of misinformation—we were a bit naive." -- Bill Gates to Oprah, on "AI and the Future of us".

JumpCrisscross•42m ago
> If the government asks private companies to do that, then that's a violation of 1st amendment, isn't it?

Yes. An unfortunate conclusion I’m approaching (but have not reached, and frankly don’t want to reach) is the First Amendment doesn’t work in a country that’s increasingly illiterate and addicted to ad-powered algorithmic social media.

LeafItAlone•40m ago
>Slow down our algorithmic hell hole.

What are your suggestions on accomplishing this while also being compatible with the idea that government and big tech should not control ideas and speech?

JumpCrisscross•38m ago
> What are your suggestions on accomplishing this while also bent compatible with the idea that government and big tech should not control ideas and speech?

Time delay. No content based restrictions. Just, like, a 2- to 24-hour delay between when a post or comment is submitted and when it becomes visible, with the user free to delete or change (in this case, the timer resets) their content.

I’d also argue for demonetising political content, but idk if that would fly.

LeafItAlone•25m ago
Ok, but how does that get implemented? Not technically, but who makes it happen and enforces the rules? For all content or just “political”? Who decides what’s “political”? Information about the disease behind a worldwide pandemic isn’t inherently “political”, but somehow it became so.

Who decides what falls in this bucket? The government? That seems to go against the idea of not restricting speech and ideas.

JumpCrisscross•23m ago
> who makes it happen and enforces the rules?

Congress for the first. Either the FCC or, my preference, private litigants for the second. (Treble damages for stupid suits, though.)

> For all content or just “political”?

The courts can already distinguish political speech from non-political speech. But I don’t trust a regulator to.

I’d borrow from the French. All content within N weeks of an election in the jurisdiction. (I was also going to say any content that mentions an elected official by name, but then we’ll just get meme names and nobody needs that.)

Bonus: electeds get constituent pressure to consolidate elections.

Alternative: these platforms already track trending topics, so an easy fix is to slow down trending topics. It doesn’t even need to be by much; what we want is for people to stop and think and have a chance to reflect on what they do, maybe take a step away from their device while at it.

TeeMassive•37m ago
Have you heard about TikTok? And you think governments' intelligence agencies are not inserting their agents in key positions at big tech companies?
altruios•31m ago
Censorship is a tool to combat misinformation.

It's taking a sword to the surgery room where no scalpel has been invented yet.

We need better tools to combat dis/mis-information.

I wish I knew what that tool was.

Maybe 'inoculating information' that's specifically stickier than the dis/mis-info?

deegles•34m ago
no, letting misinformation persist is counterproductive because of the illusory truth effect. the more people hear it, the more they think (consciously or not) "there must be something to this if it keeps popping up"
NullCascade•19m ago
Elon Musk's takeover of X is already a good example of what happens with unlimited free speech and unlimited reach.

Neo-nazis and white nationalists went from their 3-4 replies per thread forums, 4chan posts, and Telegram channels, to now regularly reaching millions of people and getting tens of thousands of likes.

As a Danish person I remember how American media in the 2010s and early 2020s used to shame Denmark for being very right-wing on immigration. The average US immigration politics thread on X is worse than anything I have ever seen in Danish political discussions.

lkey•33m ago
I think you are granting false neutrality to this speech. These misinfo folks are always selling a cure to go with their rejection of medicine. It's a billion dollar industry built off of spreading fear and ignorance, and youtube doesn't have any obligation to host their content. As an example, for 'curing' autism, the new grift is reject Tylenol and buy my folic acid supplement to 'fix' your child. Their stores are already open and ready.
mvdtnz•31m ago
And silencing these people only lends credence to their "they don't want you to know this" conspiracy theories. Because at that point it's not a theory, it's a proven fact.
lkey•21m ago
These people will claim they were 'silenced' regardless. Even as they appear with their published bestseller about being silenced on every podcast and news broadcast under the sun, they will speak of the 'conspiracy' working against them at every step. The actual facts at hand almost never matter. Even at a press conference where the President is speaking on your behalf they'll speak of the 'groups' that are 'against' them, full of nefarious purpose. There is no magical set of actions that changes the incentive they have to lie, or believe lies. (except regulation of snake oil, which is not going to happen any time soon)
mvdtnz•16m ago
And most people roll their eyes and don't believe it. Which is why it's a good idea not to make it true.
lkey•29m ago
To finish the thought, scientists at the CDC (in the before times) were not making money off of their recommendations, nor were they making youtube videos as a part of their day job. There's a deep asymmetry here that's difficult to balance if you assume the premise that 'youtube must accept every kind of video no matter what, people will sort themselves out'. Reader, they will not.
benjiro•30m ago
Funny thing: several people who responded in disagreement got grayed out (i.e. downvoted ... as in censored).

The reality is, I have personally seen what this type of uncontrolled anti-vax stuff does. The issue is that it's harder to disprove a negative with a positive than people realize.

The moment you are in the YouTube, TikTok, or whatever platform algorithm, you are fed a steady diet of this misinformation. When you then try to argue with actual factual studies, you get the typical response of "they already said those studies are made up"... How do you fight that? Propaganda works by flooding the news; over time, people believe it.

That is the result of uncensored access, because most people do not have the time to really look up a scientific study. Negative channels massively outnumber positive, fact-based channels because the latter are "boring". It's the same reason your evening news is 80% deaths, corruption, theft, politicians and taxes, or other negative world news: it has been proven that people take in negative news much more. Negative clickbait titles draw people in.

There is a reason why holocaust denial is illegal in countries. Because the longer some people can spew that, the more people actually start to believe it.

Yes, I am going to get roasted for this, but people are easily influenced and not as smart as they think they are. We have platforms that cater to short attention spans with barely 1-3 minute clips. YouTube videos longer than 30 minutes are horrible for a youtuber's income, as people simply do not have the attention span.

Why do we have seatbelt laws, speed limits, and other "control" over people? Because people, left to their own devices, can be extremely uncaring about their own family, others, even themselves.

Do I like the idea of censorship for the greater good? No. But there are so many who spew nonsense just to sell their powders and their homemade vitamin C solutions (made in China)... telling people things that may hurt or kill themselves, their family, or others.

Where is the line of that unbridled free speech? Silencing people works in the sense that you're delaying the flow of shit running down a creek. Will it stop completely? No, but the delay helps people downstream. Letting it run uninterrupted, hoping that a few people downstream with a mop will do all the work, yeah ...

We only need to look at platforms like X when "censorship" (moderation) got removed. Full of free speech, no limits, and it turned into a soak pit extremely fast (well, a bigger soak pit).

Not sure why I am writing this, because this is a heated topic, but all I can say is that I have seen the damage anti-vax content did to my family. Even to this day, that damage is still present. A person who never had an issue with vaccinations, never had a bad reaction beyond a sore arm for a day, turned skeptical of everything vaccine-related. All because those anti-vax channels got to her.

The anti-vax movement killed people. There is study upon study on how red states in the US ended up with higher death rates over the relevant time periods. And yet not a single person was ever charged for this; everyone simply accepted it and never looked back. Like it was a natural thing that people's grandparents and family members died who did not need to die.

People have given up, and now accept letting those with financial interests spew nonsense as much as they like. Well, it's "normal".

I weep for the human race because we are not going to make it.

yojo•14m ago
I think there's a difference between silencing people, and having an algorithm that railroads people down a polarization hole.

My biggest problem with YouTube isn't what it does/doesn't allow on its platform, it's that it will happily feed you a psychotic worldview if it keeps you on the site. I've had several family members go full conspiracy nut-job after engaging heavily with YouTube content.

I don't know what the answer is. I think many people would rightly argue that removing misinformation from the recommendation engine is synonymous with banning it. FWIW I'd be happy if recommendation engines generally were banned for being a societal cancer, but I'm probably in the minority here.

dawnerd•9m ago
It also turns into a talking point for them. A lot of these weird conspiracies would have naturally died out if some people didn’t try to shut them down so much.
braiamp•7m ago
> But I think we have to realize silencing people doesn't work

It actually does work. You need to remove ways for misinformation to spread, and suppressing a couple of big agents works very well.

- https://www.nature.com/articles/s41586-024-07524-8 - https://www.tandfonline.com/doi/full/10.1080/1369118X.2021.1... - https://dl.acm.org/doi/abs/10.1145/3479525 - https://arxiv.org/pdf/2212.11864

Obviously, the best solution would be prevention, by having good education systems and arming common people with the weapons to assess and criticize information, but we are kinda weak on that front.

sazylusan•7m ago
Perhaps free speech isn't the problem, but free speech x algorithmic feeds is? As we all know the algorithm favors the dramatic, controversial, etc. That creates an uneven marketplace for free speech where the most subversive and contrarian takes essentially have a megaphone over everyone else.
sazylusan•4m ago
Building on that, the crazy person spouting conspiracy theories in the town square, who would have been largely ignored in the past, suddenly becomes the most visible.

The first amendment was written in the 1700s...

ants_everywhere•5m ago
These policies were put in place because the anti-vax and election skepticism content was being promoted by military intelligence organizations that were trying to undermine democracy and public health in the US.

The US military also promoted anti-vax propaganda in the Philippines [0].

A lot of the comments here raise good points about silencing well meaning people expressing their opinion.

But information warfare is a fundamental part of modern warfare. And it's effective.

An American company or individual committing fraud can be dealt with in the court system. But we don't yet have a good remedy for how to deal with a military power flooding social media with information intentionally designed to mislead and cause harm for people who take it seriously.

So

> I think we have to realize silencing people doesn't work

it seems to have been reasonably effective at combating disinformation networks

> It just causes the ideas to metastasize

I don't think this is generally true. If you look at old disinformation campaigns like the idea that the US faked the moon landings, it's mostly confined to a small group of people who are prone to conspiracy thinking. The idea of a disinformation campaign is you make it appear like a crazy idea has broad support. Making it appear that way requires fake accounts or at least boosters who are in on the scheme. Taking that away means the ideas compete on their own merit, and the ideas are typically real stinkers.

[0] https://www.btimesonline.com/articles/167919/20240727/u-s-ad...

woeirua•1h ago
It seems to me that a lot of people are missing the forest for the trees on misinformation and censorship. IMO, a single YouTube channel promoting misinformation, about Covid or anything else, is not a huge problem, even if it has millions of followers.

The problem is that the recommendation algorithms push their viewers into these echo chambers that are divorced from reality where all they see are these videos promoting misinformation. Google's approach to combating that problem was to remove the channels, but the right solution was, and still is today, to fix the algorithms to prevent people from falling into echo chambers.

terminalshort•1h ago
The algorithm doesn't push anyone. It just gives you what it thinks you want. If Google decided what was true and then used the algorithm to remove what isn't true, that would be pushing things. Google isn't and shouldn't be the ministry of truth.
TremendousJudge•1h ago
"what it thinks you want" is doing a lot of work here. why would it "think" that you want to be pushed into an echo chamber divorced from reality instead of something else? why would it give you exactly what you "want" instead of something aligned with some other value?
woeirua•16m ago
Exactly, they shouldn't be the ministry of truth. They should present balanced viewpoints on both sides of controversial subjects. But that's not what they're doing right now. If you watch too many videos on one side of a subject, it will just show you more and more videos reinforcing that viewpoint, because you're likely to watch them!
stronglikedan•1h ago
The problem is that misinformation has now become information, and vice versa, so who was anyone to decide what was misinformation back then, or now, or ever.

I like the term disinformation better, since it can expand to the unfortunately more relevant dissenting information.

theossuary•1h ago
The problem with this is that a lot of people have already fallen into these misinformation echo chambers. No longer recommending them may prevent more from becoming unmoored from reality, but it does nothing for those currently caught up in it. Only removing the channel helps with that.
hsbauauvhabzb•37m ago
Algorithms that reverse the damage by providing opposing opinions could be implemented.
squigz•5m ago
I don't think those people caught up in it are suddenly like "oop that YouTuber is banned, I guess I don't believe that anymore". They'll seek it out elsewhere.
kypro•1h ago
I've argued this before, but the algorithms are not the core problem here.

For whatever reason I guess I'm in that very rare group that genuinely watches everything from far-right racists, to communists, to mainstream media content, to science educational content, to conspiracy content, etc.

My YT feed is all over the place. The algorithms will serve you a very wide range of content if you want that, the issue is that most people don't. They want to hear what they already think.

So while I 100% support changing algorithms to encourage more diversity of views, I also think as a society we need to question why people don't naturally want to listen to more perspectives. Personally I get so bored when people basically echo what I think. I want to listen to people who say stuff I don't expect or haven't thought about before. But I'm in a very small minority.

woeirua•18m ago
I might agree that the algos making recommendations on the sidebar might not matter much, but the algos that control which videos show up when you search for videos on Google, and also in YouTube search absolutely do matter.
CobrastanJorji•17m ago
Yeah, there are two main things here that are being conflated.

First, there's YouTube's decision of whether or not to allow potentially dangerous misinformation to remain on their site, and whether the government can or did require them to remove it.

Second, though, there's YouTube's much stronger editorial power: whether or not to recommend, advertise, or otherwise help people discover that content. Here I think YouTube most fairly deserves criticism or accolades, and it's also where YouTube pretends that the algorithm is magic and neutral and they cannot be blamed for actively pushing videos full of dangerous medical lies.

rustystump•55m ago
The problem with any system like this is that, due to scale, it will be automated, which means a large swath of people will be caught up in it while doing nothing wrong.

This is why permabans are bad. I'd rather have a strike system before a temp ban, to give people some breathing room to navigate the inevitable incorrect automation. Even then, if the copyright system is anything to go by, this is going to hurt more than help.

cbradford•49m ago
So absolutely no one involved will have any repercussions. So they will all do it over again at the next opportunity
johnnyanmac•42m ago
yeah, 2025 in a nutshell. The year of letting all the grifts thrive.
lazyeye•42m ago
What should the punishment be for having opinions the govt disagrees with?
th0ma5•42m ago
Notoriety
lazyeye•38m ago
Yep..and fame, admiration, contempt, loathing, indifference etc
Supermancho•37m ago
Promoting medical misinformation or even health misinformation should be critically judged. Alternative health companies are rubbing their hands together.

The next Drain-o chug challenge "accident" is inevitable, at this rate.

JumpCrisscross•42m ago
> they will all do it over again at the next opportunity

Future tense?

bluedino•41m ago
I'm banned from posting in a couple subreddits for not aligning with the COVID views of the moderators. Lame.
c-hendricks•36m ago
Whenever someone says "i was banned from ..." take what they say with a huge grain of salt.
mvdtnz•21m ago
Reddit (both admins and many subreddit moderators) are extremely trigger happy with bans. Plenty of reasonable people get banned by capricious Reddit mods.
pinkmuffinere•8m ago
Everybody here is strangers online, so I think grains of salt are reasonable all around. That said, I'm not sure that people-who-were-banned deserve above average scrutiny. Anecdotally, a lot of the RubyGems maintainers were banned a week ago. It seems really unfair to distrust people _just_ because a person-in-control banned them.
alex1138•8m ago
Stop excusing it. It's a very real, very serious problem with Reddit. They're very much abusive on this and many other topics
guelo•40m ago
The amount of flagged hidden comments here by the supposed anti censorship side is almost funny.
dang•30m ago
If you (or anyone) run across a flagged comment that isn't tediously repeating ideological battle tropes, pushing discussion flameward, or otherwise breaking the site guidelines, you're welcome to bring it to our attention. So far, the flagged comments I've seen in this thread seem correctly flagged. But we don't see everything.

On this site, we're trying for discussion in which people don't just bash each other with pre-existing talking points (and unprocessed rage). Such comments quickly flood the thread on a divisive topic like this one, so flagging them is essential to having HN operate as intended. To the extent possible at least.

(oh and btw, though it ought to go without saying, this has to do with the type of comment, not the view it's expressing. People should be able to make their substantive points thoughtfully, without getting flagged.)

https://news.ycombinator.com/newsguidelines.html

alex1138•5m ago
Yeah, but in practice this isn't actually the case; people flag all the time just for a dissenting opinion, fitting none of the categories you mentioned.
diego_sandoval•32m ago
At the time, YouTube said: “Anything that would go against World Health Organization recommendations would be a violation of our policy.” [1] which, in my opinion, is a pretty extreme stance to take, especially considering that the WHO contradicted itself many times during the pandemic.

[1] https://www.bbc.com/news/technology-52388586

hyperhopper•15m ago
The United States also said not to buy masks and that they were ineffective during the pandemic.

Placing absolute trust in these organizations and restricting freedom of speech based on that is a very bootlicking, anti-freedom stance

danparsonson•4m ago
> the WHO contradicted itself many times during the pandemic

Did they? I remember them revising their guidance, which seems like something one would expect during an emerging crisis, but I don't remember them directly contradicting themselves.

serf•25m ago
I'd like to think that if I were a YTer who got banned for saying something I believed in, I would at least have the dignity not to take my value back to the group that squelched me.

..but I'm not a YTer.

TeMPOraL•19m ago
It's showbiz. For those making actual money there, sacrificing dignity is the price of entry.
alex1138•14m ago
So the other day, I linked to something on Rumble right here on Hacker News and was told to find a better source

First of all, you can't separate a thing's content from the platform it's hosted on? Really?

Second of all, this is why

I'll just go do this again and if you flag me it's on you, you have no standing to do it (the internet is supposed to be democratic, remember?)

https://rumble.com/v28x6zk-sasha-latypova-msc.-nsa-team-enig...

https://rumble.com/v3zh3fh-staggering-17m-deaths-after-covid...

https://rumble.com/vt62y6-covid-19-a-second-opinion.html

https://rumble.com/v2nxfvq-international-covid-summit-iii-pa...

I could go on. Feel free if you want to see more. :)

(Was it misinformation when Fauci said you shouldn't rush a vaccine or all hell breaks loose years later? Or when he intimated that masks wouldn't work for covid?)

ironman1478•12m ago
There isn't really a good solution here. A precedent for banning speech isn't a good one, but COVID was a real problem and misinformation did hurt people.

The issue is that there is no mechanism for punishing people who spread dangerous misinformation. It's strange that it doesn't exist though, because you're allowed to sue for libel and slander. We know that it's harmful, because people will believe lies about a person, damaging their reputation. It's not clear why it can't be generalized to things that we have a high confidence of truth in and where lying is actively harmful.

alex1138•11m ago
Virtually all of the supposed misinformation turned out not to be that at all. Period, the end. All the 'experts' were wrong, all those that we banned off platforms (the actual experts) were right
reop2whiskey•7m ago
What if the government is the source of misinformation?
pcdoodle•11m ago
So great to see the censorship apparatus in full swing on HN. Lots of great comments into the dust bin.
system7rocks•7m ago
We live in a complicated world, and we do need the freedom to get things right and wrong. Never easy though in times of crisis.

The silver lining is that the conversation continued and will continue. I can see governments needing to get accurate and helpful information out in a crisis - and needing to pressure or ask more of private companies to do that. But I also like that we can reflect back and say: maybe that didn't work the way we wanted, or maybe it was heavy-handed.

In many governments, the government can do no wrong. There are no checks and balances.

The question is - should we still trust YouTube/Google? Is YouTube really some kind of champion of free speech? No. Is our current White House administration a champion of free speech? Hardly.

But hopefully we will still have a system that can have room for critique in the years to come.

Find SF parking cops

https://walzr.com/sf-parking/
469•alazsengul•5h ago•276 comments

Qwen3-VL: Sharper vision, deeper thought, broader action

https://qwen.ai/blog?id=99f0335c4ad9ff6153e517418d48535ab6d8afef&from=research.latest-advancement...
93•natrys•2h ago•18 comments

Libghostty is coming

https://mitchellh.com/writing/libghostty-is-coming
480•kingori•9h ago•138 comments

From Rust to reality: The hidden journey of fetch_max

https://questdb.com/blog/rust-fetch-max-compiler-journey/
55•bluestreak•2h ago•8 comments

Markov chains are the original language models

https://elijahpotter.dev/articles/markov_chains_are_the_original_language_models
221•chilipepperhott•4d ago•94 comments

YouTube says it'll bring back creators banned for Covid and election content

https://www.businessinsider.com/youtube-reinstate-channels-banned-over-covid-content-policies-2025-9
165•delichon•3h ago•219 comments

Kitty – GPU based terminal emulator

https://sw.kovidgoyal.net/kitty/
19•andsoitis•3d ago•8 comments

Getting AI to work in complex codebases

https://github.com/humanlayer/advanced-context-engineering-for-coding-agents/blob/main/ace-fca.md
182•dhorthy•9h ago•200 comments

Go has added Valgrind support

https://go-review.googlesource.com/c/go/+/674077
452•cirelli94•14h ago•118 comments

How to draw construction equipment for kids

https://alyssarosenberg.substack.com/p/how-to-draw-construction-equipment
69•holotrope•4h ago•30 comments

Context Engineering for AI Agents: Lessons

https://manus.im/blog/Context-Engineering-for-AI-Agents-Lessons-from-Building-Manus
22•helloericsf•2h ago•1 comments

Launch HN: Strata (YC X25) – One MCP server for AI to handle thousands of tools

110•wirehack•8h ago•58 comments

Is Fortran better than Python for teaching basics of numerical linear algebra?

https://loiseaujc.github.io/posts/blog-title/fortran_vs_python.html
33•Bostonian•3h ago•34 comments

From MCP to shell: MCP auth flaws enable RCE in Claude Code, Gemini CLI and more

https://verialabs.com/blog/from-mcp-to-shell/
110•stuxf•8h ago•30 comments

Always Invite Anna

https://sharif.io/anna-alexei
551•walterbell•7h ago•52 comments

Podman Desktop celebrates 3M downloads

https://podman-desktop.io/blog/3-million
22•twelvenmonkeys•2h ago•0 comments

Apple A19 SoC die shot

https://chipwise.tech/our-portfolio/apple-a19-dieshot/
59•giuliomagnifico•4h ago•28 comments

Show HN: Ggc – A Git CLI tool written in Go with interactive UI

https://github.com/bmf-san/ggc/releases/tag/v6.0.0
13•bmf-san•3d ago•0 comments

Mesh: I tried Htmx, then ditched it

https://ajmoon.com/posts/mesh-i-tried-htmx-then-ditched-it
159•alex-moon•11h ago•105 comments

Is life a form of computation?

https://thereader.mitpress.mit.edu/is-life-a-form-of-computation/
46•redeemed•2h ago•45 comments

consumed.today

https://consumed.today/
135•burkaman•4h ago•27 comments

Denmark wants to push through Chat Control

https://netzpolitik.org/2025/internes-protokoll-daenemark-will-chatkontrolle-durchdruecken/
181•Improvement•4h ago•85 comments

Triple Buffering in Rendering APIs

https://www.4rknova.com//blog/2025/09/12/triple-buffering
21•ibobev•3d ago•1 comments

Shopify, pulling strings at Ruby Central, forces Bundler and RubyGems takeover

https://joel.drapper.me/p/rubygems-takeover/
393•bradgessler•8h ago•252 comments

Zip Code Map of the United States

https://engaging-data.com/us-zip-code-map/
80•helle253•8h ago•82 comments

Getting More Strategic

https://cate.blog/2025/09/23/getting-more-strategic/
147•gpi•10h ago•20 comments

Android users can now use conversational editing in Google Photos

https://blog.google/products/photos/android-conversational-editing-google-photos/
114•meetpateltech•6h ago•106 comments

Show HN: The Blots Programming Language

https://blots-lang.org/
33•paulrusso•4d ago•9 comments

Zinc (YC W14) Is Hiring a Senior Back End Engineer (NYC)

https://app.dover.com/apply/Zinc/4d32fdb9-c3e6-4f84-a4a2-12c80018fe8f/?rs=76643084
1•FriedPickles•11h ago

Structured Outputs in LLMs

https://parthsareen.com/blog.html#sampling.md
192•SamLeBarbare•12h ago•84 comments