Could X just basically stop moderating altogether? The one (of many?) conflicts here is that they legally have to moderate some things (CSAM), and there would be conflict around moderating adult content. Basically, is the law consistent enough to adopt a hands-off strategy and still maintain liability protection? Or would you be forced to go the other direction?
I work at a company that provides enterprise messaging, and a few years ago we were rather surprised to find that a bunch of people were using our service for CSAM sharing. I have friends in other industries who ran into cases where their products (not even chat) were (ab)used for the same purpose.
An algorithmic feed is one of the things that would make them a publisher without Section 230. So they could, but they wouldn't be anything like X anymore.
> Basically is the law consistent enough to adopt a hands off strategy to maintain liability protection?
No, that's why Section 230 was adopted: to address an existential legal threat to any site of non-trivial scale with user-generated content. Without Section 230, or a radical revision of lots of other law, the only practical option is for providers to do as much review and editing of, and accept the same liability for, UGC as they would for first-party content.
If you wanted to tighten things up without intentionally nuking UGC as a viable thing for internet businesses practically subject to US jurisdiction, you could revise 230 to explicitly not remove distributor liability (it doesn't actually say it does, and the courts' extension of it to do so was arguably erroneous). That would give sites an obligation to respond to actual knowledge of unlawful content, but not presume knowledge from the mere act of presenting the content. But the "repeal 230" group isn't trying to solve problems.
In Anderson v. TikTok, the appeals court decided that since the little girl did not specifically search for the videos she watched, TikTok's algorithm made what amounted to an editorial decision to show them to her, and thus Section 230 did not give TikTok any protection. TikTok ultimately chose not to appeal to the Supreme Court, so this is the current state of the law in Pennsylvania, New Jersey, and Delaware. Other courts may decide differently.
The general idea is that whenever algorithms are deciding what you see, Section 230 is not in play, but the First Amendment might be. The Supreme Court has hinted that this is how they view things, BTW. If that's the case, then Section 230 is essentially a dead law already, and losing it would only affect old-fashioned blogs and forums.
But blogs and forums should be able to exist.
I think I'm implicitly assuming that laws are equally applied, which is increasingly untrue.
We really don’t know this.
"It has been nearly 30 years since Congress passed Section 230 and gave platforms immunity from lawsuits.
Our children deserve a safer internet and tech companies need to be held accountable for what happens on their platforms."