https://en.wikipedia.org/wiki/Protests_against_SOPA_and_PIPA...
https://upload.wikimedia.org/wikipedia/foundation/d/d9/Wikim...
The categorisation regulations are a statutory instrument rather than primary legislation, so they _are_ open to judicial review. But the Wikimedia Foundation haven't presented an argument as to why the regulations are unlawful, just an argument for why they disagree with them.
It should be noted that even if they succeed (which seems a long shot), this wouldn't affect the main thrust of the Online Safety Act which _is_ primary legislation and includes the bit making the rounds about adult content being locked behind age verification.
A lot of it. Often in high quality and with a permissive license.
I would link to relevant meta pages but I want to be able to travel through LHR.
Scroll to "Who falls under Category 1"
https://medium.com/wikimedia-policy/wikipedias-nonprofit-hos...
It seems to be a fairly standard judicial review: if OFCOM(?) class them as "category 1", they are under a very serious burden, so they want the categorization decision reviewed in court.
Very interested to see how this goes.
It is actually (as noted in many previous discussions about the Online Safety Act) pushing people towards big tech platforms, because they can no longer afford the compliance cost and risk of running their own.
So big tech platforms will cheerfully embrace it. As expected, major players love regulations.
This isn't small advertisers' fault; the law signed their death warrant. They made local grocery stores more expensive and worse quality but kept Walmart around untouched. No one could have predicted what would happen.
Likewise, I suspect that most geoblocks are out of misplaced fear, not actual analysis.
Moderation is part and parcel of running forums, and all platforms and software provide tools for it; this is nothing new. If someone is not prepared to read submissions or to react quickly when one is flagged, then perhaps running a forum is too much of a commitment for them, but I would not blame the law.
In fact I believe that forum operators in the UK already got in legal trouble in the past, long before the Online Safety Act, because they ignored flagging reports.
Does the Act specify "quickly"? Does several hours count as "quickly"?
The Act (section 10 about illegal content) says that "In determining what is proportionate for the purposes of this section, the following factors, in particular, are relevant—
(a) all the findings of the most recent illegal content risk assessment
(b) the size and capacity of the provider of a service."
"24 hour coverage" is the maximum that can be achieved so it's not going to be proportionate in many, if not most, cases. People have to ask themselves if it is proportionate for a one-man gardening forum to react within 5 minutes at 3am, and the answer is not going to be "yes".
Obviously you can also automatically hide a flagged submission until it is reviewed, or have keyword-based checks, etc. I believe these are common functionalities and they will likely develop further (and yes, a consequence might be to push more people towards big platforms).
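A minimal sketch of that auto-hide flow, with a hypothetical Post model and placeholder keyword list (not taken from any real forum software):

    from dataclasses import dataclass, field

    # Hypothetical data model, for illustration only.
    @dataclass
    class Post:
        body: str
        hidden: bool = False
        flags: list = field(default_factory=list)

    BANNED_KEYWORDS = {"example-banned-term"}  # placeholder keyword list

    def submit(post: Post) -> Post:
        """Auto-hide a new submission that trips the keyword check."""
        if any(word in post.body.lower() for word in BANNED_KEYWORDS):
            post.hidden = True
        return post

    def flag(post: Post, reason: str) -> None:
        """Hide a flagged post immediately; a moderator reviews it later."""
        post.flags.append(reason)
        post.hidden = True

    def review(post: Post, keep: bool) -> None:
        """Moderator decision: restore the post or keep it hidden."""
        post.hidden = not keep
        post.flags.clear()

The point is that none of this requires 24-hour staffing: a flagged or keyword-matched post is simply invisible until a human gets to it.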
People need to make a calm analysis, not give in to hysteria or politically induced obtuseness, whatever one might think of this Act. If they are small and not in the UK, they can probably ignore it completely in any case.
Given the UK already has a “watershed” time where terrestrial TV can broadcast mature content only between set hours (from 9pm), I cannot see why the same expectation shouldn’t apply here: that moderation won’t be happening outside of reasonable hours.
Typically, laws in the UK (and the EU too) use more generalised language to give the law a little more flexibility to apply correctly in more nuanced circumstances, such as what is practical for a small forum to achieve when its specialty isn’t anything to do with adult content.
You’ll definitely find examples where such laws are abused from time to time. But they’re uncommon enough that they make national news and create an uproar, so the case goes nowhere due to the political embarrassment the department draws to itself.
Though to be clear, I’m not defending this particular law. It’s stupid and shouldn’t exist.
Many cited the uncertainty about what is actually required, the potential high cost of compliance, the danger of failing to correctly follow the rules they're not certain about, and the lack of governmental clarity as significant aspects of their decision to close.
The fear may be misplaced, but the UK government has failed to convince people of that.
I mean, it’s not like this particular piece of legislation isn’t stupid to begin with. So I cannot blame people for assuming the worst.
I suspect any smaller site that claims the Online Safety Act was a reason it closed actually needed to close due to other complications. For example, an art site that features occasional (or more) artistic nudes: stuff that normal people wouldn’t consider mature content, but which the site maintainers wouldn’t want to take the risk on.
Either way, whether I’m right or wrong here, I still think the Online Safety Act is a grotesque piece of legislation.
HN has already discussed things like the cycling forum that shut down. lobste.rs considered blocking UK IPs. I was considering setting up a forum to replace/complement FB groups I help admin (home education related). This is enough to put me off, as I do not want the hassle and risk of dealing with it.
I think what you are missing is that this does not just cover things like porn videos and photos. That is what has been emphasised by the media, but it covers a lot of harmful content: https://www.legislation.gov.uk/ukpga/2023/50/section/62
It took a fair amount of legal analysis to establish that blog comments are OK (and it's not clear whether off-topic ones are). Links to that and other things here: https://www.theregister.com/2025/02/06/uk_online_safety_act_...
> I think the impact is a lot worse than that. There are still compliance costs, especially for volunteer-run sites. Ofcom says these are negligible, because they say it's unlikely to be more than "a few thousand pounds". Then there are the risks if something goes wrong and you have not incorporated.
What are these "compliance costs"? There are no forms that need to be completed. Sites don't have to register themselves. For smaller sites, the cost is just what I described: the time and effort of volunteer moderators who already moderate the site. If they're already removing adult content, then there's no extra work for them.
> HN has already discussed things like the cycling forum that shut down. lobste.rs considered blocking UK IPs. I was considering setting up a forum to replace/complement FB groups I help admin (home education related). This is enough to put me off, as I do not want the hassle and risk of dealing with it.
None of this proves your point though. It just proves that some sites are worried about potential overreach. It's an understandable concern, but it's a different problem to the one the GP was describing, in that it doesn't actually make it any harder for smaller forums in any tangible way. Unless you call being "spooked" a tangible cost (I do not).
> I think what you are missing is that this does not just cover things like porn videos and photos.
I didn't miss that. But you're right to raise that nonetheless.
There's definitely a grey area that is going to concern a lot of people, but no site is going to be punished for mild or occasional "breaches". What the government are trying to police is the stuff that's clearly inappropriate for under-18s. The UK (and the EU in general) tends to pass laws that can be a little vague in definition and trust the police and courts to uphold "the spirit of the law". A little like how US laws can be defined by past cases and their judgments. This ambiguity will scare American sites because it's not how American law works. But the UK system does _generally_ work well. We do have instances where such laws are abused, but they're infrequent enough to make national news and subsequently get dropped because of the embarrassment they bring to the department involved.
That all said, I'm really not trying to defend this particular law. The Online Safety Act is definitely a _bad_ law and I don't personally know of anyone in the UK (outside of politicians) who actually agrees with it.
One of us has completely misunderstood the legislation.
By my reading - there's a ton of red tape and paperwork. Heck, there's a ton of work even getting to the point of understanding what work you need to do. And dismissing the fear of life-changing financial liability as "being spooked" is not helpful.
I've got an open-source 3D sharing site almost ready to launch and I'm considering geo-blocking the UK. And I live in the UK.
It might help if you referenced the section that defines those requirements.
I don’t recall seeing anything that required such red tape unless there were special circumstances after the fact (for example, reporting child porn that was uploaded to your site, or responding to a police or court order).
But these kinds of rules exist for freedom of information et al too.
Maybe I’ve missed something though?
> Heck, there's a ton of work even getting to the point of understanding what work you need to do.
That is a fair point.
Unfortunately it’s also not novel to this legislation. Running any site that allows public contributions opens oneself up to lots of different laws from lots of different countries. For some countries in the EU, Nazi content is illegal. Different countries have different rules around copyright. Then there are laws around data protection, consent, and so on and so forth.
This law certainly doesn’t make things any easier, but there has been a requirement to understand this stuff for decades already. So it’s a bit of a stretch to say this one new law suddenly makes the burden of running a site insurmountable.
However I do agree with your more general point that it’s getting very hard to navigate all of these local laws at scale.
> And dismissing the fear of life-changing financial liability as "being spooked" is not helpful.
It’s an unfounded fear though, so my language is fair. You’d use the same language about any other unfounded fear.
This is the crux of the point. People are scared, and I get why. But it’s completely unfounded. If people still want to discriminate against UK IPs then that’s their choice as they have to weigh up the risks as they perceive them. But it doesn’t mean it’s any likelier to happen than, for example, being in a plane crash (to cite another fear people overcome daily).
———
That all said, maybe everyone blocking UK IPs could be a good thing. If everyone shows they don’t consider it safe to operate in the UK then our government might consider revoking this stupid law.
A lot of misplaced fear and over-reactions. For instance, lobste.rs could basically safely ignore the whole thing being a small, low risk forum based in the US.
> It took a fair amount of legal analysis to establish blog comments are OK (and its not clear whether off topic ones are)
It looks like it only took someone to actually read the Online Safety Act, as Ofcom's reply kindly points to the section that quite explicitly answers the question.
I don't think that the Online Safety Act is a good development but many of the reactions are over the top or FUD, frankly...
Have the court filings become available?
Of course, the random PR in the OP isn't going to go through their barrister's arguments.
While I agree that the main thrust of the legislation won't be affected either way, the regulatory framework really matters for this sort of thing.
Plus, win or lose, this will shine a light on some of the stupidity of the legislation. Lots of random Wikipedia articles would offend the puritans.
> The Wikimedia Foundation shares the UK government’s commitment to promoting online environments where everyone can safely participate. The organization is not bringing a general challenge to the OSA as a whole, nor to the existence of the Category 1 duties themselves. Rather, the legal challenge focuses solely on the new Categorisation Regulations that risk imposing Category 1 duties (the OSA’s most stringent obligations) on Wikipedia.
Seems to require an algorithmic feed to be Category 1 - https://www.legislation.gov.uk/ukdsi/2025/9780348267174
Of course if you get your news from HN then the motivation is actually something to do with limiting discussion of immigration or being dystopian just because.
But yes, if they could just name Instagram and TikTok they probably would.
And moreover: WF's special pleading is[1], paraphrased, "because we already strongly moderate in exactly the ways this government wants, so there's no need to regulate *us* in particular". That's capitulation; or, they were never really adverse in the first place.
Wikimedia's counsel is of course pleading Wikimedia's own interests[2]. Their interests are not the same as the public's interest. Don't confuse ourselves: if you are not a centimillionaire entity with sacks full of lawyers, you are not Wikimedia Foundation's peer group.
[0] ("It’s the only top-ten website operated by a non-profit and one of the highest-quality datasets used in training Large Language Models (LLMs)"—to the extent anyone parses that as virtuous)
[1] ("These volunteers set and enforce policies to ensure that information on the platform is fact-based, neutral, and attributed to reliable sources.")
[2] ("The organization is not bringing a general challenge to the OSA as a whole, nor to the existence of the Category 1 duties themselves. Rather, the legal challenge focuses solely on the new Categorisation Regulations that risk imposing Category 1 duties (the OSA’s most stringent obligations) on Wikipedia.")
The law has passed; Wikipedia has to comply with it but doesn’t wish to because of privacy concerns.
What should Wikimedia now do? Give up? Ignore the laws of the UK? Shut down in the UK? What exactly are the options for Wikimedia?
https://news.ycombinator.com/item?id=3477966 ("Wikipedia blackout page (wikipedia.org)" (2012))
Wikimedia weren't always a giant ambulating pile of cash; they used to be activists. Long ago.
Your point is moot because this wasn’t a WMF initiative, it was an enwiki community initiative which WMF agreed to accommodate.
The history is detailed… on Wikipedia… https://en.wikipedia.org/wiki/Protests_against_SOPA_and_PIPA...
And after the grace period... yeah, I think blocking UK IPs is the "correct" thing to do. If the government doesn't make them an exception then they'll have to do that, correct or not, anyway.
The UK is a representative democracy, meaning that voters get a voice every X years to vote for a representative that they assume will act in their favour and on their behalf.
What this representative does in their time in power is very much left to the representative and not the voters.
On the other hand, if this were to be a direct democracy then the voters would have been asked before this law was voted on. For example, a referendum might well have been held.
Perhaps a more nuanced approach would be to block all IPs of government organisations - difficult but far more appropriate.
If UK wants to be more like China: let them.
That might actually be one of the few things that would help.
Yes. This is what every single large company which is subject to this dystopian law should do. They should do everything they can to block any traffic from the UK, until the law is repealed.
By imposing costs and risk on self-hosting, and reducing the number of suppliers (because many small and medium companies and organisations will block the UK), it reduces competition.
There was a study by Amazon [1] that showed that every 100ms of extra page load time cost 1% of revenue. How much revenue do you think adding an ID verification step that takes 10 minutes to complete would cost???
You think PornHub loves this law???
[1] https://www.conductor.com/academy/page-speed-resources/faq/a...
A lot of PornHub's content comes from independents, not big studios.
There must be smaller sites in the same business that will block UK users rather than comply.
So, they very likely do.
What is more important is that the tech giants, and social media in particular, do love this law. As I pointed out in another comment, and as has been reported many times on HN before, they have already gained users as people switch from independent forums to social media, and in the future it will keep competition out.
So it would be interesting to understand whether shutting down in the UK would have an impact, now that we have all had to learn how to circumvent georestrictions this past week.
https://avpassociation.com/4271-2/
The USA has twenty-fucking-five different laws we might be bound by, and AFAIK the silliest one (Texas) has been upheld by the USSC.
Look I get it, Hacker News has a no-politics-unless-it's-the-EU-or-UK rule and HNers generally seem to hate Brits.
But I think what we're witnessing here is little more than performative self-soothing. The entire foundations of US freedom are being ripped apart in an incredibly short time so hey, let's snark at the perfidious Brits.
It might be a bit disruptive in the beginning, but in the long run I think we all benefit from that. It increases the chance of politicians realizing the impact of their overreaching decisions, through public pressure from former users of those services, and it increases the likelihood of local competitors to those services opening.
The fabric of the USA is being ripped apart by a kleptocratic authoritarian fascist-at-least-wannabe government that makes the most extreme country in the EU (Hungary) look exactly like a trial run, and you guys are worried about the Brits implementing a relatively measured law that affects fewer people than all those US porn laws combined.
HN's weird little "no politics" bubble encourages you all to think that it is outrageous that US companies should be held accountable to the laws of the countries in which you trade [0], while your president is, for example, imposing actually illegal tariffs on Brazil, abusing a power you won't take away from him, because they insist on prosecuting Bolsonaro under their own laws for something he did within their country.
Yes: we made a law you don't like. It's a stupid law. It's still a fairly measured, stupid law compared to the ones your states are passing and your own supreme court thinks are A-OK, or the silly one in France, or whatever.
Collectively you should maybe stop fretting about the UK while your country is reverting to quasi-monarchy.
[0] and yes, you are trading here if you serve porn to UK customers. This is the same standard as the US Supreme Court-approved Texas anti-porn law applies.
I'll fret about both, thank you very much.
That HN collapses into hysteria over the slightest thing happening on the other side of the Atlantic while studiously avoiding any discussion of homegrown political insanity is basically laughable. You know nothing about us; as always we are essentially forced to know all about you in detail so we can fend it all off.
Rolling over for it isn't going to do the EU or other allies any favors. The administration won't reward loyalty with good deals or whatever.
However, I don't see what the legal basis of Wikimedia's challenge is. The OSA is primary legislation, so can't be challenged except under the HRA, which I don't really see working. The regulations are secondary regulation and are more open to challenge, but it's not clear what the basis of the challenge is. Are they saying the regulations are outside the scope of the statutory authority (doubtful)? You can't really challenge law or regulation in the UK on the basis of "I don't like it".
To continue the thought experiment though: another implementation would be to list up to N tags that best describe the content being served. You could base these on various agreed tagging systems such as UN ISIC tagging (6010 Broadcasting Pop Music) or UDC, the successor to the Dewey Decimal System (657 Accountancy, 797 Water Sports etc.) The more popular sites could just grandfather in their own tag zoologies.
A cartoon song about wind surfing:
X-Content-Tags: ISIC:6010 UDC:797 YouTube:KidsTV
It’s then up to the recipient’s device to warn them of incoming illegal-in-your-state content. That's no different to the current legislation.
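For what it's worth, a sketch of that recipient-side check; the header, the tag values, and the block list are all hypothetical, as in the example above:

    # Hypothetical client-side check for the X-Content-Tags idea; neither the
    # header nor these particular block-list entries are real standards.
    BLOCKED_TAGS = {"ISIC:5813", "UDC:176"}  # made-up entries for a restricted profile

    def parse_content_tags(header_value: str) -> set:
        """Split a space-separated X-Content-Tags value into individual tags."""
        return {tag for tag in header_value.split() if ":" in tag}

    def should_warn(header_value: str, blocked=BLOCKED_TAGS) -> bool:
        """Warn if any declared tag is on the device's local block list."""
        return bool(parse_content_tags(header_value) & blocked)

    print(should_warn("ISIC:6010 UDC:797 YouTube:KidsTV"))  # False: nothing restricted

The enforcement point stays on the user's device, configured per jurisdiction or per profile, rather than at the publisher.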
Guys, this right here is Wikipedia's standing. It is that under the current law, Wikipedia would fall under Cat 1 rules, even if by the law's own admission it should not.
https://www.gbnews.com/politics/labour-ban-vpn-online-safety...
- Labour have made no plans to ban VPNs.
- One MP wanted to add a clause for a government review into the impact of VPNs on the bill after 6 months, with no direction on what that would mean.
- I have no idea if this clause actually got added, but it'd make sense. If you're going to introduce a stupid law you should at least plan to review if the stupid law is having any impact.
- GB news is bottom of the barrel propaganda.
That's government-speak for deciding to do something about the VPN problem, because there is no way a commission will not find a good reason to ban VPNs when you reach that point: you could argue they help avoid UK restrictions.
'Kyle told The Telegraph last week in a warning: "If platforms or sites signpost towards workarounds like VPNs, then that itself is a crime and will be tackled by these codes."'
https://www.tomsguide.com/computing/vpns/what-does-the-labou... :
"In 2022 when the Online Safety Act was being debated in Parliament, Labour explicitly brought up the subject of VPNs with MP Sarah Champion worried that children could use VPNs to access harmful content and bypass the measures of the Safety Act. "
https://www.independent.co.uk/news/uk/politics/vpns-online-s...
Sure. Nothing was said directly right now, but to just take Labour's word for it that they won't go further with these restrictions is really naive.
This told me all I needed to know about her level of understanding of complex topics. It only went downhill from there.
https://www.theregister.com/1999/01/15/france_to_end_severe_...
> Until 1996 anyone wishing to encrypt any document had to first receive an official sanction or risk fines from F6000 to F500,000 ($1000 to $89,300) and a 2-6 month jail term. Right now, apart from a handful of exemptions, any unauthorised use of encryption software is illegal.
These two former empires seem/seemed to have an over-inflated sense of importance and ability to control the world.
(and before that PGP!)
Zimmermann had a novel defense (selling PGP source code as a book, which should be protected by 1A), but it was never actually tested in court.
A Raspberry Pi outperforms a Cray-1 supercomputer, for instance.
It wasn’t relevant to any Apple ads.
"The Home Secretary's husband has said sorry for embarrassing his wife after two adult films were viewed at their home, then claimed for on expenses."
The follow up article has some fun nuggets too http://news.bbc.co.uk/2/hi/8145935.stm
If anything, greater intelligence would only accelerate the damage and persuasiveness behind its public consent.
I find this a very unprincipled stance.
https://www.edinburghnews.scotsman.com/news/uk/online-safety...
And if it marginally is, how come they cannot just turn off their "content recommender system"? Perhaps an example is the auto-generated "Related articles" that appear in the footer on mobile only?
[1] https://www.legislation.gov.uk/uksi/2025/226/regulation/3/ma...
Children are using mobiles and tablets almost exclusively, and both major platform providers supply tools for parental administration.
Content filtering is already facilitated by existing parental control. Mobile browsers could be made to issue a header if the user is under a certain age. Mobile apps could have access to a flag.
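As a sketch of how a site might honour such a signal, assuming a hypothetical header (here called Sec-Minor) that browser-level parental controls would set; nothing like this is standardised today:

    # Entirely hypothetical: a "Sec-Minor: ?1" header set by parental controls.
    def is_minor(headers: dict) -> bool:
        """Interpret the hypothetical minor-flag header (structured-field boolean style)."""
        return headers.get("Sec-Minor", "?0").strip() == "?1"

    def serve(path: str, headers: dict) -> str:
        """Gate mature sections on the flag; serve everything else normally."""
        if path.startswith("/mature/") and is_minor(headers):
            return "403: parental controls indicate a minor on this device"
        return "200: serving " + path

    print(serve("/mature/board", {"Sec-Minor": "?1"}))  # blocked
    print(serve("/gardening", {"Sec-Minor": "?1"}))     # allowed

Sites would only have to check a flag, and the actual age decision would stay with the parents and the device.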
Parents should be responsible for parenting their child - not big tech. Why does it need to be any more complicated than that?
And THAT is the problem that they should be tackling.
Either way, if parents had more time to raise their children rather than slave away at jobs to stay above water, I have to think there'd be some improvement in child development.
I seriously doubt that the majority of parents want the state to raise their children for them.
By arguing about irresponsible or lazy parents you are latching on to the first, most convenient thing that seems to make sense to you. But I think that is a mistake because not only does it perpetuate some kind of distorted sense of reality where parents don't care about their children and want to hand off all responsibility for them, but it distracts you from the real causal issues.
The fact is that humans have for millions of years acted in various levels of coordination to raise and look after children as a group. Modern society has made this all sorts of dysfunctional, but it still exists.
This is normal and what public education is for. Teaching online safety and sex ed should be considered no different than teaching history.
Long working hours and both parents working full time means they do not have the time or the energy. Then you have the state offering help, and encouraging parents to drop them off at school first thing for breakfast club, and then keep them there for after school activities.
That would be the ideal. Unfortunately, many parents do not have the skills and/or motivation to manage their children's devices.
My parents for a long time used their neighbor's wifi, despite having their own, because they didn't remember the password.
That said, having the carrier mark certain devices as "child" or "adult", or even stamp them with a DoB so the flag changes when the user becomes an adult, might not be a bad thing. While intrusive, it would still be better than the forced ID path that some states and countries are striving towards.
That said, it could definitely be done relatively easily at the carrier level, and with a really simple addition to browsers, even if only on mobile devices.
Lots of things that feel relatively common online feel like they would be very alien and weird situations if they happened offline.
So I'm actually against this because I hate the indirect hypocrisy. I want to teach a lesson to Republicans about using overly generalized principles as a political stance.
It's a distraction.
The real objective is to further raise the barrier to entry for SMEs to compete (try starting your own forum or any kind of challenger to Facebook et al). The government, on the other hand, gets a tidy surveillance tool as a sweetener.
So whenever the time comes to turn a screw on dissent, the law is ready to be used.
Welcome to British corporate fascism.
They are an extreme minority of every population (mostly people who aren't interested in politics or civil liberties, who enjoy and care about children). But sensible people are also an extreme minority of the population; we normal people usually aren't so sensible, and instead we listen to sensible people and follow their advice.
So the people who want everybody on the internet to identify themselves pit hysterics against measured voices in the media, in order to create a fake controversy that only has to last until the law gets passed. Afterwards, the politicians and commentariat who were directly paid or found personal brand benefit in associating with the hysterics start leaving quotes like: "This isn't what we thought we passed" and "It might be useful to have a review to see if this has gone too far." Then we find out that half the politicians connected with the legislation have connections to an age verification firm which is also a data broker, and has half a billion in contracts with the MoD.
a) Content controls don't work, what are the government thinking? b) This is parents' problem, they should use content controls.
Individual action doesn't work because it only takes one kid in the class who doesn't have parental controls and then everyone loses. There are also obvious workarounds, such as VPNs and a teenager walking into a pawn shop with £50 for a second-hand smartphone without parental controls.
It also makes no sense that parents can't be bothered to turn on parental controls yet can be bothered to run a national grassroots campaign for this stuff (see e.g. http://smartphonefreechildhood.org)
See also: "I Had a Helicopter Mom. I Found Pornhub Anyway": https://www.thefp.com/p/why-are-our-fourth-graders-on-pornhu... and "8-year-old watches violent porn on friend’s iPad": https://www.thesun.co.uk/fabulous/32857335/son-watched-viole...
Although your idea of an OS-level age flag is also being pushed by the Anxious Generation's Jonathan Haidt, so definitely has merit/traction as an alternative.
I don't think my parents had realized that scene was in the book. But I don't think it matters that much. Kids are going to encounter sex. In a pre-industrial society, it's pretty likely that children would catch adults having sex at some point during their childhood -- even assuming they didn't see their own parents doing it at a very young age. Privacy used to be more difficult. Houses often had one bedroom.
I don't mean to say that content controls are useless. I think it was probably for the better that I wasn't watching tons of porn in middle school. But I don't think that content controls need to be perfect; we don't need to ensure that the kids are never exposed to any pornographic content. As long as it isn't so accessible that the kid is viewing it regularly, it probably isn't the end of the world. Like in the one story, PornHub didn't even have a checkbox to ask if you were eighteen. Just don't do that. I didn't end up downloading porn intentionally myself until about five years after reading that book.
Because the government is lying and this is about spying on the populace, not about parental control.
https://successfulsoftware.net/2025/07/29/the-online-safety-...
If Operating Systems had a way for parents to adequately monitor/administer the machines of their children, this would not be such a huge, massive hole into which to pour (yet more) human rights abuses.
Parents have the right to keep an eye on their children. This is not repressive, it is not authoritarian; it is a right and a responsibility.
The fact that I can't - easily, and with little fuss - quickly see what my kids are viewing on their screens, is the issue.
Sure, children have the right to privacy - but it is their parents who should provide it to them. Not just the state, but the parents. And certainly, the state should not be eliminating the rest of society's privacy in the rush to prevent parents from having oversight of - and responsibility for - the online activities of their children.
The fact is, Operating System Vendors would rather turn their platforms into ad-vending machines, than actually improve the means by which the computers are operated by their users.
It would be a simple thing to establish parent/child relationship security between not just two computers, but two human beings who love and trust each other.
Kids will always be inquisitive. They will always try to exceed the limits imposed upon them by their parents. But this should not be a reason for more draconian control over consenting adults, or indeed individual adults. It should be a motivating factor to build better computing platforms, which can be reliably configured to prevent porn from having the detrimental impact many controllers of society have decided is occurring.
Another undeniable fact is that parents - and parenting - get a bad rap. However, if a parent and child love and trust each other, the ability to quickly observe the kid's computing environment in productive ways should be provided, technologically.
When really, we should be building tools which strengthen parent/child relationships, we are instead eradicating the need for parents.
Unpopular opinion, I know: but Thats The Point.
Unless you've got some specific better examples?
IlikeKitties•10h ago
Every company that implemented any compliance is a traitor to the free internet and should be treated as such.
lukan•9h ago
What would be the punishment for that?
IlikeKitties•9h ago
On a legal level? None. On a personal level? Don't give them money or your business. Avoid them completely or ensure you use ad blockers on their sites and throw away accounts if necessary. Do not contribute to their content.
In short: you take whatever they give you, and you give nothing in return.
blitzar•9h ago
https://en.wikipedia.org/wiki/Foot_voting
Swinx43•9h ago
A UK internet blockade might just get this going.
VikingMiner•6h ago
It is known colloquially as "Pulling up the ladder behind you".
https://en.wikipedia.org/wiki/I%27m_alright,_Jack
blitzar•9h ago
While the law would not be toppled tomorrow, the companies of the internet need to stop being so desperate for small scraps of money and eyeballs.
The internet might be free if companies, instead of trying to skirt laws and regulations, just operated where they are welcome. Good for the internet but bad for the VCs, so it won't happen.
ajsnigrutin•8h ago
Cutting off UK for a few weeks won't cause that much damage but might help them in the long run.
xnorswap•7h ago
But of course Meta carved out their own exception in the law, so this law benefits Meta at the cost of alternatives.
captainbland•6h ago
It's like the government thought long and hard about how to make the restrictions the most inconvenient and with the largest number of gaps in the approach.
hermitcrab•5h ago
Companies can be fined £18 million or 10% of revenue, whichever is greater. If you feel like being the first test case, be my guest.
hermitcrab•2h ago