I ask this in good faith, because my observation of the last few years is that the incidents still occur, with all of the harms to individuals also occurring. Then, after N number of incidents, the company pays a fine*, and the company does not necessarily make substantive changes. Superficial changes, but not always meaningful changes that would prevent future harms to individuals.
*Do these fines tend to be used to compensate the affected individuals? I am not educated on that detail, and would appreciate info from someone who is.
Regulations never prevent incidents from happening; they offer recompense after they do. Laws don't either.
In terms of distribution of fines, it is rare.
The only exception I know of, where there is some regulation saying they can't just say "no", legally, is banks. And trust me, if banks don't want you as a customer, they will do everything in their power to maliciously comply to the point your account is useless and permanently frozen.
What is this lunacy about Google regulation about? If Google doesn't want Enderman, you can't force them to have him.
I get that what you really mean is regulating so companies are forced to process and communicate via non-automated, non-AI systems for whatever a, b, c issue or reason, but this doesn't change anything because of how simple and cheap malicious compliance is.
All Google needs to do is say "yeah, okay, we'll also review it with a human" and put some intern on pressing a green button manually.
Unless you can prove discrimination, it's their house, it's their business, they can and should do what they want.
The issue is that Youtube is one of the strongest and hardest to break monopolies on the internet. It's the hardest part of the degoogling process.
That's just not true.
Up till now, no government has (to my knowledge) tried to dictate to a major American platform owner that they may not ban certain users or classes of users, but that doesn't mean that they can't.
It's really not the same thing as the issue of forcing an employer to rehire an illegally-fired employee—where the employee then remains there under a cloud, because they have to continually interact with the people who wanted them gone. In 99.999% of cases, when a platform removes a user, there's zero relationship between that user and the people involved in making that decision.
If Congress made a law tomorrow (laughable in the current environment, I know) that said that any public video platform provider with over X users couldn't ban anyone except for specific reasons, then YouTube would, indeed, have to keep such people on their platform.
Prove me wrong: find me a single law that forces any business to do business with any other, regardless of whether they want to.
I'm 100% sure nobody can force me to do business with people I don't want to, and if you're a professional, I can't force you to do business with me either. Why would you think this would be a good law to have? Only discrimination would be a valid reason.
If Google (business) doesn't want to platform a creator (another business), that's their right.
Of course we can question the morals or ethics, but that's about it.
> If Congress made a law tomorrow (laughable in the current environment, I know) that said that any public video platform provider with over X users couldn't ban anyone except for specific reasons, then YouTube would, indeed, have to keep such people on their platform.
But such laws do not exist in pretty much any part of the world: you can't force a business (Youtube) to do business with another one (a creator).
The reason why this is obviously different is because Youtube is a de facto monopoly on large parts of internet content.
https://en.wikipedia.org/wiki/List_of_anti-discrimination_la...
Sorry, but either you've phrased yourself poorly for what you actually want to say, or you're genuinely unaware of the many anti-discrimination laws in the US, a substantial number of which explicitly prohibit businesses from refusing service to people in protected categories.
If digikey decides they don't want to do business with me, I am not suddenly unable to buy from 30% of the world's manufacturers, unable to sell to 70% of my customers and locked out of my manufacturing line's plc.
If Safeway decides to decline my business, I am not locked out of eating bread from anyone who buys their flour from them.
If Coca-Cola doesn't want to renew our contract because I mentioned to my customers that we also stock Pepsi, I can still buy Coca-Cola from a wholesaler and resell it, and regardless, I don't lose access to my accountant and mailbox when they terminate that relationship.
That I agree 100%.
But Youtube really did nothing to become or preserve its monopoly. It's a self-reinforcing loop: most creators -> most users -> most money -> most creators -> most users.
This is demonstrably false.
Where I live, stores aren't allowed to refuse a sale under most circumstances (barring some specifically-listed exceptions like selling alcohol to minors). Same for schools, we don't have a concept of "expulsion" unless it's court-mandated. There's no reason a similar regulation couldn't be applied to digital platforms.
Whether such a regulation should exist is a different matter entirely. Fighting fraud and scams is difficult enough already, making them harder to fight means we get more of them. Either that, or Google starts demanding rigorous ID verification from everybody who wants a Youtube channel.
That's not even true for B2C, as most legal codes at best have laws about public utilities (you can't be denied electricity for no reason), sometimes banks, and sometimes regulated professions (lawyers, insurers, etc).
It's particularly untrue for B2B, which is what Youtube–creator transactions are.
That could be as simple as a database lookup against flagged accounts or a simple heuristic score.
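A minimal sketch of what "a database lookup against flagged accounts or a simple heuristic score" might look like. All account names, fields, weights, and thresholds here are hypothetical, invented purely for illustration:

```python
# Hypothetical sketch: decide whether an account needs review using a
# direct lookup against known-flagged accounts plus a rule-of-thumb score.
# No ML or LLM involved; every name and weight here is made up.

FLAGGED_ACCOUNTS = {"scammer123", "spam_bot_42"}  # stand-in for a DB table


def risk_score(account: str, strikes: int, account_age_days: int) -> float:
    """Heuristic: prior strikes raise risk, account age lowers it."""
    if account in FLAGGED_ACCOUNTS:  # direct lookup: known-bad account
        return 1.0
    score = 0.2 * strikes  # each prior strike adds 0.2
    score -= min(account_age_days / 3650, 0.3)  # up to -0.3 for older accounts
    return max(0.0, min(score, 1.0))  # clamp to [0, 1]


def should_review(account: str, strikes: int, age_days: int,
                  threshold: float = 0.5) -> bool:
    return risk_score(account, strikes, age_days) >= threshold


print(should_review("scammer123", 0, 1000))  # True: in the flagged table
print(should_review("new_user", 1, 10))      # False: one strike, low score
```

The point is that this entire decision fits in a dozen lines of deterministic, auditable logic, with no "AI" anywhere.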
We're over-AI-ing everything.
I'm always confused, because people have seen special effects for decades, and now the very same explosions, filters, and CGI are suddenly "AI".
It's automated. It's based on information. Why is it not intelligent? Why is it demonstrably less intelligent than an LLM, which may make no attempt to retrieve information from other sources but merely has what it was created with?
Heuristics are just rules of thumb without necessarily having a rigid law or clean classification.
You can derive heuristics from mathematically modeling something or even applying machine learning, but they need not necessarily involve either set of techniques.
I don't love the way language around this is evolving, as it is mostly a marketing tool to make these tools seem like much more than they are. Primarily this is driven by the current generative "AI" bubble.
Don't get me wrong, I'm not defending Youtube's behavior here. It's bad and shouldn't just be shrugged off. I just don't think that shouting "monopoly!" actually fixes anything. If you want a video hosting and streaming site that has less market dominance and better moderation policies, that already exists. Everyone is free to use them.
That's very much the point: collaring and tranquilizing the 900 pound gorilla in the room so that the reasons people might have to interact with the 30 other monkeys become relevant.
You need to address the underlying causes of this kind of behavior.
Some of that would be alleviated if we separated hosting/serving videos from the frontend and indexing, perhaps with a radio-like agreement on what the host gets paid for serving the video to a customer of the frontend. Frontend/index makes money off ads, and then pays some of that back to the host. Creators could in theory be paid by the video hosts, since views make the host money.
Heavy-handed moderation could then be a disadvantage, because such a site would lack content that other sites have (though some of that content would be distasteful enough that most frontends would ban it).
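The radio-like revenue split proposed above can be illustrated with a toy settlement calculation. All rates here are invented for illustration; nothing resembles actual YouTube economics:

```python
# Toy sketch of the proposed split: a frontend earns ad revenue per view,
# pays the video host a radio-style carriage fee per view served, and the
# host pays the creator a share of that fee. All rates are hypothetical.

AD_REVENUE_PER_VIEW = 0.004   # what the frontend earns per ad-supported view
HOST_FEE_PER_VIEW = 0.0015    # what the frontend pays the host per view
CREATOR_SHARE = 0.55          # fraction of the host fee passed to the creator


def settle(views: int) -> dict:
    """Split one billing period's revenue between frontend, host, creator."""
    frontend_gross = views * AD_REVENUE_PER_VIEW
    host_fee = views * HOST_FEE_PER_VIEW
    creator_payout = host_fee * CREATOR_SHARE
    return {
        "frontend_net": round(frontend_gross - host_fee, 2),
        "host_net": round(host_fee - creator_payout, 2),
        "creator_payout": round(creator_payout, 2),
    }


print(settle(1_000_000))
# {'frontend_net': 2500.0, 'host_net': 675.0, 'creator_payout': 825.0}
```

The design point is that each party's income depends on views actually served, so a host is paid for hosting regardless of which frontend sends the traffic.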
Yes, creators shouldn't be dependent on Alphabet; they should back up their content and diversify platforms. But because we decided to allow monopolization of the monetization of the web, and to vigorously encourage the surveillance-based adtech of Google and Facebook, those companies control the full stack and effectively hold audiences hostage. You have to play on their platforms in order to engage with the audience you build, and the vast majority of content consumers are ignorant of the role platforms play. If you leave the platform, you lose the access; if you maintain multiple channels, you get shadowbans and other soft penalties to discourage disloyalty to Google.
We should have a massive diversity of federated, decentralized platforms, with compatible protocols and tools. People should have to think about CDNs and platforms as little as they think about what particular ISP is carrying their traffic between a server and their home.
There should be a digital bill of rights that curtails platforms' power to control access and reach, forces interoperability, eliminates arbitrary algorithmic enforcement, and guarantees due process with mandatory backout periods giving people a reasonable opportunity to recover digital assets, communicate with their audience, and migrate to a new platform.
The status quo is entirely untenable; these companies should not have the power to so casually and arbitrarily destroy people's livelihoods.
This is only the beginning of fucking around and finding out how putting "AI" into everything will create all kinds of problems for humanity.
Relevant Idiocracy clip: