That said, I think it’d be smarter of the GOP to let California do just that. It’s a chance to move that tech money out of California and into another, more regulation-friendly state.
This is the same state that banned plastic bags to "save the environment" - so did they mandate paper bags instead? Renewable, compostable, organic paper? No! They allowed plastic bags to be replaced with... super thick plastic bags! And I assure you, stores go through at least 80% as many of those as before, because people usually don't bring their own bags, but now each one is 4-5x the plastic.
And they added a ton of regulation on straws based on a literal child's insane napkin math that went viral, claiming that America uses 7 or 8 straws per man, woman, and child, per day. Now we get to use multiple paper straws that dissolve in your cup almost immediately.
California is awash in well-intentioned but utterly useless and counterproductive regulation. Just another downside of one-party rule: neither party does a good job with zero counterbalance to its power and ideas.
https://skeptoid.com/episodes/4460
"When the UK Environment Agency did a life cycle assessment of supermarket carrier bags (PDF) they found that non-woven polypropylene bags needed to be re-used at least 11 times to have lower global warming potential than single-use HDPE, or High-Density Poly-Ethylene, bags. Cotton shopping bags need to be used at least 131 times. Paper bags were the big losers. They aren't likely to survive the 4 uses needed to reach the same global warming potential, but are much more toxic to produce than plastic."
11 uses of a reusable bag is not a tough threshold to hit. I've got one I know dates from 2018 that's still in daily use and has crossed the Pacific several times.
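To make the arithmetic behind those thresholds concrete, here's a minimal sketch of the break-even calculation; the GWP numbers in it are placeholders I've made up for illustration, not the UK Environment Agency's actual figures:

```python
import math

def break_even_uses(gwp_reusable: float, gwp_single_use: float) -> int:
    """Smallest number of uses at which a reusable bag's per-use global
    warming potential (GWP) drops to that of a single-use bag."""
    return math.ceil(gwp_reusable / gwp_single_use)

# Hypothetical production footprints, normalized so a single-use HDPE bag = 1.
# These are illustrative placeholders, not the study's actual data.
print(break_even_uses(gwp_reusable=11.0, gwp_single_use=1.0))   # 11 uses
print(break_even_uses(gwp_reusable=131.0, gwp_single_use=1.0))  # 131 uses
```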
> no State or political subdivision thereof may enforce any law or regulation regulating artificial intelligence models, artificial intelligence systems, or automated decision systems during the 10 year period beginning on the date of the enactment of this Act
States never got to control Federal spending, AI or otherwise.
But the Tenth Amendment pretty strictly limits how much the Feds can control state spending and legislation, too.
This is a straightforward declaration of Commerce Clause authority. This SCOTUS has made it clear the “Dormant Commerce Clause” is not stirring awake, so if Congress wants to preempt state regulation of interstate commerce they have to do so explicitly.
Reading basic history shows it's always been this way. As a simple historical example, the soon-to-be Confederate states complained about "states' rights" when it came to slavery, but when they seceded they enshrined slavery in their constitution and notably didn't leave it up to their states (so clearly that institution was more important to them than state autonomy). It's always been a convenient veneer over policy.
Const. of C.S.A. art. I, § 9, ¶ 4 restricted their federal legislature's power:
> No bill of attainder, ex post facto law, or law denying or impairing the right of property in negro slaves shall be passed.
The next section similarly restricted the states' power to "pass any bill of attainder, or ex post facto law" but did not reference slavery.
https://www.thenation.com/article/archive/exclusive-lee-atwa...
But now that you have let me know I am racist and transphobic because I believe the 10th Amendment exists, I've got some soul searching to do. My whole life is a lie; will someone gently let Pennsylvania know they don't have rights?
But the political slogan "states' rights" has historically significant usage and connotations that go far beyond that simple fact.
For example, states can allow people under age 21 to drink, and they own the interstate highways that run through them.
But the Feds can refuse to pay for the highways if they do.
Seems much less relevant to me, but maybe my thinking is too small-minded (probably because I don't believe LLMs are a path to AGI).
This is an outright lie. The relevant bit of legislation is cited in the article:
"no State or political subdivision thereof may enforce any law or regulation regulating artificial intelligence models, artificial intelligence systems, or automated decision systems during the 10 year period beginning on the date of the enactment of this Act"
> States are free to raise their own taxes and spend them how they see fit.
The language above very clearly forbids them from spending said tax revenue on regulating AI.
The same folks have long been salty that California sets higher standards for vehicle emissions (https://en.wikipedia.org/wiki/Emission_standard#State-level_...) and are looking to kneecap that sort of action here.
We have an opportunity here to set rules that cars should yield to rapid-transit public buses, that vehicles should behave in ways that increase the flow of traffic, and so on. There are many options for setting rules that autonomous vehicles must follow that are in the best interests of the public, not just the rider.
Correct, the Republican Party does not want anyone to be able to regulate that.
> It's just going to be everyone for themselves, the vehicles will just follow rules meant for humans?
The vehicles will follow whatever rules are in the best interest of the corporations that made them.
I think the simplest, clearest way to interpret this legislation is that it's a straight transfer of power from individual citizens to AI corporations.
I can only see this working if we jump straight to 100% self-driving. Otherwise, you'll have to make transitory guidelines for drivers without driverless tech, such as "yield in x situation when you see the rapid transit public bus." But if you do this, you're making the driving rules more complex and less predictable. That means you're creating more dangerous situations for drivers.
But of course, we're not going to go straight to 100% driverless. We're going to have some portion of people driving their own cars for a long time, especially in the USA.
So if a bank has an automated loan approval system that consists of a series of IF-THEN statements, and one of those statements amounts to IF (applicant.race != "White") THEN loan.reject, this ban would forbid a state from taking action?
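For concreteness, here's a minimal sketch of the kind of hypothetical rule-based system described above; the applicant fields and the rejection logic are invented for illustration, not drawn from any real lender's code:

```python
# A toy, purely illustrative "automated loan approval system" of the
# IF-THEN variety described above. The discriminatory rule is the point
# of the hypothetical: even this trivial logic seems to fall under the
# bill's "automated decision system" language.

def approve_loan(applicant: dict) -> bool:
    # Hypothetical discriminatory rule from the comment above.
    if applicant["race"] != "White":
        return False
    # Hypothetical credit-score rule.
    return applicant["credit_score"] >= 650

print(approve_loan({"race": "White", "credit_score": 700}))  # True
print(approve_loan({"race": "Black", "credit_score": 800}))  # False
```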
> New York's 2021 law mandating bias audits for AI tools used in hiring decisions would also be affected, 404 Media notes.
Suppose the bill said "no laws about horses!" Okay, then if you want to make a law regulating the manufacture of horseshoes, you target the law at "odd-toed ungulates" instead.
Seems trivial to work around since there is no legal definition of AI.
Instead of making your law specific to AI systems, you can simply make it slightly broader in scope so that it covers AI systems in practice.
For example, prohibition on AI facial recognition in public spaces -> prohibition on any computerized facial recognition
https://www.theguardian.com/us-news/2023/aug/21/artificial-i...
It's not like these laws prohibit any use of AI; they're literally basic safeguards and human-in-the-loop provisions, but the text of the bill as written would make those laws illegal.
Which is not surprising considering it comes coupled with massive cuts to Medicaid - private Medicaid plans are some of the most egregious players in terms of denials.
Here is a simple website that uses the 5calls API to get your reps and gives you a script for talking to them about this: https://www.deny-ai.com/call-your-representatives
I'm asking because my take is that totally unregulated AI will sooner or later lead to such applications. And you can't really argue that privacy laws will stop that - after all, that would hinder the progress of things like "automated decision systems".
Federal law might supersede state law in areas where the federal government has express powers, e.g. interstate commerce, but if a state is adding AI-related provisions to existing policy in an area it already has authority over, I can't imagine how Congress could attempt to suppress that.
Sure, federal law could likely supersede state law if a state is trying to restrict AI as a commercial service in itself, as that would cross into interstate commerce territory. But if a state already has regulatory authority over e.g. how insurance companies operate within their jurisdiction, adding provisions that relate to how AI is used in the process of providing insurance coverage doesn't seem like something the Congress could legitimately intervene in.
Almost certainly yes. The provision defines it as
> The term "automated decision system" means any computational process derived from machine learning, statistical modeling, data analytics, or artificial intelligence that issues a simplified output, including a score, classification, or recommendation, to materially influence or replace human decision making.
https://d1dth6e84htgma.cloudfront.net/Subtitle_C_Communicati...
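To illustrate how low that bar is, here's a minimal sketch (my own example, not from the bill or the article) of a bare-bones statistical model that already "issues a simplified output ... to materially influence ... human decision making":

```python
# A plain logistic regression that emits a score for a loan officer to
# consult. Features, data, and threshold are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical applicants: (years of credit history, debt-to-income ratio).
X = np.array([[1.0, 0.6], [10.0, 0.2], [5.0, 0.3], [2.0, 0.7]])
y = np.array([0, 1, 1, 0])  # 0 = defaulted, 1 = repaid

model = LogisticRegression().fit(X, y)

# A single probability score shown to a human reviewer is enough: it's
# "derived from statistical modeling" and issues a simplified output
# meant to influence a human decision.
score = model.predict_proba(np.array([[4.0, 0.4]]))[0, 1]
print(f"recommended approval score: {score:.2f}")
```

Under the quoted definition, it doesn't seem to matter whether the system is a frontier LLM or a fifteen-line scikit-learn script.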