If you ever wanted to obliterate any consumer confidence in a market that's already routinely mocked, loathed, and derided... I can think of no better way than to ensure it is fecklessly unaccountable to any sort of regulation.
"What could possibly go wrong?" is no longer an exclamation but an enumeration, I guess. Everything you can think of probably will. Could things like this be repealed when someone who knows what they're doing steps in?
Since it's a reconciliation bill, is this likely to make it past the "Byrd bath"? It's looped in with a $500M AI modernization fund but my simplified understanding is that items not related to budget can be challenged and removed. Couldn't find reference to this in any of a few news articles.
State and local governments cannot regulate. This means the President can still issue executive orders, e.g., against AI wokeness. Republicans very much wanted to regulate just one year ago:
Cue the usual remarks about "automated decision systems": is the PID controller in an espresso machine an automated decision system, is a pacemaker, Cochlear implant, fuzzy logic controller in a rice cooker, etc.
> Cue the usual remarks about "automated decision systems": is the PID controller in an espresso machine an automated decision system, is a pacemaker, Cochlear implant, fuzzy logic controller in a rice cooker, etc.
Almost certainly yes. The provision defines it as
> The term "automated decision system" means any computational process derived from machine learning, statistical modeling, data analytics, or artificial intelligence that issues a simplified output, including a score, classification, or recommendation, to materially influence or replace human decision making.
The big difference here is that leading AI companies are primarily data companies. If Amazon, Meta, Google, what have you decides they want to develop AI products for that insurance market, and they get to do so with zero oversight - in the name of "but we have to beat China!" - I can foresee a number of ways it will end up badly for the consumer.
The interstate highway system was considered to be allowed under the power to legislate for national defense. AI development doesn't seem to be less relevant to defense than roads.
That system depends on pulling funding for roads if they don’t follow the rules. Technically any state can opt out if they don’t receive any highway funding. Given the government isn’t giving large AI funding to states, they can’t do the same here.
The Commerce Clause has been read so broadly thanks to Wickard v Filburn that almost everything falls under it by default. The current Supreme Court seems at least skeptical of that interpretation but it is difficult to say if they will ever change it.
That's the broad interpretation they are talking about. The Commerce Clause isn't just "is it commerce". The federal government doesn't (didn't) have control of all commerce by default.
Establishing uniform standards for regulation of commerce is squarely within the wheelhouse of the Commerce Clause. It's not the stretch you're trying to portray it as.
It is the federal government making rules about how federal money can be spent. Why is this wrong? States are free to raise their own taxes and spend them how they see fit. If they want federal funding, then they must cooperate with federal rules. Seems logical.
> It is the federal government making rules about how federal money can be spent.
This is an outright lie. The relevant bit of legislation is cited in the article:
"no State or political subdivision thereof may enforce any law or regulation regulating artificial intelligence models, artificial intelligence systems, or automated decision systems during the 10 year period beginning on the date of the enactment of this Act"
> States are free to raise their own taxes and spend them how they see fit.
The language above very clearly forbids them from spending said tax revenue on regulating AI.
You’re saying the same thing twice. One just more blatantly than the other. The point of calling someone a lunatic is to undermine their opinion. It’s ad hominem either way even if one is more subtle.
> The point of calling someone a lunatic is to undermine their opinion.
I don't see that as the point. I think they were just opining. There doesn't seem to be an argument they are making or refuting so this cannot, by definition, be an "argument ad hominem" (argument to the person).
The parent comment to yours nails it with the given examples. The first is the conclusion to an argument -- which they did not expound on -- that "this opinion is lunacy"; the second is a specific argument that "you are a lunatic" with the conclusion that therefore "your opinion is lunacy". The latter is an argument to the person.
(Not to speak to the quality of OP's comment for this message board. A comment explaining why they think it's crazy would be preferred.)
Idk what Libertarian dreamland you people live in. Go try and live in a country with no regulations for things like pollution, hazardous waste, aviation safety, etc, etc and let us know how long you last. Powerful technologies necessarily can be dangerous. AI is going to be a major change driver and honestly if you deny that it's an incredibly powerful technology then I don't know what else to say.
> Do you really think having 50 completely different regulatory regimes is somehow better?
Yes, that is a less bad problem than a law that bans any enforcement of such regulations. 50 different sets of regulation just means that companies will change policy to adhere to the most strict of them (presumably California's) and be done with it.
If you read the article, it is clear that it doesn't block any and all regulation of AI. It says states cannot make federal funding follow non-federal rules around AI. The federal government may actually have more regulations than states, and this would require states to do a better job.
IN GENERAL.-Except as provided in paragraph (2), no State or political subdivision thereof may enforce any law or regulation regulating artificial intelligence models, artificial intelligence systems, or automated decision systems during the 10-year period beginning on the date of the enactment of this Act.
> The federal government may actually have more regulations than states
There would be a chance of this being true if the language didn't bar states from enforcing their own regulation; there's no point to that except for a worry that some states' regulation will be more strict.
I don't love heavy-handed approaches like this, but it does seem very in line with small government politics.
Essentially, this is saying that the executive can't create regulations that limit what businesses can do (which would be relevant when the party in power of the executive changes)
The executive doesn't care what the law says. This law constrains the states, the opposite of what the GOP believes in almost every other case where the states are sufficiently -ist/-phobic.
This is so short sighted... autonomous vehicles including buses and trucks are on their way to our streets. We don't want to create rules and govern how this is going to work on our public roads? It's just going to be everyone for themselves, the vehicles will just follow rules meant for humans?
We have an opportunity here to set rules that cars should yield to rapid transit public buses, that vehicles should behave in ways to increase the flow of traffic, etc etc... there are many options for setting rules that autonomous vehicles must follow which is in the best interests of the public not just the rider.
> We don't want to create rules and govern how this is going to work on our public roads?
Correct, the Republican Party does not want anyone to be able to regulate that.
> It's just going to be everyone for themselves, the vehicles will just follow rules meant for humans?
The vehicles will follow whatever rules are in the best interest of the corporations that made them.
I think the simplest, clearest way to interpret this legislation is that it's a straight transfer of power from individual citizens to AI corporations.
> We have an opportunity here to set rules that cars should yield to rapid transit public buses, that vehicles should behave in ways to increase the flow of traffic, etc etc...
I can only see this working if we jump straight to 100% self-driving. Otherwise, you'll have to make transitory guidelines for drivers without driverless tech, such as "yield in x situation when you see the rapid transit public bus." But if you do this, you're making the driving rules more complex and less predictable. That means you're creating more dangerous situations for drivers.
But of course, we're not going to go straight to 100% driverless. We're going to have some portion of people driving their own cars for a long time, especially in the USA.
Could have special lanes for self-driving cars that adhere to certain protocols, like we have today for HOV lanes.
[flagged]
[flagged]
And that opinion is why the public transit system in almost every American city is barely functional.
We’re not allowed to have nice things because some car driver might be slightly inconvenienced on occasion.
why not?
Because the average American is pathologically selfish.
s/American/human/g
What about fire engines?
[flagged]
I don't even necessarily agree with them, but what's so terrible about that opinion that you'd say it's awful and should be flagged and legislated against?
> automated decision systems
So if a bank has an automated loan approval system that consists of a series of IF-THEN statements, and one of those statements amounts to IF (applicant.race != "White"), loan.reject; this ban would forbid a state from taking action?
No because there are other laws that have nothing to do with "automated decision systems" which prohibit discrimination based on protected class.
How could you prove discrimination if you can't audit the decision making system? The article mentions NY regulations regarding bias audits in systems used to make hiring decisions as one casualty of this new law.
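For what it's worth, outcome-level audits don't require access to the system's internals at all. A hedged sketch (the data is invented, and the four-fifths heuristic below is a common screening test for disparate impact, not legal proof of discrimination):

```python
# Hypothetical black-box bias audit: without seeing the decision system's
# IF-THEN rules or model weights, compare outcome rates across groups.
# The "four-fifths rule" (an EEOC screening heuristic) flags any group whose
# selection rate falls below 80% of the best-treated group's rate.

from collections import defaultdict

def selection_rates(decisions: list[tuple[str, bool]]) -> dict[str, float]:
    """decisions: (group, approved) pairs -> approval rate per group."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += ok
    return {g: approved[g] / totals[g] for g in totals}

def four_fifths_flags(rates: dict[str, float]) -> list[str]:
    """Groups whose rate is below 80% of the highest group's rate."""
    best = max(rates.values())
    return sorted(g for g, r in rates.items() if r < 0.8 * best)

# Invented example data: group A approved 90/100, group B approved 50/100.
decisions = [("A", True)] * 90 + [("A", False)] * 10 \
          + [("B", True)] * 50 + [("B", False)] * 50
rates = selection_rates(decisions)
print(rates)                     # {'A': 0.9, 'B': 0.5}
print(four_fifths_flags(rates))  # ['B'] -- flagged for disparate impact
```

Roughly this kind of selection-rate comparison is what the NY bias-audit law asks for; banning such laws removes even this black-box check.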
> New York's 2021 law mandating bias audits for AI tools used in hiring decisions would also be affected, 404 Media notes.
Just change the law to mandate bias audits for all hiring decision software, whether it is automated or not.
Wouldn't that changed law encompass the automated software as well, thereby being in violation? I am not a lawyer, but the ridiculously broad language in the spending bill doesn't seem to leave much room for that sort of thing.
The spending bill thing says the law can't target X, so you target a superset of X instead.
Suppose the bill said "no laws about horses!". Okay then if you want to make a law regulating the manufacture of horse shoes, you target the law to "odd-toed ungulates" instead.
You sue them and can find out in discovery.
Oh cool all I have to do is go sue a giant bank, I’ll let you know how that goes
Is this constitutional? It sounds like a pretty clear breach of the anti-commandeering doctrine. The federal government can't simply issue commands to state legislatures.
Federal law might supersede state law in areas where the federal government has express powers, e.g. interstate commerce, but if a state is adding AI-related provisions to existing policy in an area it already has authority over, I can't imagine how Congress could attempt to suppress that.
Sure, federal law could likely supersede state law if a state is trying to restrict AI as a commercial service in itself, as that would cross into interstate commerce territory. But if a state already has regulatory authority over e.g. how insurance companies operate within their jurisdiction, adding provisions that relate to how AI is used in the process of providing insurance coverage doesn't seem like something the Congress could legitimately intervene in.
This is going to be disastrous for hospitals and doctors, because they're facing a massive surge of (likely AI powered) denials and individual states are regulating it - this would ban that.
It's not like the laws prohibit any use of AI, it's literally basic safeguards and human in the loop provisions but the text of the bill as written would make those laws illegal.
Which is not surprising considering it comes coupled with massive cuts in Medicaid - private Medicaid plans are some of the most egregious players in terms of denials.
Here is a link to some info about what states have been doing: https://stateline.org/2025/03/25/states-try-to-rein-in-healt...
Here is a simple website which uses the 5calls API to get your reps and gives you a script to talk to them about this https://www.deny-ai.com/call-your-representatives
> regulating artificial intelligence models, artificial intelligence systems, or automated decision systems
Seems trivial to work around since there is no legal definition of AI.
Instead of making your law specific to AI systems, you can simply make it slightly broader in scope so it includes AI systems in practice.
For example, prohibition on AI facial recognition in public spaces -> prohibition on any computerized facial recognition
“Automated decision systems” seems pretty broad to me. It would potentially also include a lot of non-AI systems. See, for example, https://en.wikipedia.org/wiki/Automated_decision-making.
The court system is responsible for interpreting law so playing semantic games doesn't work very well unless you have a corrupt judge who sides with you and is willing to interpret it in this way
I don't think it'd be difficult to blur the lines in terms of implementation details in order to "AI launder" desired functionality in such a manner where the "AI" distinction becomes a philosophical debate between SMEs.
I think you can blur the lines to an extent and if the judge isn't very capable, pull one over on them, but to say an LLM or ML model is not artificial intelligence to a capable judge with a prosecution who points out there is an LLM, it would be hard to get around imo
"playing semantic games" is a euphemism for the entire practice of law.
I suppose favoring "state's rights" over federal regulation is only a concern for the GOP when they're not getting big tech lobbyist money.
I think it's good to realize that many people's commitment to "American" values is weak at best. Things like state's rights, equal representation in government, and even "freedom of speech" are often political tools rather than actual values.
Reading basic history shows it's always been this way. As a simple historical example the soon to be Confederate states complained about "state's rights" for slavery but when they seceded they enshrined slavery in their constitution and notably didn't leave it up to their states (so clearly that institution was more important to them than state autonomy). It's always been a convenient veneer over policy.
Very interesting, but are you sure about that example?
Const. of C.S.A. art. I, § 9, ¶ 4 restricted their federal legislature's power:
> No bill of attainder, ex post facto law, or law denying or impairing the right of property in negro slaves shall be passed.
The next section similarly restricted the states' power to "pass any bill of attainder, or ex post facto law" but did not reference slavery.
One thing's consistent though: a bunch of rich guys banding together to lower their taxes.
It's mostly about where they have and don't have power. The goal is acquisition of power, not some kind of principled stand.
The point they are making is that for decades GOP would cry states' rights whenever Democrats did something at the federal level but whenever they are in power, states' rights suddenly don't matter.
And we should ignore those cries and not discuss them like there is anything deeper.
That's not possible. The Republican propaganda machine is very effective and there are many supporters who will vote with the GOP because "states' rights" but will simultaneously vote for all these policies. I don't think the fiercely logical part of HN understands politics and thinks you can just out debate someone into changing their mind but if it was that simple, the GOP would've died out decades ago.
People who love dunking on conservatives on twitter don’t realize that saying absurd stuff and then watching liberals wind themselves into a tizzy is like 95% of the GOP’s media strategy.
You're being downvoted but you're right and if people spent any time in conservative spaces they would know this. "Owning the libs" has become a meme but this is the point to a lot of what Republicans do.
This is far too sweeping, but when you have California seemingly intent on smothering our AI industry in its crib it makes sense that they’re scared.
That said, I think it’d be smarter of the GOP to let California do just that. It’s a chance to move that tech money out of California and into another more regulation friendly state.
Waymo seems to be operating smoothly in San Francisco. OpenAI's headquarters are also there. Many AI startups are also based in San Francisco, California.
Right, but you might want to look at the regulations their lawmakers have been proposing. If put in place it would put a stop to that pretty much immediately.
Or California could trailblaze proper regulation! Thanks for posting about the efforts! I'm going to see if I can support them in any way.
If you could get the whole world on board, sure. However, many other countries aren't going to play ball. When there's a chance (even if it's tiny) that AGI is coming that becomes a huge matter of national security, nevermind economic dominance.
California is not a reliable entity to entrust with such lofty ambitions.
This is the same state that banned plastic bags to "save the environment" - did they mandate paper bags then? Renewable, compostable, organic paper? No! They allowed plastic bags to be replaced with... Super thick plastic bags! Which I assure you, stores go through at least 80% as many as before because people usually don't bring bags, but now they're 4-5x the plastic.
And they added a ton of regulation on straws based on that literal child's insane napkin math that went viral, that claimed that America uses 7 or 8 straws per man, woman, and child, per day. Now we get to use multiple paper straws that dissolve in your cup immediately.
California is awash in best-intentions, but utterly useless and counterproductive, regulation. Just another downside to one-party rule. Neither party does a good job with zero counterbalance to their power and ideas.
> This is the same state that banned plastic bags to "save the environment" - did they mandate paper bags then? Renewable, compostable, organic paper? No!
https://skeptoid.com/episodes/4460
"When the UK Environment Agency did a life cycle assessment of supermarket carrier bags (PDF) they found that non-woven polypropylene bags needed to be re-used at least 11 times to have lower global warming potential than single-use HDPE, or High-Density Poly-Ethylene, bags. Cotton shopping bags need to be used at least 131 times. Paper bags were the big losers. They aren't likely to survive the 4 uses needed to reach the same global warming potential, but are much more toxic to produce than plastic."
11 uses out of a reusable bag is not a tough threshold to hit. I've got one I know is from 2018 in daily use still and has crossed the Pacific several times.
"States' rights" has always been coded language. Lee Atwater's post-Nixon interview gave away the playbook. The hypocrisy is easy to see in that lens. First it started with racial slurs, then welfare queens, big government, states' rights, with the occasional "liberty and freedom" thrown in for good measure. Currently it's DEI and trans.
https://www.thenation.com/article/archive/exclusive-lee-atwa...
The states do have rights though, Madison wrote a federalist paper about it and these so called "States" even got their own amendment.
But now that you have let me know I am racist and transphobic because I believe the 10th Amendment exists, I've got some soul searching to do. My whole life is a lie; will someone let Pennsylvania know gently they don't have rights?
Sure, states have rights.
But the political slogan "states' rights" has historically significant usage and connotations that go far beyond that simple fact.
It’s not coded, it’s just bs. Watching democrats cry hypocrite while also resharing the article/tweet/news clip is literally their media strategy.
States' rights don't include control over federal spending, even for someone in the GOP.
I mean, obviously. But the provision says:
> no State or political subdivision thereof may enforce any law or regulation regulating artificial intelligence models, artificial intelligence systems, or automated decision systems during the 10 year period beginning on the date of the enactment of this Act
States never got to control Federal spending, AI or otherwise.
But the Tenth Amendment pretty strictly limits how much the Feds can control state spending and legislation, too.
This doesn’t have anything to do with the 10th Amendment (little does).
This is a straightforward declaration of Commerce Clause authority. This SCOTUS has made it clear the “Dormant Commerce Clause” is not stirring awake, so if Congress wants to preempt state regulation of interstate commerce they have to do so explicitly.
> This doesn’t have anything to do with the 10th Amendment (little does).
Sure it does.
It's basically https://en.wikipedia.org/wiki/South_Dakota_v._Dole again.
The Feds can't say "you can't regulate in a way we don't like" to the states. They can apply "reasonable conditions" along the lines of "if you do x we will take away related funding y" but the entire point of the Tenth Amendment is that states have more rights than the Feds unless otherwise stated.
Federalism as protected by the Tenth means California can require things like "may contain lead" on labels, even though some of those products may be sold outside of California.
The people that are championing this sort of stuff, what's your take on social credit systems (like in China), or just total surveillance?
I'm asking because my take is that totally unregulated AI will sooner or later lead to such applications. And you can't really argue that privacy laws will stop that; after all, those would hinder the progress of things like "automated decision systems".
If you ever wanted to obliterate any consumer confidence in a market that's already routinely mocked, loathed, and derided... I can think of no better way than to ensure it is fecklessly unaccountable to any sort of regulation.
"What could possibly go wrong?" is no longer an exclamation but an enumeration, I guess. Everything you can think of probably will. Could things like this be repealed when someone who knows what they're doing steps in?
Since it's a reconciliation bill, is this likely to make it past the "Byrd bath"? It's looped in with a $500M AI modernization fund but my simplified understanding is that items not related to budget can be challenged and removed. Couldn't find reference to this in any of a few news articles.
I like how this is the exact opposite of what the EU's doing: https://artificialintelligenceact.eu/
State and local governments cannot regulate. This means the President can still issue executive orders, e.g., against AI "wokeness". Republicans very much wanted to regulate just one year ago:
https://www.theguardian.com/us-news/2023/aug/21/artificial-i...
The GOP is trying any way they can to bring about the end times
They don't realize they're all going to hell.
Cue the usual remarks about "automated decision systems": is the PID controller in an espresso machine an automated decision system? Is a pacemaker? A cochlear implant? The fuzzy-logic controller in a rice cooker?
If a company has the ability to say some magic words and remove all regulation, those words are gonna be said in every possible case.
> Cue the usual remarks about "automated decision systems": is the PID controller in an espresso machine an automated decision system, is a pacemaker, Cochlear implant, fuzzy logic controller in a rice cooker, etc.
Almost certainly yes. The provision defines it as
> The term "automated decision system" means any computational process derived from machine learning, statistical modeling, data analytics, or artificial intelligence that issues a simplified output, including a score, classification, or recommendation, to materially influence or replace human decision making.
https://d1dth6e84htgma.cloudfront.net/Subtitle_C_Communicati...
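To make the grandparent's espresso-machine example concrete: a textbook PID loop issues a single numeric output that replaces a human decision (how hard to run the heater). Whether that makes it "derived from statistical modeling" under the bill's definition is exactly the ambiguity the thread is pointing at. A purely illustrative sketch, with no claim about how courts would read the text:

```python
class PID:
    """Textbook PID temperature controller, as in an espresso machine boiler."""

    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measured, dt):
        error = self.setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        # The "simplified output": one number controlling the heater,
        # with no human in the loop.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

boiler = PID(kp=2.0, ki=0.1, kd=0.5, setpoint=93.0)  # 93 C brew temperature
power = boiler.update(measured=85.0, dt=1.0)
```

There's no machine learning here at all, yet it plainly "issues a score ... to replace human decision making" — which is why a company motivated to escape state regulation would argue the definition covers (or doesn't cover) whatever suits it.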
The big difference here is that leading AI companies are primarily data companies. If Amazon, Meta, Google, what have you decides they want to develop AI products for that insurance market, and they get to do so with zero oversight - in the name of "but we have to beat China!" - I can foresee a number of ways it will end up badly for the consumer.
Startup idea #34932: AI-enabled espresso-machine which adjusts caffeine levels based on biometrics (heart rate, dark circles under eyes, jitters, etc)
Any lawyers on HN - Is this even legal in the first place? Surely this is a 10th Amendment violation?
The interstate highway system was considered to be allowed under the power to legislate for national defense. AI development doesn't seem to be less relevant to defense than roads.
The interstate system is not exclusively under Federal control.
For example, states can allow people to drink under age 21, on the interstate highways they own.
But the Feds can refuse to pay for the highways if they do.
https://en.wikipedia.org/wiki/South_Dakota_v._Dole
That system depends on pulling road funding from states that don't follow the rules. Technically any state can opt out by forgoing highway funding. Since the federal government isn't giving states large AI funding, it can't use the same lever here.
> AI development doesn't seem to be less relevant to defense than roads.
Seems much less relevant to me but maybe my thinking is too small minded (probably because I don't believe LLMs are a path to AGI).
That sorta thing doesn't really matter anymore
AI regulation arguably falls under the Commerce Clause.
The Commerce Clause has been read so broadly thanks to Wickard v Filburn that almost everything falls under it by default. The current Supreme Court seems at least skeptical of that interpretation but it is difficult to say if they will ever change it.
We're talking about AI use by corporations, right? That's pretty directly a commerce issue.
That's the broad interpretation they are talking about. The Commerce Clause isn't just "is it commerce". The federal government doesn't (didn't) have control of all commerce by default.
Establishing uniform standards for regulation of commerce is squarely within the wheelhouse of the Commerce Clause. It's not the stretch you're trying to portray it as.
It is federal government making rules about how federal money can be spent. Why is this wrong? States are free to raise their own taxes and spend them how they see fit. If they want federal funding, then they must cooperate with federal rules. Seems logical.
> It is federal government making rules about how federal money can be spent.
This is an outright lie. The relevant bit of legislation is cited in the article:
"no State or political subdivision thereof may enforce any law or regulation regulating artificial intelligence models, artificial intelligence systems, or automated decision systems during the 10 year period beginning on the date of the enactment of this Act"
> States are free to raise their own taxes and spend them how they see fit.
The language above very clearly forbids them from spending said tax revenue on regulating AI.
The same folks have long been salty that California sets higher standards for vehicle emissions (https://en.wikipedia.org/wiki/Emission_standard#State-level_...) and are looking to kneecap that sort of action here.
[flagged]
I will echo the sibling comment here. What's the correct way of looking at this then?
Go on, give us the correct framing.
You're going to be waiting a while, I think.
[flagged]
Depends on your perspective.
If you are the owner of one or more AI companies, it could be very good for you.
How about more of your thought process and reasoning instead of a blanket ad hominem attack?
That's not an ad hominem attack.
Calling someone a name is not an ad hominem.
Using something about someone to undermine an argument is an ad hominem.
People against AI regulations are lunatics: Not ad hominem. That guy is a lunatic so his opinion about AI regulations are also insane: Ad hominem.
You’re saying the same thing twice. One just more blatantly than the other. The point of calling someone a lunatic is to undermine their opinion. It’s ad hominem either way even if one is more subtle.
> The point of calling someone a lunatic is to undermine their opinion.
I don't see that as the point. I think they were just opining. There doesn't seem to be an argument they are making or refuting so this cannot, by definition, be an "argument ad hominem" (argument to the person).
The parent comment to yours nails it with its examples. The first is a bare conclusion ("this opinion is lunacy") offered without supporting argument; the second is the specific argument "you are a lunatic" with the conclusion "therefore your opinion is lunacy." That is an argument to the person.
(Not to speak to the quality of OP's comment for this message board. A comment explaining why they think it's crazy would be preferred.)
I disagree. Using such a broad definition means any negativity during a discussion can be called out as an "ad hominem" (and it happens all the time).
[flagged]
Do you really think having 50 completely different regulatory regimes is somehow better?
This isn't so bad. Also, since when did HN become pro-tech-regulation (especially pro-emerging-tech regulation)? It is a weird change.
HN shifts when tech becomes dangerous.
There has been a shift against destructive social media and addictive phones too.
Idk what Libertarian dreamland you people live in. Go try and live in a country with no regulations for things like pollution, hazardous waste, aviation safety, etc, etc and let us know how long you last. Powerful technologies necessarily can be dangerous. AI is going to be a major change driver and honestly if you deny that it's an incredibly powerful technology then I don't know what else to say.
> Do you really think having 50 completely different regulatory regimes is somehow better?
Yes, that is a less bad problem than a law that bans any enforcement of such regulations. 50 different sets of regulation just means that companies will change policy to adhere to the most strict of them (presumably California's) and be done with it.
Completely agree, 10 years is an eternity in AI development. AGI will 100% be here before then.
that's a very confident %
If you read the article, it is clear that it doesn't block any and all regulation of AI. It says states cannot make federal funding follow non-federal rules around AI. The federal government may actually have more regulations than states, and this would require states to do a better job.
Now you get a grey area when AI is being added to everything to use as an excuse to avoid state laws
Read harder. There is zero connection to receipt of federal funding.
https://d1dth6e84htgma.cloudfront.net/Subtitle_C_Communicati...
IN GENERAL.-Except as provided in paragraph (2), no State or political subdivision thereof may enforce any law or regulation regulating artificial intelligence models, artificial intelligence systems, or automated decision systems during the 10-year period beginning on the date of the enactment of this Act.
It’s an across-the-board blanket preemption.
There's no indication the Feds are going to take any initiative on this at all. They lap up Altman's pandering at Senate Hearings and will do nothing.
> The federal government may actually have more regulations than states
There would be a chance of this being true if the language didn't bar states from enforcing their own regulation; there's no point to that except for a worry that some states' regulation will be more strict.
It's also very clear this doesn't pass Tenth Amendment muster.
(Or shouldn't, at least.)
AI has significant impacts on interstate commerce, which the feds have basically carte blanche to regulate.
[flagged]
The GOP, the party of small government. Telling everybody what they can and cannot do.
I don't love heavy-handed approaches like this, but it does seem very in line with small government politics.
Essentially, this is saying that the executive can't create regulations that limit what businesses can do (which would be relevant when the party in control of the executive changes)
it specifically targets the states - the executive seems free to create those regulations by my reading?
Whoops, yep you are completely right and I was totally wrong! Bad day for my reading comprehension!
The executive doesn't care what the law says. This law constrains the states, which is the opposite of what the GOP believes in almost every other case, where the states are sufficiently -ist/-phobic.
[flagged]
[flagged]