According to CNET, President Donald Trump signed an executive order on Thursday, December 4, 2025, that aims to block state regulations on artificial intelligence. The order, titled “Ensuring a National Policy Framework for Artificial Intelligence,” argues that state laws create a problematic regulatory patchwork and singles out Colorado’s rules on “ideological bias” in AI models. It directs the administration to set up an AI litigation task force within 30 days to challenge state laws. Within 90 days, Commerce Secretary Howard Lutnick must publish a report identifying state laws that conflict with the order or violate the Constitution. The order also threatens to withhold broadband development funding from states that don’t comply, though it exempts certain “lawful” state laws on child safety and government procurement.
The Legal Battle Just Began
Here’s the thing: this is basically a declaration of war on statehouses. And it’s happening because Congress, for the second time, declined to pass legislation that would do exactly this. So the White House is reaching for executive authority instead. But legal experts, like Travis Hall of the Center for Democracy & Technology, are already saying the power to preempt state law “rests firmly with Congress.” This is going to end up in court, probably sooner rather than later. The administration’s new task force isn’t just for show; it’s a signal that they’re ready to sue.
What States Are Actually Doing
Now, why is this such a big deal? Look at what states are already doing. It’s not some abstract debate. Florida passed a law making it a crime to create AI-generated sexual images without consent. Arizona has a law preventing AI from being used to automatically deny health insurance claims. These are direct responses to real, documented harms people are facing. The executive order says it wants to protect communities, but it’s simultaneously trying to kneecap the governments closest to those communities. It’s a massive contradiction.
The Big Tech Play
And who benefits from a single, federal framework? Big tech companies, for one. The article notes that firms like Google, Meta, and OpenAI have been lobbying for national standards. Can you blame them? Dealing with one regulator is a lot cheaper and easier than navigating 50 different sets of rules. For developers and enterprises, the promise of simplicity is tempting. But the risk is that a federal standard could become a lowest-common-denominator policy, weaker than what many states want. It’s a classic tension between innovation and protection, and right now, the order is squarely on the side of “innovation” at all costs.
What Happens Next
So what now? The next 90 days are critical. Secretary Lutnick’s report will effectively be the hit list, showing which state laws are in the crosshairs. The threat to withhold broadband money is a serious stick; states rely on that funding. But 35 states and D.C. just told Congress to back off, warning of “disastrous consequences” if their power to regulate AI is blocked. You think they’re just going to roll over because of an executive order? Unlikely. This sets up a messy, protracted fight in which the only certainty is legal uncertainty. And in the fast-moving world of AI, that might be the most damaging outcome of all.
