Trump’s AI executive order promises ‘one rule book’ – startups may face legal limbo instead

US President Donald Trump displays an executive order on artificial intelligence he signed at the "Winning the AI Race" AI Summit.

President Donald Trump signed an executive order Thursday night directing federal agencies to challenge state AI laws, arguing that startups need relief from a “patchwork” of regulations. Legal experts and startups, meanwhile, say the order could prolong uncertainty and trigger legal battles that leave young companies navigating shifting state requirements while waiting to see if Congress can agree on a single national framework.

The order, titled “Securing a National Policy Framework for Artificial Intelligence,” directs the Justice Department to establish a task force within 30 days to challenge certain state laws on the grounds that AI is interstate commerce and should be federally regulated. It gives the Commerce Department 90 days to compile a list of “burdensome” state AI laws, a rating that could affect states’ eligibility for federal funds, including broadband grants.

The order also asks the Federal Trade Commission and the Federal Communications Commission to examine federal standards that could preempt state regulations, and directs the administration to work with Congress on a uniform AI law.

The order lands amid a broader push to rein in state-by-state AI regulations after congressional efforts to pause state regulation stalled. Lawmakers in both parties have argued that without a federal standard, blocking states from acting could leave consumers vulnerable and businesses largely unchecked.

“This David Sacks-led executive order is a gift to Silicon Valley oligarchs who use their influence in Washington to shield themselves and their companies from accountability,” Michael Kleinman, head of US Policy at the Future of Life Institute, which focuses on reducing extreme risks from transformative technologies, said in a statement.

Sacks, Trump’s AI and crypto czar, has been a leading voice behind the administration’s push to preempt state AI laws.

Even supporters of a national framework concede that the order does not create one. With state laws still enforceable unless courts block them or states pause enforcement, startups may face an extended transition period.


Sean Fitzpatrick, managing director of LexisNexis North America, UK and Ireland, tells TechCrunch that states will defend their consumer protection authority in court, with cases likely to escalate to the Supreme Court.

While supporters argue the order could reduce uncertainty by centralizing the fight over AI regulation in Washington, critics say the legal battles will create immediate headwinds for startups navigating conflicting state and federal requirements.

“Because startups prioritize innovation, they typically don’t have … robust regulatory governance programs until they reach a scale that requires a program,” Hart Brown, lead author of Oklahoma Governor Kevin Stitt’s Task Force on AI and Emerging Technology recommendations, told TechCrunch. “These programs can be expensive and time-consuming to meet a very dynamic regulatory environment.”

Arul Nigam, co-founder at Circuit Breaker Labs, a startup red-teaming conversational and mental health AI chatbots, echoed those concerns.

“There is uncertainty around whether [AI companion and chatbot companies] should self-regulate,” Nigam told TechCrunch, noting that the patchwork of state AI laws is hurting smaller startups in his field. “Are there open source standards they have to adhere to? Should they continue to build?”

He added that he hopes Congress can move faster now to pass a stronger federal framework.

Andrew Gamino-Cheong, CTO and co-founder of AI governance firm Trustible, told TechCrunch that the EO will backfire on the administration’s pro-innovation, pro-AI goals. “Big Tech and the big AI startups have the means to hire lawyers to help them figure out what to do, or they can simply hedge their bets,” he said, arguing that the uncertainty falls hardest on the smaller companies that can least afford it.

He added that legal ambiguity makes it harder to sell to risk-sensitive customers like legal teams, financial firms and healthcare organizations, lengthening sales cycles and driving up compliance and insurance costs. “Even the perception that AI is unregulated will reduce trust in AI,” Gamino-Cheong said, adding that trust is already low and that further erosion threatens adoption.

Gary Kibel, a partner at Davis + Gilbert, said companies would welcome a single national standard, but “an executive order is not necessarily the right vehicle to override laws that states have duly passed.” He warned that the current uncertainty leaves two extremes open: very restrictive regulations or no action at all, either of which could create a “Wild West” that favors Big Tech’s ability to absorb risks and wait things out.

Meanwhile, Morgan Reed, president of The App Association, urged Congress to quickly adopt a “comprehensive, targeted and risk-based national AI framework. We can’t have a patchwork of state AI laws, and a protracted court battle over the constitutionality of an Executive Order is no better.”