Yesterday the Trump Administration issued an Executive Order (EO), "Ensuring a National Policy Framework for Artificial Intelligence," that directs the Department of Justice, the Department of Commerce, the Federal Communications Commission, the Federal Trade Commission, and other agencies to take action to preempt State AI laws.
This EO comes in the wake of unsuccessful attempts by some Republican legislators to preempt State-level AI regulation by attaching an "AI moratorium" to must-pass legislation earlier this year. Last month, rumors swirled of an attempt to include an AI moratorium in the National Defense Authorization Act or other legislation. The EO represents a new and parallel line of attack—one that aims to use the regulatory and enforcement powers of the Executive Branch to preempt or invalidate State AI regulation.
Overview of the EO
Echoing the Administration's earlier AI Action Plan, the EO begins by explaining its intent to ensure "United States leadership in Artificial Intelligence" to "promote American national and economic security and dominance across many domains." But, the EO goes on to say, "excessive state regulation" thwarts the imperative that the U.S. beat its adversaries in the AI race because State laws (1) create a patchwork of different compliance regimes; (2) promote or require the inclusion of "ideological bias" in AI models; and (3) attempt to regulate beyond State borders. The EO specifically identifies Colorado's AI Act as "cumbersome regulation" that could "embed ideological bias within models." The stated goal of the EO is a "minimally burdensome national standard" for AI that also ensures that "children are protected, censorship is prevented, copyrights are respected, and communities are safeguarded." With State legislatures across the country considering a wide range of AI-related legislation, the EO sets a policy to "sustain and enhance America's global AI dominance through a minimally burdensome national policy framework for AI."
The EO charges federal actors and agencies to take the following actions:
- AI Litigation Task Force: The Attorney General is instructed to establish a Litigation Task Force to challenge State AI laws as a violation of the Commerce Clause, on preemption grounds, or through other legal theories.
- Federal Evaluation of "Onerous" State AI Laws: The Secretary of Commerce shall, within 90 days of the EO's issuance, "publish an evaluation of existing State AI laws that identifies onerous laws that conflict with the policy set forth" in the EO, as well as "laws that should be referred" to the Task Force.
- Restrictions on State Grant Funding: The EO directs executive departments and agencies to "assess their discretionary grant programs" and "determine whether agencies may condition such grants" on States not enacting an AI law or, for States that have already enacted AI laws, signing a binding agreement not to enforce them. In particular, the EO directs the Secretary of Commerce to restrict federal non-deployment funding under the Broadband Equity, Access, and Deployment ("BEAD") Program for States determined, following the federal review, to have burdensome AI laws. The BEAD Program, funded by the Infrastructure Investment and Jobs Act (2021), is a $42.5 billion federal grant program that aims to expand high-speed internet access.
- Reporting & Disclosure Standard: The FCC Chairman shall "initiate a proceeding to determine whether to adopt a Federal reporting and disclosure standard for AI models."
- Policy Statement Targeting AI Outputs: The FTC Chairman shall issue a policy statement articulating how the FTC Act's prohibition on unfair and deceptive acts and practices applies to AI model outputs. The statement will consider whether State laws that "require alterations to the truthful outputs of AI models are preempted."
- New Legislation: The Special Advisor for AI and Crypto and the Office of Legislative Affairs shall craft legislation to preempt State AI laws. The legislative recommendation "shall not propose preempting otherwise lawful State AI laws" relating to child safety protections; AI compute and data center infrastructure, other than generally applicable permitting reforms; State government procurement and use of AI; and other topics as shall be determined.
What Comes Next?
The EO kicks off agency processes and will likely spur litigation that will provide more details on federal AI policy and determine which State AI laws the federal government plans to target for preemption or other legal challenges. As those processes play out, organizations impacted by AI regulation and other stakeholders should monitor them closely even as they prepare to comply with the numerous State AI laws going into effect in the coming year. Some of the big questions include:
- Targeted State Laws: Which measures from which States will the Department of Commerce include on its list of "onerous" State AI laws? That list, which is due to be published by March 11, 2026, will be an important initial indication of how broadly the Administration's policy will sweep. The Trump Administration likely will target the most comprehensive AI legislation out of Colorado and California, but it remains to be seen whether the list will include the scores of other State laws that have been passed related to AI chatbot disclosures, watermarking and content provenance data, deepfakes, and transparency.
- Nature of Federal AI Policy: The EO emphasizes that winning the competition against China for AI leadership is a national security priority that demands a "minimally burdensome" approach to regulation. But beyond that "minimally burdensome" approach, what, precisely, are the details of that forthcoming policy regime, and which agencies will be responsible for defining it?
- The EO's directions to the FCC and FTC may be instructive in this regard. It directs the FCC to open a proceeding that could result in new federal policies addressing AI disclosures and reporting; and it directs the FTC to issue a policy statement considering the application of current federal law on deceptive trade practices to State laws that may require "alterations to the truthful outputs of AI models."
- Potential Conflict with the States on Child Safety and Other Key Issues: Leading up to the issuance of the EO, a bipartisan group of Governors and State Attorneys General, including several prominent Republicans, expressed their strong opposition to any effort by the Federal government to preempt State AI laws on key areas of concern, including child safety and copyright protections. Language in the Purpose section of the EO appears intended to assuage those concerns by stating that the minimally burdensome national standard for AI should allow for children to be protected, censorship to be prevented, and copyrights to be respected. However, this language is ambiguous, and the EO offers no explicit promise that the Attorney General or Federal agencies will not target State AI laws regulating those subjects if they are viewed as too onerous. Additionally, the language noted above regarding the legislation that the White House will draft makes clear that federal preemption legislation should carve out certain areas for continued State regulation, including child safety protections; AI compute and data center infrastructure, other than generally applicable permitting reforms; State government procurement and use of AI; and other topics as shall be determined. As such, it remains to be seen to what degree the Federal government will target State laws on these topics. Any intrusion on State authority to regulate child safety or other highly salient issues in AI is likely to be met with aggressive pushback from both Republican and Democratic State officials.
- Preemption: The EO instructs various agencies to take action to build out and enforce federal policy, so preemption challenges may vary significantly depending on the type of State AI law at issue and the content of the federal policy (or related agency action) cited as the basis for preemption. One preemption theory explicitly described in the EO is that the FTC Act preempts state laws that would require AI companies to adjust "truthful outputs." The FCC is also instructed to determine whether to adopt a federal "reporting and disclosure standard" for AI models.
- Litigation Task Force Strategy: Standing up a new litigation task force will require resources. Will the DOJ Task Force created by the EO have sufficient personnel and other resources, and how will DOJ approach its litigation responsibilities—State-by-State or issue-by-issue? Will it seek to litigate on multiple fronts or focus on single (perhaps more high-profile) targets, with the expectation that other States will modify their own agendas accordingly?
- Constitutional Challenges: Many States—some Republican-leaning and some Democratic-leaning—are expected to oppose the EO on federalism and anti-commandeering grounds. There could also be First Amendment implications to the EO's instruction to evaluate AI laws by assessing whether they require the alteration of "truthful outputs." But it remains to be seen whether States will affirmatively challenge the EO's directives or wait to defend their own AI laws if they are challenged in court by the Attorney General's Litigation Task Force or threatened by other agency processes triggered by the EO. There may also be questions surrounding DOJ's authority to initiate certain types of actions to protect the United States' sovereign interests against States.
As these agency processes and litigation unfold, regulated companies and stakeholders across sectors should prepare for a potentially tumultuous recalibration of the Federal-State equilibrium in AI governance. Given the significant consequences, organizations should continue to evaluate opportunities to shape this process, whether in administrative proceedings, policy advocacy to State or federal lawmakers, or in litigation.