Live from the 119th Congress

AI Regulation

By Legisletter Editors

One AI bill is already law. A second just cleared the Senate. And Congress has introduced dozens more, fighting over who sets AI rules at all, whose kids are safe on chatbots, which chips can leave the country, and whether the data-center build-out gets a pause. Stay ahead of what passes, and mobilize constituent pressure.

The state of play

The TAKE IT DOWN Act, a bipartisan ban on nonconsensual intimate imagery that explicitly covers AI-made deepfakes, was signed into law in May 2025. In January 2026 the Senate passed the DEFIANCE Act, which lets deepfake victims sue for up to $250,000. The NO FAKES Act covers voice and image cloning, while H.R. 5272 and a Senate companion from Sen. Klobuchar target election deepfakes. This is the AI track that has actually moved.

The bigger fight is who gets to set AI rules at all. After the White House issued a December 11, 2025 executive order pushing federal AI standards, House and Senate GUARDRAILS Acts plus a States' Right to Regulate AI Act would void it outright. On the other side, Rep. Baumgartner's American AI Leadership and Uniformity Act would write federal preemption into statute, and Sen. Cruz's SANDBOX Act would create a federal regulatory sandbox for AI experimentation.

The algorithmic-oversight bloc is led by Rep. Yvette Clarke. Her AI Civil Rights Act and Algorithmic Accountability Act would force federal audits, bias tests, and FTC reporting for AI used in hiring, housing, credit, health care, and policing. Sen. Wyden has the Senate version. Rep. Beyer's AI Foundation Model Transparency Act targets the frontier labs directly. Three national-security bills build a China-export firewall. Two new 2026 bills take direct aim at kids' exposure to AI chatbots. At the edges of the debate, Sen. Sanders's AI Data Center Moratorium Act wants a pause on new data-center construction while Congress studies the energy, water, and climate toll of the AI build-out.

Featured speech

Sen. Bernie Sanders proposes moratorium on AI data center construction

Sen. Bernie Sanders·Mar 25, 2026·S. 4214

Sanders makes the case for a federal pause on AI data-center construction. His is currently the loudest "step on the brakes" voice in the Senate on the energy, water, and climate toll of the AI build-out.

Take action

Send a letter to your senators about AI regulation

Congress is actively debating these bills. Tell your representatives where you stand — we'll draft the letter for you.

Start your advocacy campaign

The bills

What Congress is working on

Recent activity: H.R. 8283 moved · H.R. 8382 moved · H.R. 5511 +1 cosponsor (Ilhan Omar) · H.R. 7997 +1 cosponsor (Derek Schmidt) · H.R. 8283 +2 cosponsors (John Moolenaar, Michael Lawler) · H.R. 6996 +1 cosponsor (Bill Huizenga) · H.R. 6529 +1 cosponsor (Judy Chu)
Signed into Law · 1 bill
  • S. 146
    TAKE IT DOWN Act

    The first AI-era federal law. Criminalizes posting nonconsensual intimate images, including AI deepfakes, and forces platforms to take them down within 48 hours. Signed into law May 2025.

    21 cosponsors · Last action May 19, 2025
Active in Congress · 1 bill
In Committee · 29 bills
Recently Introduced · 1 bill
  • S. 1837
    DEFIANCE Act of 2025

    Gives deepfake victims a federal civil lawsuit path worth up to $250,000 in damages, plus court-ordered takedown. Senate-passed January 2026.

    8 cosponsors · Last action Jan 13, 2026
On the Record

What Congress Is Saying

I rise today to ask the Senate to pass the DEFIANCE Act — bipartisan legislation that gives victims of nonconsensual, sexually explicit deepfakes the tools to fight back against those who would exploit them. With the push of a button, generative AI can swap someone's face onto another person's body, remove that person's clothing so they appear nude, or undress someone to show them in lingerie or other exposed positions. That is why this legislation is critical, because this legislation says that if they are guilty of such reckless misconduct, they can be sued for it and held civilly liable for the damages.
Richard Durbin (D-IL) · on DEFIANCE Act of 2025
Mr. Speaker, today, I rise to speak for consideration of H.R. 3679, the Small Business Artificial Intelligence Advancement Act. America has seen great innovation in its 250 years, from the lightbulb to the internet and now artificial intelligence. The benefits of AI must be made available for more than big corporations. They must reach every facet of our economy. Small businesses are the backbone of the United States economy, representing nearly 43 percent of the U.S. GDP and employing 46 percent of the workforce. These are our neighbors, our friends, and our families.
Mr. Speaker, due to unforeseen circumstances, I was unable to cast my vote for S. 146, TAKE IT DOWN ACT. Had I been present, I would have voted YEA on Roll Call No. 104.
Where Congress stands

Pro-industry vs AI-guardrails voices

Ranked by bill-level activity. Pro-industry bills push federal preemption, export enablement, and regulatory sandboxes (H.R. 5388 AI Leadership, SANDBOX Act, GAIN AI, Full AI Stack Export, Small Business AI). Guardrails bills push victim protection, civil-rights oversight, algorithmic accountability, kids' safety, data-center transparency, and state rule-making.

Activity weights: sponsor = 10 points, cosponsor = 3 points. A legislator appears in the cluster where their score is highest. The number next to each name shows how many pillar bills in that camp they are on.
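In code, the ranking rule above amounts to a small scoring pass; a minimal sketch with made-up bills and names (the real pillar lists and rosters differ):

```python
# Sketch of the clustering rule: sponsor = 10 points, cosponsor = 3 points;
# a legislator lands in the camp where their total score is highest.
SPONSOR_POINTS, COSPONSOR_POINTS = 10, 3

def camp_scores(legislator, camps):
    """Return {camp: (score, bill_count)} for one legislator."""
    scores = {}
    for camp, bills in camps.items():
        score = count = 0
        for bill in bills:
            if legislator == bill["sponsor"]:
                score += SPONSOR_POINTS
                count += 1
            elif legislator in bill["cosponsors"]:
                score += COSPONSOR_POINTS
                count += 1
        scores[camp] = (score, count)
    return scores

def assign(legislator, camps):
    """Place the legislator in the highest-scoring camp; return (camp, bill count)."""
    scores = camp_scores(legislator, camps)
    camp = max(scores, key=lambda c: scores[c][0])
    return camp, scores[camp][1]

# Hypothetical bill data for illustration only:
camps = {
    "pro-industry": [{"sponsor": "Cruz", "cosponsors": ["Baumgartner"]}],
    "guardrails": [{"sponsor": "Clarke", "cosponsors": ["Wyden", "Beyer"]},
                   {"sponsor": "Sanders", "cosponsors": ["Beyer"]}],
}
print(assign("Beyer", camps))  # ('guardrails', 2)
```

The second element returned by `assign` is the per-camp bill count displayed next to each name.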

AI Regulation Alerts

Get notified when AI regulation moves in Congress

New bills, floor votes, cosponsor shifts, regulatory actions — straight to your inbox.

AI Regulation alerts + Legisletter's monthly briefing. Unsubscribe anytime.

Why the TAKE IT DOWN Act became the first AI-era federal law

For most of the last decade, AI regulation lived on the edges of federal policy: agency guidance, a state-by-state patchwork, and executive-order volleys rather than statutes. That changed in May 2025, when the TAKE IT DOWN Act was signed into law. The bill cleared Congress with bipartisan supermajorities because it bundled a concrete harm (nonconsensual intimate imagery, including AI deepfakes) with a concrete fix (federal criminal penalties plus a 48-hour platform takedown rule enforced by the FTC). That coupling of clear victim and clear remedy is why the first AI-era federal law happened on NCII rather than bias or preemption. The DEFIANCE Act, which the Senate passed in January 2026, builds on the same formula by giving victims a private right of action with up to $250,000 in civil damages. The NO FAKES Act would extend that same logic to voice and likeness cloning, and H.R. 5272 and S. 1213 would reach political deepfakes aimed at federal candidates.

Federal vs state AI rules: the December 2025 executive order fight

On December 11, 2025, the White House issued an executive order pushing federal AI standards that override state AI rules. Within three months, three bills had moved to kill it: the House GUARDRAILS Act with 34 cosponsors led by Rep. Beyer, the Senate GUARDRAILS Act from Sen. Schatz, and the States' Right to Regulate AI Act from Sen. Markey. Each bill cancels the order and blocks federal dollars from being used to carry it out. On the other side, Rep. Baumgartner's American AI Leadership and Uniformity Act would write federal preemption into statute, and Sen. Cruz's SANDBOX Act would create federal regulatory exemptions for AI firms testing new products. The fight is not about whether AI gets regulated; it is about who regulates it.

The data-center energy backlash: a new coalition emerges

The AI build-out is quietly becoming an energy story. Three bills in this pillar attack it from different angles. Sen. Sanders's AI Data Center Moratorium Act would pause new construction while Congress studies the footprint. Sen. Durbin's Data Center Water and Energy Transparency Act would force disclosure without the pause. Rep. Landsman's Protecting Families from AI Data Center Energy Costs Act, the populist pitch with 15 cosponsors, would make utilities show how much data-center power demand is raising household bills. Together they signal a new coalition: climate, labor, and consumer advocates converging on AI's physical infrastructure as the policy target, not the algorithms.
Common questions

Questions about AI regulation

Is AI regulated by federal law in the United States?

Only narrowly. The TAKE IT DOWN Act, signed into law in May 2025, is the first AI-era federal statute. It bans nonconsensual intimate imagery, including AI-made deepfakes, and requires platform takedowns within 48 hours. Beyond that, AI is governed by a patchwork of executive orders, state laws, and agency guidance. The Algorithmic Accountability Act, AI Civil Rights Act, and NO FAKES Act would extend federal reach into bias testing, civil-rights enforcement, and voice-likeness protection, but none have cleared Congress.

What is the Trump AI executive order and what would Congress do about it?

On December 11, 2025, the White House issued an executive order setting federal AI standards that override state AI rules. Three bills in this pillar would void it outright: the House GUARDRAILS Act, Senate GUARDRAILS Act, and States' Right to Regulate AI Act. Each would cancel the order and bar any federal money from being used to carry it out. Beyer's House version has the most momentum with 34 cosponsors. On the pro-preemption side, Rep. Baumgartner's American AI Leadership and Uniformity Act would write federal preemption into statute.

Are AI deepfakes illegal?

Sexual deepfakes are now a federal crime under the TAKE IT DOWN Act. AI-generated political deepfakes are not yet illegal at the federal level. The Protect Elections from Deceptive AI Act and its Senate companion would change that for federal candidates. Voice and image cloning without consent is targeted by the NO FAKES Act, which creates a federal right to sue. The DEFIANCE Act, which the Senate passed in January 2026, gives deepfake victims up to $250,000 in civil damages.

Can companies be sued for AI bias?

Not under a dedicated federal AI law, but three bills in this pillar would make it much easier. The AI Civil Rights Act treats algorithmic discrimination in hiring, housing, credit, health care, and policing as a civil-rights violation. The Algorithmic Accountability Act and its Senate version from Sen. Wyden require large firms to test AI systems for harm and file bias-audit reports with the FTC. Clarke's two House bills together have more than 50 cosponsors.

What AI bills target China and chip exports?

Five bills in this pillar handle the China and export front. The Foreign Adversary AI Risk Assessment Act would require a federal review of frontier AI models accessed by Chinese, Russian, or Iranian entities. The China Advanced Technology Monitoring Act mandates annual reports on China's semiconductor and AI-chip manufacturing. The Deterring American AI Model Theft Act criminalizes exfiltration of US-developed AI model weights. On the export-enabling side, the GAIN AI Act gives US buyers a first chance to purchase advanced AI chips before export, and the Full AI Stack Export Promotion Act frames the broader US AI stack as an export priority.

Are there federal bills on kids and AI chatbots?

Yes. Two new bills in this pillar take direct aim. H.R. 8382 would ban manufacturing or selling children's products that include an AI chatbot component, the first federal product-level restriction on kids' AI. The Youth AI Privacy Act is the Senate companion, establishing a federal privacy floor for minors using AI systems. Both are new 2026 introductions, arriving amid state-level scrutiny of AI companion apps marketed to teens.

What bills address AI's energy and data-center footprint?

Three bills attack it from different angles. Sen. Sanders's AI Data Center Moratorium Act would pause new construction while Congress studies the footprint. Sen. Durbin's Data Center Water and Energy Transparency Act would force disclosure without the pause. Rep. Landsman's Protecting Families from AI Data Center Energy Costs Act, the populist pitch with 15 cosponsors, would make utilities show how much data-center power demand is raising household bills.

What protections exist for AI whistleblowers?

None yet at the federal level. The AI Whistleblower Protection Act, led by Sen. Chuck Grassley (R-IA), would let workers and contractors at AI companies report safety risks or legal violations to the government, Congress, or their employers without being fired or retaliated against. Its Republican lead sponsorship is unusual in a pillar where most oversight bills are Democrat-led.

Written by
Legisletter Editors

Legisletter is a grassroots advocacy platform tracking federal policy — and the impact it lands on everyday Americans.

Data sources: congress.gov · govinfo.gov · lda.gov · sec.gov