S. 1213: Protect Elections from Deceptive AI Act
Sponsor
Amy Klobuchar
Democrat · MN
Bill Progress
Latest Action · Mar 31, 2025
Read twice and referred to the Committee on Rules and Administration.
Why it matters
Introduced on March 31, 2025, the bill responds to rising fears that AI-made fake audio and video could mislead voters or donors in federal campaigns before officials can react.
S. 1213, the Protect Elections from Deceptive AI Act, would make it illegal for a person, political committee, or other entity to knowingly distribute materially deceptive AI-generated audio or visual media when carrying out a "Federal election activity" or when the media concerns a candidate for federal office. The bill is tightly focused on intent: the distribution must be for the purpose of influencing an election or soliciting funds. That means the target is not all synthetic media, but AI-made content used in federal politics to change votes or trigger donations.
The bill defines deceptive AI media in unusually specific terms. It covers an image, audio file, or video produced by artificial intelligence technology using machine learning, including deep learning models, natural language processing, or similar or more complex computational techniques. The media must either merge, combine, replace, or superimpose content to create something that appears authentic, or generate a fully inauthentic image, audio, or video that appears authentic. It also uses a reasonable-person test: if an average person would come away with a fundamentally different understanding of a candidate's appearance, speech, or expressive conduct, or would believe a person said or did something they never actually said or did, the content can qualify.
The enforcement tool is civil, not criminal. A covered individual, defined as a candidate for Federal office, can ask a court for injunctive or equitable relief to stop distribution and can also sue for general or special damages. These injunction cases get precedence under the Federal Rules of Civil Procedure, which is important because election misinformation can do most of its damage quickly. But the burden is significant: the candidate bringing the case must prove a violation by clear and convincing evidence, and courts may award reasonable attorney's fees and costs to the prevailing party.
The bill also tries to avoid sweeping in journalism and obvious comedy. Radio, television, cable, satellite, and streaming services are exempt when the content appears in a bona fide newscast, news interview, news documentary, or on-the-spot coverage of bona fide news events, so long as they clearly acknowledge authenticity questions in a way easily heard or read by the average listener or viewer. Newspapers, magazines, and periodicals of general circulation, including internet or electronic publications that routinely carry news and commentary of general interest, are also exempt if they clearly state that the media does not accurately represent the speech or conduct of the covered individual. Satire and parody are exempt too, which means the real fights will likely center on what counts as deceptive, how clear disclosures must be, and how quickly courts can act during a campaign.
What does S. 1213 do?
Ban on knowingly spreading deceptive AI media for two purposes
The bill bars any person, political committee, or other entity from knowingly distributing materially deceptive AI-generated audio or visual media when carrying out a Federal election activity or when the media concerns a candidate for Federal office, but only if the purpose is either (1) influencing an election or (2) soliciting funds.
Applies only to candidates for Federal office
The protected "covered individual" is defined specifically as "a candidate for Federal office," not state candidates, local candidates, or non-candidates. That means the bill's private right of action is limited to federal candidates under the Federal Election Campaign Act of 1971, 52 U.S.C. 30101 et seq.
Detailed AI definition includes deep learning
The bill defines deceptive AI media as an image, audio recording, or video created using machine learning, including deep learning models, natural language processing, or computational techniques of similar or greater complexity, that either alters real content by merging, combining, replacing, or superimposing it, or generates wholly inauthentic content that still appears authentic.
Reasonable-person test for fake impressions
Content qualifies if a reasonable person, considering both the qualities of the image, audio, or video and the distribution channel, would have a fundamentally different understanding of the appearance, speech, or expressive conduct shown, or would believe a person engaged in conduct they did not actually exhibit.
Candidates can seek fast court orders
A candidate for Federal office may file for injunctive or equitable relief to stop distribution, and those injunction actions get precedence under the Federal Rules of Civil Procedure. The same candidate may also seek general or special damages, but must prove the violation by clear and convincing evidence.
News and parody exemptions require clear disclosures
Radio, television, cable, satellite, and streaming services are exempt when the material appears in a bona fide newscast, news interview, news documentary, or on-the-spot coverage of bona fide news events, but only if authenticity questions are clearly acknowledged in a way easily heard or read by the average listener or viewer. Newspapers, magazines, and internet/electronic periodicals of general circulation are exempt if they routinely carry news and commentary of general interest and clearly state the media does not accurately represent the covered individual's speech or conduct; satire and parody are also exempt.
Who benefits from S. 1213?
Candidates for Federal office
They get a direct civil remedy if deceptive AI content targets them. Specifically, a federal candidate can seek injunctive or equitable relief, ask for general or special damages, and potentially recover reasonable attorney's fees and costs if they are the prevailing party.
Voters in federal elections
Voters benefit from a rule aimed at stopping AI-made fake images, audio, and video that could create a fundamentally different impression of a candidate's speech or conduct. The bill is designed to reduce manipulation tied to influencing an election.
Legitimate news organizations
News outlets get explicit carve-outs. Radio, television, cable, satellite, streaming services, newspapers, magazines, and internet/electronic periodicals of general circulation can use or discuss suspect media in news coverage if they provide the required clear acknowledgment that authenticity is in question or that the material does not accurately represent the covered individual.
Comedy and satire creators
Creators making satire or parody benefit because the bill expressly exempts media that constitutes satire or parody, reducing the risk that obvious jokes are treated the same as election deepfakes.
Who is affected by S. 1213?
Political committees and campaign operatives
Political committees are named directly in the prohibition. If they knowingly distribute materially deceptive AI-generated media during Federal election activity for the purpose of influencing an election or soliciting funds, they could face lawsuits from the targeted federal candidate.
PACs, outside groups, and other entities
The bill applies not just to campaigns but to any "person" or "other entity." That could include super PACs, advocacy groups, consultants, and vendors that push deceptive AI content about a candidate for Federal office.
Streaming platforms and broadcasters
Radio, television, cable, satellite, and streaming services are affected because they must rely on the exemption's conditions if they air disputed AI media. To stay protected, they must present it as part of a bona fide news format and clearly acknowledge authenticity questions in a way easily heard or read by the average listener or viewer.
Online publications and digital publishers
Internet and electronic periodicals of general circulation are affected because they are exempt only if they routinely carry news and commentary of general interest and clearly state that the media does not accurately represent the speech or conduct of the covered individual.
What Congress Is Saying
S. 1213 hasn't been debated on the floor yet.
This section updates when a legislator speaks about it on the floor or in committee.
S. 1213 Legislative Journey
Committee Action
Mar 31, 2025
Read twice and referred to the Committee on Rules and Administration.
About the Sponsor
Amy Klobuchar
Democrat, MN · 19 years in Congress
Committees: Agriculture, Nutrition, and Forestry; Commerce, Science, and Transportation; Joint Committee of Congress on the Library
Cosponsors (4)
This bill has 4 cosponsors: 2 Democrats, 2 Republicans, reflecting bipartisan support. Cosponsors represent 4 states: Colorado, Delaware, Maine, and 1 more.
Committee Sponsors
Rules and Administration Committee
1 of 17 committee members cosponsored
7 Democrats across this committee haven't cosponsored yet.
S. 1213 Quick Facts
- Committee
- Rules and Administration
- Chamber
- Senate
- Policy
- Government Operations and Politics
- Introduced
- Mar 31, 2025
Read twice and referred to the Committee on Rules and Administration.
Mar 31, 2025
S. 1213 Common Questions
Can federal candidates sue over AI deepfakes in campaign ads?
Yes. Under the Protect Elections from Deceptive AI Act, a candidate for Federal office can seek an injunction, equitable relief, and general or special damages for prohibited AI media distribution (Section 325(d)).
What is the burden of proof for an election deepfake lawsuit under S. 1213?
According to S. 1213 Section 325(d), the federal candidate suing must prove the violation by clear and convincing evidence.
Can courts stop election deepfakes quickly under the Protect Elections from Deceptive AI Act?
Yes. Under the Protect Elections from Deceptive AI Act, injunction actions get precedence under the Federal Rules of Civil Procedure, letting courts hear stop-distribution requests faster (Section 325(d)).
Does S. 1213 ban AI deepfakes used to solicit campaign donations?
Yes. Under S. 1213, knowingly distributing materially deceptive AI media is barred when done to influence an election or solicit funds (Section 325(b)).
Which candidates are protected by the Protect Elections from Deceptive AI Act?
Only candidates for Federal office. The bill defines the covered individual as a candidate for Federal office, not state or local candidates (Section 325(a)).
What counts as deceptive AI media under the Protect Elections from Deceptive AI Act?
Under the Protect Elections from Deceptive AI Act, it includes AI-made image, audio, or video that alters real content or creates fake content that appears authentic and would mislead a reasonable person (Section 325(a)).
Does the bill cover deepfakes made with deep learning or natural language processing?
Yes. According to S. 1213 Section 325(a), covered AI technology includes machine learning, deep learning, natural language processing, and similar or more complex techniques.
Can news channels or streaming services air election deepfakes if they disclose them?
Yes, in limited cases. Under the Protect Elections from Deceptive AI Act, broadcasters and streaming services are exempt for bona fide news coverage if authenticity questions are clearly disclosed (Section 325(c)).
Are newspapers and online news sites exempt if they label AI campaign fakes?
Yes. Under S. 1213 Section 325(c), newspapers, magazines, and internet publications of general circulation are exempt if they clearly state the media does not accurately represent the candidate's speech or conduct.
Does the Protect Elections from Deceptive AI Act exempt satire and parody?
Yes. The Protect Elections from Deceptive AI Act expressly exempts media that constitutes satire or parody (Section 325(c)).
Based on S. 1213 bill text
S. 1213 Bill Text
“To prohibit the distribution of materially deceptive AI-generated audio or visual media relating to candidates for Federal office, and for other purposes.”
Source: U.S. Government Publishing Office
Government Operations and Politics Bills
9 related bills we're tracking
Protect Our Letter Carriers Act of 2025
Referred to the Committee on the Judiciary, and in addition to the Committee on Oversight and Government Reform, for a period to be subsequently determined by the Speaker, in each case for consideration of such provisions as fall within the jurisdiction of the committee concerned.
Feb 6, 2025
Rights for the TSA Workforce Act
Referred to the Subcommittee on Transportation and Maritime Security.
Mar 11, 2025
Fair Pay for Federal Contractors Act of 2025
Referred to the Committee on Appropriations, and in addition to the Committee on Oversight and Government Reform, for a period to be subsequently determined by the Speaker, in each case for consideration of such provisions as fall within the jurisdiction of the committee concerned.
Sep 30, 2025
SAVE Act
Received in the Senate.
Apr 10, 2025
Saving the Civil Service Act
ASSUMING FIRST SPONSORSHIP - Mr. Walkinshaw asked unanimous consent that he may hereafter be considered as the first sponsor of H.R. 492, a bill originally introduced by Representative Connolly, for the purpose of adding cosponsors and requesting reprintings pursuant to clause 7 of rule XII. Agreed to without objection.
Sep 16, 2025
Deceptive Practices and Voter Intimidation Prevention Act of 2025
Referred to the House Committee on the Judiciary.
Aug 5, 2025
SHARE Act of 2025
Referred to the Committee on Education and Workforce, and in addition to the Committee on the Judiciary, for a period to be subsequently determined by the Speaker, in each case for consideration of such provisions as fall within the jurisdiction of the committee concerned.
Mar 25, 2025
End Crypto Corruption Act of 2025
Read the second time. Placed on Senate Legislative Calendar under General Orders. Calendar No. 71.
May 8, 2025
Designation of English as the Official Language of the United States Act of 2025
Referred to the Committee on Education and Workforce, and in addition to the Committee on the Judiciary, for a period to be subsequently determined by the Speaker, in each case for consideration of such provisions as fall within the jurisdiction of the committee concerned.
Mar 3, 2025
Trending Right Now
Bills gaining momentum across Congress
Congressional Tribute to Constance Baker Motley Act of 2025
Referred to the House Committee on Financial Services.
Sep 11, 2025
Deterring American AI Model Theft Act of 2026
Referred to the House Committee on Foreign Affairs.
Apr 15, 2026
GUARDRAILS Act
Read twice and referred to the Committee on Commerce, Science, and Transportation.
Mar 26, 2026