S. 1396: Content Origin Protection and Integrity from Edited and Deepfaked Media Act of 2025
Sponsor
Maria Cantwell
Democrat · WA
Bill Progress
Latest Action · Apr 9, 2025
Read twice and referred to the Committee on Commerce, Science, and Transportation.
Deepfake bill targets big platforms, AI tools
Why it matters
As AI-generated images, audio, and video spread quickly online, this bill would require a federal public education campaign within 1 year of enactment and set rules for content provenance tools and platform conduct within 2 years.
S. 1396, introduced on April 9, 2025, would build a traceability system for digital content at a time when deepfakes and AI-edited media are becoming harder to spot. The bill defines a “deepfake” as synthetic or synthetically-modified content that appears authentic to a reasonable person and creates a false understanding or impression. It also defines “content provenance information” as state-of-the-art, machine-readable information documenting the origin and history of digital content such as images, video, audio, or text.
The bill leans on the Commerce Department, especially the Under Secretary of Commerce for Standards and Technology, to stand up a public-private partnership and coordinate standards work with the Register of Copyrights and the Director of the USPTO. It also requires a public education campaign not later than 1 year after enactment covering synthetic content, deepfakes, watermarking, and content provenance. In parallel, grand challenges and prizes would be developed with DARPA and the National Science Foundation, signaling that the bill is not just about penalties but also about building workable technical systems.
The core compliance rules kick in 2 years after enactment. Companies that make commercial tools for creating synthetic or synthetically-modified content, or covered content, would have to let users include content provenance information and use reasonable security measures so that information stays machine-readable and is not easily removed or altered. The bill also makes it unlawful to knowingly remove, alter, tamper with, or disable provenance data in furtherance of an unfair or deceptive act, and it specifically bars covered platforms from removing or disabling that data except for security research purposes.
The reach of the platform rules is aimed at large services, not every website. A “covered platform” is a site or app available in the United States that has at least $50,000,000 in annual revenue or at least 25,000,000 monthly active users for at least 3 of the 12 months before a violation. The bill also goes after commercial AI training and generation practices: using covered content for a commercial purpose to train AI or generate synthetic content would be unlawful without express, informed consent and compliance with the copyright owner's terms and compensation. Enforcement would come from the FTC, state attorneys general, and private lawsuits by owners of covered content, who would get 4 years from discovery of a violation to sue for injunctive relief, compensatory damages, and reasonable attorney's fees.
What does S. 1396 do?
Public education campaign due within 1 year
The Under Secretary of Commerce for Standards and Technology must launch a public education campaign not later than 1 year after enactment to explain synthetic content, deepfakes, watermarking, and content provenance information.
New provenance compliance rules start after 2 years
Starting 2 years after enactment, commercial tool providers that create synthetic or synthetically-modified content must give users the ability to include content provenance information and must use reasonable security measures so the information remains machine-readable and is not easily removed or altered.
Large platforms covered at $50 million or 25 million users
The bill applies platform-specific rules to any website or app available in the United States that has at least $50,000,000 in annual revenue or at least 25,000,000 monthly active users for at least 3 of the 12 months preceding a violation, including social networks, video-sharing sites, search engines, and content aggregators.
Tampering with provenance data becomes unlawful
It would be unlawful to knowingly remove, alter, tamper with, or disable content provenance information in furtherance of an unfair or deceptive act, and covered platforms would also be barred from removing or disabling provenance information except for security research purposes.
Commercial AI training needs express informed consent
For any commercial purpose, a person could not use covered content with provenance information attached, or content known to have had that information removed, to train artificial intelligence or generate synthetic content without express, informed consent and compliance with the copyright owner's terms and compensation.
Creators get private lawsuits with 4-year filing window
Owners of covered content may sue over violations of section 6(b) or 6(c) and seek declaratory or injunctive relief, compensatory damages, and reasonable litigation expenses and attorney's fees, with a statute of limitations of 4 years from when they discovered or should have discovered the violation.
Who benefits from S. 1396?
Copyright owners and creators
They gain a private right of action with a 4-year statute of limitations from discovery, plus access to compensatory damages, injunctions, and reasonable attorney's fees if provenance information is unlawfully tampered with or their covered content is used commercially for AI training without express, informed consent.
Everyday internet users
They benefit from a required public education campaign within 1 year after enactment and from stronger use of machine-readable provenance information designed to show the origin and history of digital content such as images, audio, video, and text.
Security researchers and technical auditors
The bill explicitly recognizes artificial intelligence red-teaming and blue-teaming and preserves an exception allowing covered platforms to remove or disable provenance information for security research purposes.
Developers building provenance and watermarking tools
They would likely see new demand because the Under Secretary of Commerce for Standards and Technology must create a public-private partnership and coordinate standards work with the Register of Copyrights, the USPTO Director, DARPA, and the National Science Foundation.
Who is affected by S. 1396?
Major social media, search, and video platforms
Platforms with at least $50,000,000 in annual revenue or at least 25,000,000 monthly active users for at least 3 of the previous 12 months would face new limits on removing or disabling content provenance information and possible FTC or state enforcement.
Commercial AI tool providers
Companies offering tools to create synthetic or synthetically-modified content for commercial purposes would have 2 years after enactment to add user-facing provenance functions and implement security measures that keep provenance data machine-readable and hard to strip out.
Businesses training AI on third-party content
Anyone using covered content for a commercial purpose to train AI or generate synthetic content would need express, informed consent and would also have to comply with the copyright owner's terms and compensation requirements.
State attorneys general and the FTC
The FTC would treat violations as unfair or deceptive acts under section 18(a)(1)(B) of the Federal Trade Commission Act, while state attorneys general could bring civil actions as parens patriae after notifying the FTC, unless the FTC already has an action pending against the same defendant for the same violation.
What Congress Is Saying
S. 1396 hasn't been debated on the floor yet.
This section updates when a legislator speaks about it on the floor or in committee.
S. 1396 Legislative Journey
Committee Action
Apr 9, 2025
Read twice and referred to the Committee on Commerce, Science, and Transportation.
About the Sponsor
Maria Cantwell
Democrat, WA · 33 years in Congress
Committees: Commerce, Science, and Transportation, Finance, Indian Affairs
Cosponsors (2)
This bill has 2 cosponsors: 1 Democrat, 1 Republican, reflecting bipartisan support. Cosponsors represent 2 states: New Mexico, Tennessee.
Committee Sponsors
Commerce, Science, and Transportation Committee
1 of 28 committee members cosponsored
13 Democrats on this committee haven't cosponsored yet.
S. 1396 Quick Facts
- Committee
- Commerce, Science, and Transportation
- Chamber
- Senate
- Policy
- Science, Technology, Communications
- Introduced
- Apr 9, 2025
Read twice and referred to the Committee on Commerce, Science, and Transportation.
Apr 9, 2025
Who is lobbying on S. 1396?
1 organization lobbying on this bill
AT&T Services Inc. and its Affiliates: 4
Showing 1-1 of 1 organizations
S. 1396 Bill Text
“To require transparency with respect to content and content provenance information, to protect artistic content, and for other purposes. Be it enacted by the Senate and House of Representatives of the United States of America in Congress assembled, SECTION 1. SHORT TITLE; TABLE OF CONTENTS. This Act may be cited as the ‘Content Origin Protection and Integrity from Edited and Deepfaked Media Act of 2025’.”
Source: U.S. Government Publishing Office
Science, Technology, Communications Bills
9 related bills we're tracking
Kids Online Safety Act
Read twice and referred to the Committee on Commerce, Science, and Transportation. (Sponsor introductory remarks on measure: CR S2929-2930)
May 14, 2025
GUARDRAILS Act
Referred to the Committee on Energy and Commerce, and in addition to the Committee on the Judiciary, for a period to be subsequently determined by the Speaker, in each case for consideration of such provisions as fall within the jurisdiction of the committee concerned.
Mar 20, 2026
Artificial Intelligence Civil Rights Act of 2025
Referred to the Committee on Energy and Commerce, and in addition to the Committee on Oversight and Government Reform, for a period to be subsequently determined by the Speaker, in each case for consideration of such provisions as fall within the jurisdiction of the committee concerned.
Dec 2, 2025
TAKE IT DOWN Act
Became Public Law No: 119-12.
May 19, 2025
States' Right to Regulate AI Act
Read twice and referred to the Committee on Commerce, Science, and Transportation.
Dec 17, 2025
ACERO Act
Received in the Senate and Read twice and referred to the Committee on Commerce, Science, and Transportation.
Feb 24, 2026
ASCEND Act
Received in the Senate. Read twice. Placed on Senate Legislative Calendar under General Orders. Calendar No. 344.
Feb 24, 2026
GUARDRAILS Act
Read twice and referred to the Committee on Commerce, Science, and Transportation.
Mar 26, 2026
Small Business Artificial Intelligence Advancement Act
Received in the Senate and Read twice and referred to the Committee on Commerce, Science, and Transportation.
Feb 24, 2026