H.R. 6356: Artificial Intelligence Civil Rights Act of 2025
Sponsor
Yvette Clarke
Democrat · NY-9
Bill Progress
Latest Action · Dec 2, 2025
Referred to Energy and Commerce, and in addition to the Committee on Oversight and Government Reform, for a period to be subsequently determined by the Speaker, in each case for consideration of such provisions as fall within the jurisdiction of the committee concerned.
Why it matters
As AI tools increasingly shape hiring, housing, credit, health care, and policing decisions, H.R. 6356 would create federal rules, audits, disclosures, and steep penalties aimed at stopping discrimination before it happens.
H.R. 6356, the Artificial Intelligence Civil Rights Act of 2025, would set up a broad federal civil rights framework for AI and other automated decision systems. It applies when a "covered algorithm" is used for a "consequential action" — meaning decisions with a material effect on employment, education, housing, utilities, health care, credit or banking, insurance, criminal justice or law enforcement, legal determinations, elections, government benefits, public accommodations, and other services the Federal Trade Commission decides are significant. The bill bars developers and deployers from using these systems in ways that cause disparate impact or otherwise discriminate based on protected characteristics such as race, color, ethnicity, national origin or immigration status, religion, sex including sexual orientation and gender identity, disability, limited English proficiency, biometric information, familial or marital status, source of income, income level, age, veteran status, and genetic information or medical conditions.
The bill's core approach is prevention plus paper trail. Before deployment, a company has to evaluate whether harm is plausible; if it is, the company must hire an independent auditor for a full evaluation. Deployers must then run annual impact assessments, and if harm is identified, they must again engage an independent auditor. Those evaluations and assessments must be sent to the FTC within 30 days of completion, and records must be kept for at least 10 years. The FTC would also have to write rules within 2 years of enactment covering evaluation factors and summary requirements, as well as regulations creating opt-out rights to a human alternative and appeal mechanisms when covered algorithms are used for consequential actions.
The transparency rules are also unusually specific. Deployers would have to publicly disclose their practices, including contact information, data categories, and a statutory disclaimer explaining the limits of an audit. They also must give a short-form notice of no more than 500 words either when a person first interacts with the system or on the company's website. The FTC would have 90 days after enactment to publish a consumer rights web page, 18 months to study whether useful explanations for algorithm outputs are feasible and report back to Congress, and 180 days after the first annual report to create a public repository for evaluations and assessments. The bill also defines "covered language" as the 10 languages with the most speakers in the United States, based on U.S. Census Bureau data, signaling that language access is part of the compliance picture.
What gives the bill bite is enforcement. Violations would be treated as unfair or deceptive acts under the FTC Act. State attorneys general and data protection authorities could sue for civil penalties of $15,000 per violation or 4% of the company's average gross annual revenue over the previous 3 years, whichever is greater. Individuals could sue too, and recover treble damages or $15,000 per violation, whichever is greater, plus nominal damages, punitive damages, and attorney's fees. The bill also voids pre-dispute arbitration agreements and joint-action waivers for disputes under the Act, making it easier for people to get into court. On top of that, the Office of Personnel Management would have 270 days to create a new federal occupational series for algorithm auditing, and the FTC could hire up to 500 additional personnel to enforce the law.
What does H.R. 6356 do?
Bans AI discrimination in high-stakes decisions
The bill prohibits developers and deployers from using covered algorithms to cause disparate impact or otherwise discriminate in consequential actions, including employment, education, housing, utilities, health care, credit or banking, insurance, criminal justice or law enforcement, legal determinations, elections, government benefits, and public accommodations. Enforcement would run through the Federal Trade Commission.
Requires pre-launch reviews and annual audits
Before deploying a covered algorithm, a company must assess whether harm is plausible, and if it is, hire an independent auditor for a full evaluation. Deployers must also conduct annual impact assessments, submit completed evaluations to the FTC within 30 days, and retain records for at least 10 years.
FTC must write rules within 2 years
The Federal Trade Commission must issue rules within 2 years of enactment on evaluation factors and summary requirements, and within the same 2-year timeframe must also create regulations for opt-out rights to a human alternative and appeals for consequential actions.
Short notices capped at 500 words
Deployers must provide a concise short-form notice of no more than 500 words either at a person's first interaction with the system or on the deployer's website. Public disclosures must also include contact information, data categories, and a statutory disclaimer about the limitations of an audit.
Creates steep penalties and private lawsuits
State attorneys general and data protection authorities could seek civil penalties of $15,000 per violation or 4% of a company's average gross annual revenue over the preceding 3 years, whichever is greater. Individuals could sue for treble damages or $15,000 per violation, whichever is greater, plus nominal damages, punitive damages, and attorney's fees.
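As an illustration only, the "whichever is greater" formula for state enforcement can be sketched in a few lines. The function name, inputs, and the assumption that the per-violation amount is totaled before comparison are hypothetical, not language from the bill:

```python
# Illustrative sketch of the bill's state-enforcement penalty formula:
# the greater of $15,000 per violation or 4% of average gross annual
# revenue over the preceding 3 years. Names and inputs are hypothetical.

def state_penalty(violations: int, revenues_last_3_years: list[float]) -> float:
    per_violation_total = 15_000 * violations
    avg_revenue = sum(revenues_last_3_years) / len(revenues_last_3_years)
    return max(per_violation_total, 0.04 * avg_revenue)

# Example: 100 violations against a company averaging $50M in revenue.
# 4% of $50M ($2,000,000) exceeds 100 x $15,000 ($1,500,000).
print(state_penalty(100, [40e6, 50e6, 60e6]))  # → 2000000.0
```

For a small company or a small number of violations, the per-violation figure would usually control instead; the comparison simply picks whichever total is larger.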
Builds federal AI audit capacity fast
The Director of the Office of Personnel Management must establish a new occupational series for algorithm auditing within 270 days of enactment. The bill also authorizes the FTC to hire up to 500 additional personnel.
Who benefits from H.R. 6356?
Workers and job applicants
People screened by hiring, firing, scheduling, or promotion algorithms would gain protections when those tools affect employment, one of the bill's listed consequential actions. They would also benefit from required annual assessments, possible human alternatives, and appeal rights that the FTC must establish through rulemaking within 2 years.
Renters, borrowers, and insurance customers
Anyone subject to algorithmic decisions in housing, credit or banking, and insurance would get a legal path to challenge harmful systems. If a violation occurs, individuals could seek treble damages or $15,000 per violation, whichever is greater, along with attorney's fees.
People in protected classes
The bill explicitly protects people from AI-driven disparate impact based on race, color, ethnicity, national origin or immigration status, religion, sex including sexual orientation and gender identity, disability, limited English proficiency, biometric information, source of income, income level, age, veteran status, and genetic information or medical conditions.
Consumers with limited English access
The bill defines covered language as the 10 languages with the most speakers in the United States, based on U.S. Census Bureau data. That signals that disclosure and compliance systems must account for language access, not just English-only users.
Who is affected by H.R. 6356?
AI developers
Any person who designs, codes, customizes, produces, or substantially modifies an algorithm intended for consequential use would have to support deployer compliance, mitigate identified harms, and certify that the system is not likely to result in harm or deceptive practices.
Businesses deploying automated decision tools
Companies using covered algorithms for commercial acts would have to run pre-deployment evaluations, conduct annual impact assessments, submit reports to the FTC within 30 days of completion, keep records for at least 10 years, and provide public notices capped at 500 words.
State attorneys general and data protection authorities
State enforcers would gain authority to bring civil actions under the bill and seek penalties of $15,000 per violation or 4% of average gross annual revenue over the preceding 3 years, whichever is greater.
Federal agencies, especially FTC and OPM
The FTC would take on new rulemaking, disclosure, repository, and enforcement duties, including a consumer rights web page due within 90 days of enactment and a report to Congress within 18 months on explainability. OPM would have 270 days to create a new algorithm auditing job series.
What Congress Is Saying
H.R. 6356 hasn't been debated on the floor yet.
This section updates when a legislator speaks about it on the floor or in committee.
H.R. 6356 Legislative Journey
House: Committee Action
Dec 2, 2025
Referred to the Committee on Energy and Commerce, and in addition to the Committee on Oversight and Government Reform, for a period to be subsequently determined by the Speaker, in each case for consideration of such provisions as fall within the jurisdiction of the committee concerned.
About the Sponsor
Yvette Clarke
Democrat, New York's 9th congressional district · 19 years in Congress
Committees: Energy and Commerce
View full profile →
Cosponsors (26)
All 26 cosponsors are Democrats. Cosponsors represent 19 states and jurisdictions: California, Connecticut, the District of Columbia, and 16 more.
Summer Lee
Democrat · PA
Ayanna Pressley
Democrat · MA
Pramila Jayapal
Democrat · WA
Wesley Bell
Democrat · MO
André Carson
Democrat · IN
Judy Chu
Democrat · CA
Danny Davis
Democrat · IL
Christopher Deluzio
Democrat · PA
Jonathan Jackson
Democrat · IL
Robin Kelly
Democrat · IL
James McGovern
Democrat · MA
Eleanor Norton
Democrat · DC
Committee Sponsors
Oversight and Government Reform Committee
7 of 46 committee members cosponsored
Energy and Commerce Committee
2 of 54 committee members cosponsored
36 Democrats across these committees haven't cosponsored yet. Mobilize their constituents.
H.R. 6356 Quick Facts
- Committee: Oversight and Government Reform
- Chamber: House
- Policy: Science, Technology, Communications
- Introduced: Dec 2, 2025
Referred to Energy and Commerce, and in addition to the Committee on Oversight and Government Reform, for a period to be subsequently determined by the Speaker, in each case for consideration of such provisions as fall within the jurisdiction of the committee concerned.
Dec 2, 2025
H.R. 6356 Common Questions
How much is the penalty for AI discrimination under HR 6356?
Under the Artificial Intelligence Civil Rights Act of 2025, states can seek $15,000 per violation or 4% of a company’s average gross annual revenue from the prior 3 years, whichever is greater (Section 402).
Can individuals sue companies for discriminatory AI decisions under HR 6356?
Yes. Under the Artificial Intelligence Civil Rights Act of 2025, individuals can sue for treble damages or $15,000 per violation, whichever is greater, plus other damages and fees (Section 403).
Does the AI Civil Rights Act require annual audits of hiring or credit algorithms?
Yes. According to HR 6356 Section 102, deployers must conduct annual impact assessments for covered algorithms used in consequential actions, and higher-risk systems may require an independent auditor.
How long do companies have to send AI audit results to the FTC under HR 6356?
Under the Artificial Intelligence Civil Rights Act of 2025, completed evaluations, impact assessments, and annual reviews must be submitted to the FTC within 30 days after completion (Section 102).
How long must companies keep AI audit and disclosure records under HR 6356?
HR 6356 generally requires evaluations, assessments, reviews, and prior disclosure versions to be kept for at least 10 years; developer-deployer contracts also carry a 10-year retention rule (Sections 102, 202, 301).
What are the protected characteristics covered by the AI Civil Rights Act of 2025?
Under HR 6356, protected traits include race, immigration status, sex including sexual orientation and gender identity, disability, limited English proficiency, biometric information, age, veteran status, and more (Section 2).
Can people opt out of an AI decision and request a human review under HR 6356?
Potentially yes. The Artificial Intelligence Civil Rights Act of 2025 directs the FTC to issue rules on human alternatives and appeal mechanisms within 2 years of enactment (Section 203).
How long can an AI disclosure notice be under HR 6356?
Under the Artificial Intelligence Civil Rights Act of 2025, the required short-form notice can be no more than 500 words (Section 301).
Which languages count as covered language under HR 6356?
According to HR 6356 Section 2, covered language means the 10 languages with the most speakers in the United States, based on the most recent Census Bureau data.
Does HR 6356 apply in Puerto Rico, Guam, and the U.S. Virgin Islands?
Yes. Under the Artificial Intelligence Civil Rights Act of 2025, “State” includes D.C., Puerto Rico, the U.S. Virgin Islands, Guam, American Samoa, and the Northern Mariana Islands (Section 2).
Based on H.R. 6356 bill text
H.R. 6356 Bill Text
“To establish protections for individual rights with respect to computational algorithms, and for other purposes.”
Source: U.S. Government Publishing Office
Get notified when H.R. 6356 moves
Committee votes, floor action, cosponsor changes — straight to your inbox.
Bill alerts + Legisletter's monthly briefing. Unsubscribe anytime.
Science, Technology, Communications Bills
9 related bills we're tracking
GUARDRAILS Act
Referred to the Committee on Energy and Commerce, and in addition to the Committee on the Judiciary, for a period to be subsequently determined by the Speaker, in each case for consideration of such provisions as fall within the jurisdiction of the committee concerned.
Mar 20, 2026
TAKE IT DOWN Act
Became Public Law No: 119-12.
May 19, 2025
States' Right to Regulate AI Act
Read twice and referred to the Committee on Commerce, Science, and Transportation.
Dec 17, 2025
ACERO Act
Received in the Senate and Read twice and referred to the Committee on Commerce, Science, and Transportation.
Feb 24, 2026
ASCEND Act
Received in the Senate. Read twice. Placed on Senate Legislative Calendar under General Orders. Calendar No. 344.
Feb 24, 2026
GUARDRAILS Act
Read twice and referred to the Committee on Commerce, Science, and Transportation.
Mar 26, 2026
Small Business Artificial Intelligence Advancement Act
Received in the Senate and Read twice and referred to the Committee on Commerce, Science, and Transportation.
Feb 24, 2026
Research Integrity and Foreign Influence Prevention Act
Referred to the House Committee on Science, Space, and Technology.
Jun 5, 2025
DOE and NASA Interagency Research Coordination Act
Received in the Senate and Read twice and referred to the Committee on Commerce, Science, and Transportation.
Mar 25, 2025
Trending Right Now
Bills gaining momentum across Congress
Congressional Tribute to Constance Baker Motley Act of 2025
Referred to the House Committee on Financial Services.
Sep 11, 2025
Deterring American AI Model Theft Act of 2026
Referred to the House Committee on Foreign Affairs.
Apr 15, 2026
AI Foundation Model Transparency Act of 2026
Referred to the House Committee on Energy and Commerce.
Mar 26, 2026
Tracking Science, Technology, Communications in Congress? Monitor bills, track cosponsor momentum, and launch advocacy campaigns — all from one advocacy platform.