Relating to criminal and civil liability related to sexually explicit media and artificial intimate visual material; creating a criminal offense; increasing a criminal penalty.
Urgency: Critical (immediate action required)
Cost Impact: High
Effective: 2025-09-01
Enforcing Agencies
Texas Attorney General (Injunctive relief/DTPA enforcement) • Local District Attorneys (Criminal prosecution) • Civil Courts (Private right of action)
Compliance Analysis
Key implementation requirements and action items for compliance with this legislation
Implementation Timeline
Effective Date: September 1, 2025
Compliance Deadline: September 1, 2025 (systems must be fully operational on this date; a removal request received on Day 1 triggers the 72-hour clock, illustrated in the sketch after this timeline).
Agency Rulemaking: No specific agency rulemaking is mandated; the statute is self-executing through the courts and the Attorney General. This means judicial precedent, not agency guidance, will define enforcement standards.
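For illustration only, here is a minimal sketch of how the 72-hour window runs in calendar hours: weekends and holidays are not excluded, so a request received Friday evening must be resolved by Monday evening. The function and field names below are hypothetical and are not drawn from the statute.

```python
from datetime import datetime, timedelta, timezone

# Illustrative assumption: the 72-hour window runs in continuous calendar
# hours from receipt of the request, with no pause for weekends or holidays.
REMOVAL_WINDOW = timedelta(hours=72)

def removal_deadline(request_received_at: datetime) -> datetime:
    """Return the moment the 72-hour removal window closes."""
    return request_received_at + REMOVAL_WINDOW

# A request logged Friday, September 5, 2025 at 6:00 PM UTC must be
# resolved by Monday, September 8 at 6:00 PM UTC.
received = datetime(2025, 9, 5, 18, 0, tzinfo=timezone.utc)
print(removal_deadline(received))  # 2025-09-08 18:00:00+00:00
```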
Immediate Action Plan
Audit User Content Capabilities: Identify every touchpoint where a user can upload an image or video to your platform.
Build the "Takedown" Workflow: Establish a dedicated channel for these requests that bypasses standard support queues so the 72-hour deadline is met (a minimal workflow sketch follows this list).
Update Terms of Service: Insert specific prohibitions regarding Artificial Intimate Visual Material immediately.
Review Insurance Policies: Verify with your broker that your Cyber/Media Liability policy covers "failure to remove content" and does not exclude these claims as "intentional acts."
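The sketch below illustrates one way the dedicated takedown channel referenced above could be structured. It is a simplified, in-memory example with hypothetical field names, not a prescribed implementation; a production system would persist requests and plug into existing moderation tooling.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

REMOVAL_WINDOW = timedelta(hours=72)

@dataclass
class RemovalRequest:
    """A report of artificial intimate visual material (hypothetical schema)."""
    request_id: str
    content_url: str
    received_at: datetime
    removed_at: Optional[datetime] = None

    @property
    def deadline(self) -> datetime:
        return self.received_at + REMOVAL_WINDOW

    @property
    def overdue(self) -> bool:
        return self.removed_at is None and datetime.now(timezone.utc) > self.deadline

class TakedownQueue:
    """Dedicated queue that bypasses the general support backlog."""

    def __init__(self) -> None:
        self._requests: dict[str, RemovalRequest] = {}

    def intake(self, request: RemovalRequest) -> None:
        self._requests[request.request_id] = request

    def mark_removed(self, request_id: str) -> None:
        self._requests[request_id].removed_at = datetime.now(timezone.utc)

    def at_risk(self, warning: timedelta = timedelta(hours=12)) -> list[RemovalRequest]:
        """Open requests within `warning` of the 72-hour deadline, for escalation."""
        now = datetime.now(timezone.utc)
        return [r for r in self._requests.values()
                if r.removed_at is None and r.deadline - now <= warning]
```

Running at_risk() on a schedule (for example, hourly) gives on-call moderators a list of requests approaching the statutory deadline.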
Operational Changes Required
Contracts
Terms of Service (ToS): AI developers must explicitly prohibit the creation of deepfake media in user agreements *prior* to granting access in order to qualify for affirmative defenses.
Vendor SLAs: Contracts with third-party content moderators must be amended to guarantee review and removal actions within 48 hours to ensure the business meets the statutory 72-hour deadline.
Model Releases: Standard releases are insufficient for digital replicas. Use a new "Plain Language" consent agreement that specifically describes the deepfake media and its intended use.
Hiring/Training
Moderation Staffing: Operational hours will likely need to expand to include weekends or on-call rotations. The 72-hour removal deadline does not pause for weekends or holidays.
Technical Training: Staff must be trained to distinguish "authentic" from "artificial" visual material and to use hash-matching tools to identify identical copies (see the hash-matching sketch after this list).
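Below is a minimal sketch of exact-copy identification using cryptographic hashing, assuming the reported file and the platform's library are accessible as local paths. Exact hashes only catch bit-identical copies, so many platforms add perceptual hashing (for example, pHash or PDQ) to catch re-encoded or resized variants. The statute does not mandate any particular tool; this is one way to support the "reasonable efforts" standard discussed later in this analysis.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large videos do not load fully into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def find_identical_copies(reported: Path, library: list[Path]) -> list[Path]:
    """Return every file in the platform's library that is bit-identical to the reported item."""
    target = sha256_of(reported)
    return [p for p in library if sha256_of(p) == target]
```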
Reporting & Record-Keeping
Removal Request Portal: You must deploy an easily accessible, user-facing system specifically for reporting artificial intimate material.
Compliance Logs: To defend against DTPA claims, maintain immutable logs (a tamper-evident logging sketch follows this list) documenting:
1. Exact time of request receipt.
2. Exact time of content removal.
3. Technical steps taken to identify and remove identical copies across the platform.
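As one possible approach to the "immutable logs" described above, the sketch below chains each log entry to the hash of the previous entry so that after-the-fact edits are detectable. The class and field names are hypothetical; a deployment could equally rely on write-once (WORM) storage or database audit features.

```python
import hashlib
import json
from datetime import datetime, timezone

class ComplianceLog:
    """Append-only log in which each entry commits to the hash of the previous
    entry, so later alteration of any entry breaks the chain."""

    def __init__(self) -> None:
        self.entries: list[dict] = []

    def append(self, event: str, request_id: str, details: str) -> dict:
        prev_hash = self.entries[-1]["entry_hash"] if self.entries else "0" * 64
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "event": event,          # e.g. "request_received", "content_removed"
            "request_id": request_id,
            "details": details,      # e.g. steps taken to find identical copies
            "prev_hash": prev_hash,
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["entry_hash"] = hashlib.sha256(payload).hexdigest()
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute the chain to confirm no entry has been altered."""
        prev = "0" * 64
        for entry in self.entries:
            if entry["prev_hash"] != prev:
                return False
            body = {k: v for k, v in entry.items() if k != "entry_hash"}
            payload = json.dumps(body, sort_keys=True).encode()
            if hashlib.sha256(payload).hexdigest() != entry["entry_hash"]:
                return False
            prev = entry["entry_hash"]
        return True
```

Running verify() during an internal audit, or before producing the log in litigation, demonstrates that entries were not altered after they were written.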
Fees & Costs
Operational Costs: Expect increased costs for 24/7 moderation coverage and implementation of hash-matching/content-scanning software.
Litigation Risk: Violation of the removal provision constitutes a Deceptive Trade Practice, exposing the business to treble (3x) damages and mandatory attorneys' fees.
Strategic Ambiguities & Considerations
"Reasonable Efforts": The law requires "reasonable efforts" to identify and remove identical copies of the reported material. It does not define what constitutes "reasonable." Plaintiffs will argue that failure to use the most advanced AI detection software is unreasonable.
"Reckless Facilitation": Payment processors and cloud hosts face liability if they "recklessly facilitate" the conduct. The threshold for "reckless" in a B2B infrastructure context is currently undefined and will be litigated aggressively.
"Indistinguishable": Liability hinges on material being "indistinguishable from an authentic visual depiction." This subjective standard creates a gray zone for stylized or lower-quality AI generations.
Need Help Understanding Implementation?
Our government affairs experts can walk you through this bill's specific impact on your operations.
Information presented is for general knowledge only and is provided without warranty, express or implied. Consult qualified government affairs professionals and legal counsel before making compliance decisions.
The misuse of AI to create nonconsensual intimate visual materials, often called "deepfakes," has grown exponentially, with reports indicating that 96% of deepfake videos online are pornographic, targeting women disproportionately. In 2023, cybersecurity researchers found that searches for "deepfake pornography" increased by 900% within two years. Victims face severe emotional and reputational harm, compounded by the lack of clear legal remedies.
The problem is compounded by platforms hosting nudification applications, which use AI to digitally undress photos without consent. Current laws fail to address these AI-specific harms adequately. S.B. 441 builds on prior law that criminalizes nonconsensual disclosure of intimate images but does not account for AI-generated content. This bill closes that gap by addressing the unique challenges of AI misuse in creating and distributing intimate visual materials.
S.B. 441 addresses the rise of nonconsensual artificial intimate visual material created using AI by expanding civil liability laws. It imposes penalties on individuals, websites, and payment processors involved in producing or distributing such content without consent. The bill prioritizes victim privacy through confidentiality measures and ensures a lengthy statute of limitations for pursuing justice. By modernizing existing laws, S.B. 441 aims to combat the misuse of AI in generating harmful, intimate visual materials.
As proposed, S.B. 441 amends current law relating to civil liability for the production, solicitation, disclosure, or promotion of artificial intimate visual material.
RULEMAKING AUTHORITY
This bill does not expressly grant any additional rulemaking authority to a state officer, institution, or agency.
SECTION BY SECTION ANALYSIS
SECTION 1. Amends the heading to Chapter 98B, Civil Practice and Remedies Code, to read as follows:
CHAPTER 98B. UNLAWFUL PRODUCTION, SOLICITATION, DISCLOSURE, OR PROMOTION OF INTIMATE VISUAL MATERIAL
SECTION 2. Amends Section 98B.001, Civil Practice and Remedies Code, by amending Subdivision (1) and adding Subdivisions (1-a), (3), and (4) to define "artificial intimate visual material," "nudification application," and "social media platform."
SECTION 3. Amends Chapter 98B, Civil Practice and Remedies Code, by adding Sections 98B.0021, 98B.0022, 98B.008, and 98B.009, as follows:
Sec. 98B.0021. LIABILITY FOR UNLAWFUL PRODUCTION, SOLICITATION, DISCLOSURE, OR PROMOTION OF CERTAIN ARTIFICIAL INTIMATE VISUAL MATERIAL. Provides that a defendant is liable, as provided by Chapter 98B (Unlawful Disclosure or Promotion of Intimate Visual Material), to a person depicted in artificial intimate visual material for damages arising from the production, solicitation, disclosure, or promotion of the material if:
(1) the defendant produces, solicits, discloses, or promotes the artificial intimate visual material without the effective consent of the depicted person and with the intent to harm that person;
(2) the production, solicitation, disclosure, or promotion of the artificial intimate visual material causes harm to the depicted person; and
(3) the production, solicitation, disclosure, or promotion of the artificial intimate visual material reveals the identity of the depicted person in any manner, including through any accompanying or subsequent information or material related to the artificial intimate visual material or information or material provided by a third party in response to the disclosure of the artificial intimate visual material.
Sec. 98B.0022. LIABILITY OF OWNERS OF INTERNET WEBSITES AND ARTIFICIAL INTELLIGENCE APPLICATIONS AND PAYMENT PROCESSORS. (a) Provides that a person who owns an Internet website, including a social media platform, on which artificial intimate visual material is produced or disclosed in exchange for payment or a publicly accessible nudification application from which the material is produced, and any person who processes or facilitates payment for the production or disclosure of the material through the website or application, is liable, as provided by this chapter, to a person depicted in the material for damages arising from the production or disclosure of the material if the person knows or recklessly disregards that the depicted person did not consent to the production or disclosure of the material.
(b) Provides that a person who owns an Internet website, including a social media platform, on which artificial intimate visual material is disclosed is liable, as provided by this chapter, to the person depicted in the material for damages arising from the disclosure of the material if the person depicted requests the website to remove the material and the website fails to do so within 72 hours after the request is made.
Sec. 98B.008. CONFIDENTIAL IDENTITY IN CERTAIN ACTIONS. (a) Defines "confidential identity."
(b) Requires the court, except as otherwise provided by this section, in a suit brought under this chapter, to make it known to the claimant as early as possible in the proceedings of the suit that the claimant is authorized to use a confidential identity in relation to the suit, allow a claimant to use a confidential identity in all petitions, filings, and other documents presented to the court, use the confidential identity in all of the court's proceedings and records relating to the suit, including any appellate proceedings, and maintain the records relating to the suit in a manner that protects the confidentiality of the claimant.
(c) Provides that, in a suit brought under this chapter, only certain persons are entitled to know the true identifying information about the claimant.
(d) Requires the court to order that a person entitled to know the true identifying information under Subsection (c) is prohibited from divulging that information to anyone without a written order of the court. Requires a court to hold a person who violates the order in contempt.
(e) Prohibits the Supreme Court of Texas, notwithstanding Section 22.004 (Rules of Civil Procedure), Government Code, from amending or adopting rules in conflict with this section.
(f) Provides that a claimant is not required to use a confidential identity as provided by this section.
Sec. 98B.009. STATUTE OF LIMITATIONS. Requires a person to bring suit under this chapter not later than 10 years after the later of the date on which the person depicted in the intimate visual material that is the basis for the suit reasonably discovers the intimate visual material or the person depicted in the intimate visual material that is the basis for the suit turns 18 years of age.
SECTION 4. Makes application of this Act prospective.
TO: Honorable Pete Flores, Chair, Senate Committee on Criminal Justice
FROM: Jerry McGinty, Director, Legislative Budget Board
IN RE: SB441 by Hinojosa, Juan "Chuy" (Relating to civil liability for the production, solicitation, disclosure, or promotion of artificial intimate visual material.), As Introduced
The fiscal implications of the bill cannot be determined due to the lack of case-level data.
The bill establishes a civil liability for the production, solicitation, disclosure, or promotion of artificial intimate visual material and holds liable those who produce, solicit, disclose, or promote such material without the effective consent of the depicted person and with the intent to harm that person or reveal the identity of the depicted person in any manner.
According to the Office of Court Administration, the cost cannot be determined because the bill creates new causes of action for which there is no case-level data.
Local Government Impact
The fiscal implications of the bill cannot be determined at this time.
Source Agencies: 212 Office of Court Administration, Texas Judicial Council
LBB Staff: JMc, MGol, DA, JPa
SB441 creates strict civil liability and Deceptive Trade Practices Act (DTPA) exposure for any digital platform, payment processor, or AI developer that hosts, generates, or facilitates nonconsensual artificial intimate visual material. Effective September 1, 2025, businesses allowing user-generated content must implement a 72-hour removal protocol and update Terms of Service to avoid uncapped damages and treble penalties.
Q: Who authored SB441?
A: SB441 was authored by Texas Senator Juan Hinojosa during the Regular Session.
Q: When was SB441 signed into law?
A: SB441 was signed into law by Governor Greg Abbott on June 20, 2025.
Q: Which agencies enforce SB441?
A: SB441 is enforced by the Texas Attorney General (injunctive relief/DTPA enforcement), local district attorneys (criminal prosecution), and civil courts (private right of action).
Q: How urgent is compliance with SB441?
A: The compliance urgency for SB441 is rated as "critical." Businesses and organizations should review the requirements and timeline to ensure timely compliance.
Q: What is the cost impact of SB441?
A: The cost impact of SB441 is estimated as "high." This may vary based on industry and implementation requirements.
Q: What topics does SB441 address?
A: SB441 addresses topics including business & commerce, business & commerce--trade practices, civil remedies & liabilities, courts, and courts--general.
Legislative data provided by LegiScan. Last updated: November 25, 2025.
Need Strategic Guidance on This Bill?
Need help with Government Relations, Lobbying, or compliance? JD Key Consulting has the expertise you're looking for.