The Language-Inclusive Support and Transparency for Online Services Act of 2025, or LISTOS Act, addresses the historical under-investment by large online platforms in non-English content moderation. Congress finds that this disparity leads to inconsistent enforcement of platform policies and an increased proliferation of harmful content targeting non-English-speaking communities, underscoring the need for equitable investment across languages to promote safety and opportunity.

The bill requires operators of covered platforms (those with at least 10 million monthly active users that allow user-generated content) to ensure that processes for detecting, suppressing, and removing illegal or policy-violating content are reasonably consistent across all languages in which the platform engages in monetization practices. This duty considers factors such as staffing levels and the effectiveness of automated systems, with exceptions for end-to-end encrypted messaging services and for languages used by fewer than 100,000 users in the U.S.

To enhance transparency, covered platforms must submit annual public reports to the Federal Trade Commission (FTC) detailing their manual and algorithmic content moderation efforts. These reports must include information on content moderation staffing, broken down by location, geographic assignment, and language proficiency, along with descriptions of staff training and support services. Platforms must also disclose performance metrics and safeguards for automated content detection systems across languages. Further reporting requirements include a list of monetized languages with revenue breakdowns, the percentage of content reviewed in its original language versus via automated translation, and descriptions of translation and review processes. Platforms must also provide content moderation outcome measures, such as the number of content takedowns and average response times for user requests, for each monetized language.
Additionally, platforms must ensure all user tools for reporting content and all platform policies are consistently accessible across all languages they offer. The Act also establishes an Advisory Group on Language-Sensitive Technologies to provide guidance to the FTC on best practices for technologies whose performance may vary by language, including those used for natural language processing and automated content decisions. The FTC is responsible for enforcing the Act, treating violations as unfair or deceptive acts or practices, and State Attorneys General are also empowered to bring civil actions. The FTC will promulgate regulations to implement key provisions of the Act, with specific effective dates for compliance.
Read twice and referred to the Committee on Commerce, Science, and Transportation.
LISTOS Act of 2025
USA | 119th Congress | S-3540 | Senate | Updated: 12/17/2025