Communications and Technology Subcommittee, Energy and Commerce Committee
Curbing Abuse and Saving Expression In Technology Act or the CASE-IT Act

This bill limits liability protection, sometimes referred to as Section 230 protection, for a user or provider of an interactive computer service (e.g., a social media company) related to content that is published on or removed from its platform. The bill removes for one year the protection from being treated as the publisher of information provided by another content provider if a user or provider facilitates (1) illegal online content; (2) certain exploitive contact between adults and minors; or (3) content that is indecent, obscene, or otherwise harmful to minors. Further, to avoid being treated as the publisher of third-party content or subject to liability for screening and blocking content on its platform, an interactive computer service that is dominant in its market (i.e., has gained substantial, sustained market power over any competitors) must make content moderation decisions pursuant to policies or practices that are consistent with the First Amendment.
Referred to the House Committee on Energy and Commerce.
Referred to the Subcommittee on Communications and Technology.
Science, Technology, Communications
CASE-IT Act
USA | 118th Congress | H.R. 573 | House
Updated: 2/3/2023