Curbing Abuse and Saving Expression In Technology Act or the CASE-IT Act

This bill limits the protections for a user or provider of an interactive computer service (e.g., a social media company) related to content that is published on or removed from its platform.

The bill removes, for one year, the protection from being treated as the publisher of information provided by another content provider if a user or provider facilitates (1) illegal online content; (2) certain exploitive contact between adults and minors; or (3) content that is indecent, obscene, or otherwise harmful to minors.

Further, to avoid being treated as the publisher of third-party content or being subject to liability for screening and blocking content on its platform, an interactive computer service that is dominant in its market (i.e., has gained substantial, sustained market power over any competitors) must make content moderation decisions pursuant to policies or practices that are consistent with the First Amendment.
Timeline
Introduced in House
Referred to the House Committee on Energy and Commerce.
Science, Technology, Communications
Administrative law and regulatory procedures; Child safety and welfare; Civil actions and liability; Computers and information technology; Crimes against children; Department of Justice; Federal Trade Commission (FTC); First Amendment rights; Internet and video services; Internet, web applications, social media; Licensing and registrations; Sex offenses
CASE-IT Act
USA | 116th Congress | H.R. 8719 | House | Updated: 10/30/2020