The "Shielding Children's Retinas from Egregious Exposure on the Net Act," or SCREEN Act, aims to protect minors from online content deemed harmful. Congress finds that previous legislative efforts were struck down and that filtering software has proven ineffective, leading to significant exposure of minors to pornography and associated psychological harms. The bill asserts a compelling government interest in shielding minors and frames mandatory age verification by platforms as the least restrictive means of achieving it.

The Act defines a "covered platform" as an interactive computer service that regularly creates, hosts, or makes available content "harmful to minors" for profit. Such content includes visual depictions that appeal to prurient interest, depict sexual acts in a patently offensive manner, and lack serious value for minors, as well as obscene content and child pornography.

Beginning one year after enactment, covered platforms must adopt and use technology verification measures to ensure users are not minors and to prevent their access to harmful content. These measures must actively determine a user's age, going beyond simple self-confirmation, and apply to all users, including those using VPNs, unless they are outside the U.S. Platforms must publicly disclose their verification process, ensure robust data security for collected verification data, and retain that data only as long as necessary.

The Federal Trade Commission (FTC) is mandated to enforce these requirements, conducting regular audits and issuing guidance within 180 days of enactment in consultation with various experts. The Government Accountability Office (GAO) will report to Congress within two years on the effectiveness and impacts of these measures.