Committees: Judiciary; Energy and Commerce
The Kids Internet and Digital Safety Act (KIDS Act) establishes comprehensive regulations for online platforms, social gaming providers, and AI chatbot developers to protect minors and empower parents. It mandates safeguards against harmful content, establishes parental control tools, and prohibits certain data practices, such as profiling minors for market research. The bill also directs extensive research, public awareness campaigns, and the development of best practices to foster a safer online environment for children and teens. Enforcement of the Act will be carried out by the Federal Trade Commission (FTC) and State Attorneys General.

Title I, the SCREEN Act, requires platforms with significant sexual material harmful to minors (under 17) to implement technology verification measures to prevent minor access, ensuring data protection and accuracy.

Title II, Subtitle A, the Kids Online Safety Act, applies to platforms with user-generated content and personalized recommendations, mandating policies to address harms like threats of violence, sexual exploitation, and drug promotion. These platforms must provide minors with safeguards to limit communication, restrict geolocation, and control personalized recommendations, with default settings offering the highest protection. Additionally, parental tools must be available to manage privacy, purchases, and screen time for minors, with default protective settings for children under 13. This subtitle also requires clear reporting mechanisms for harms to minors and prohibits advertising illegal products. Platforms must undergo annual independent third-party audits to assess compliance and report findings to the FTC and the public, though the bill explicitly states it does not require age-gating.

Title II, Subtitle B, the Safe Messaging for Kids Act, prohibits ephemeral messaging for all minors and direct messaging for children under 13. For teens (13-16), platforms must provide parental direct messaging controls, allowing parents to approve contacts, manage lists, disable messaging, and hide profiles, while protecting encryption integrity.

Title II, Subtitle C, the Stop Profiling Youth and Kids Act, generally prohibits platforms from conducting market research on minors, with exceptions for improving safety or legal compliance.

Title III, the Safer GAMING Act, requires online video game providers to implement parental safeguards for minors to limit communication, restrict purchases, and manage time spent, with default protective settings.

Title IV, the SAFE BOTs Act, regulates AI chatbots, prohibiting false professional claims and mandating disclosures that the chatbot is an AI system. Chatbot providers must also implement policies to advise minors to take breaks and to address harmful content.

Title V focuses on research, education, and best practices, mandating several reports from the FTC and the Department of Health and Human Services. These studies cover social media use by minors, fentanyl access, industry safety tools, and the mental health impacts of chatbots. The FTC is also directed to develop public awareness campaigns and educational resources for safe internet and chatbot use. Finally, a Kids Internet Safety Partnership will be established to identify online risks and benefits and to publish best-practice playbooks for digital service providers.

The bill includes rules of construction to protect free speech, encryption, and existing privacy laws such as COPPA. It also clarifies that the Act does not expand or limit Section 230 of the Communications Act.
Referred to the Committee on Energy and Commerce, and in addition to the Committee on the Judiciary, for a period to be subsequently determined by the Speaker, in each case for consideration of such provisions as fall within the jurisdiction of the committee concerned.
KIDS Act
USA | 119th Congress | HR-7757 | House
Updated: 3/3/2026