Commerce, Science, and Transportation Committee, Science, Space, and Technology Committee
Identifying Outputs of Generative Adversarial Networks Act or the IOGAN Act

(Sec. 3) This bill directs the National Science Foundation (NSF) and the National Institute of Standards and Technology (NIST) to support research on manipulated or synthesized media, including the output of generative adversarial networks. A generative adversarial network is a software system designed to be trained with authentic inputs (e.g., photographs) to generate similar, but artificial, outputs (e.g., deepfakes). Specifically, the NSF must support research on manipulated or synthesized content and information authenticity.

(Sec. 4) NIST must support research on the development of measurements and standards needed to accelerate the development of technological tools for examining the function and outputs of generative adversarial networks or other technologies that synthesize or manipulate content. NIST shall conduct outreach to (1) receive input from private, public, and academic stakeholders on fundamental measurements and standards research necessary to examine the function and outputs of generative adversarial networks; and (2) consider the feasibility of ongoing public and private sector engagement to develop voluntary standards for the function and outputs of such networks or other technologies that synthesize or manipulate content.

(Sec. 5) The NSF and NIST must jointly submit to Congress a report containing (1) the agencies' findings on the feasibility of research opportunities with the private sector, including digital media companies, to detect the function and outputs of generative adversarial networks or other technologies that synthesize or manipulate content; and (2) any policy recommendations of those agencies that could facilitate and improve communication and coordination among the private sector, the NSF, and relevant federal agencies through the implementation of innovative approaches to detect digital content produced by such networks or technologies.
Subjects: Advanced technology and technological innovations; Computers and information technology; Congressional oversight; Digital media; Government studies and investigations; Intergovernmental relations; Photography and imaging; Research administration and funding; Research and development; Research ethics; Technology assessment
IOGAN Act
USA | 116th Congress | HR-4355 | House
Updated: 12/10/2019