Federal Legislation

 

Passed Legislation

Identifying Outputs of Generative Adversarial Networks Act

U.S. Congressional Research Service

This bill directs the National Science Foundation (NSF) and the National Institute of Standards and Technology (NIST) to support research on generative adversarial networks. A generative adversarial network is a software system designed to be trained with authentic inputs (e.g., photographs) to generate similar, but artificial, outputs (e.g., deepfakes).

Specifically, the NSF must support research on manipulated or synthesized content and information authenticity, and NIST must support research to develop the measurements and standards needed to accelerate the development of technological tools that examine the function and outputs of generative adversarial networks or other technologies that synthesize or manipulate content.

 

Legislation in Consideration

Deepfake Report Act of 2019

U.S. Congressional Research Service

This bill requires the Science and Technology Directorate in the Department of Homeland Security to report at specified intervals on the state of digital content forgery technology. Digital content forgery is the use of emerging technologies, including artificial intelligence and machine learning techniques, to fabricate or manipulate audio, visual, or text content with the intent to mislead.

 

DEEPFAKES Accountability Act

To protect national security against the threats posed by deepfake technology and to provide legal recourse to victims of harmful deepfakes.

The DEEPFAKES Accountability Act classifies deepfakes as videos or other visual media containing elements that are altered by, or sourced from, generative AI. The proposal requires written disclosures of altered elements to be included in the deepfake itself. The act also prescribes criminal and civil penalties for parties that fail to implement alteration disclosures or that purposely obscure or remove them. The bill further establishes an avenue for legal recourse: individuals falsely portrayed in deepfakes lacking disclosure may bring civil litigation in a federal district court, and victims are entitled to compensation as outlined in the bill. Lastly, the proposal directs the U.S. Attorney General to meet with victims impersonated in deepfakes in order to develop further steps to protect the privacy of citizens.

 

DEFIANCE Act of 2024

To improve rights to relief for individuals affected by non-consensual activities involving intimate digital forgeries, and for other purposes.

The DEFIANCE Act of 2024 expands the federal statute on ‘Civil Action Relating to Disclosure of Intimate Images’ by defining digital forgeries and specifying which victims depicted in explicit deepfakes without their consent may initiate civil litigation. Specifically, the bill holds producers, distributors, solicitors, and those who possess explicit deepfakes with the intent to disclose them liable for punitive damages in an appropriate U.S. district court.

 

Protecting Consumers from Deceptive AI Act

To require the National Institute of Standards and Technology to establish task forces to facilitate and inform the development of technical standards and guidelines relating to the identification of content created by generative artificial intelligence, to ensure that audio or visual content created or substantially modified by generative artificial intelligence includes a disclosure acknowledging the generative artificial intelligence origin of such content, and for other purposes.

 

The Preventing Deep Fakes Scams Act

This bill establishes the Task Force on Artificial Intelligence in the Financial Services Sector. Members include representatives from the Department of the Treasury, the Federal Reserve Board, the Federal Deposit Insurance Corporation, and the National Credit Union Administration. The task force must report on (1) how banks and credit unions prevent fraud that utilizes artificial intelligence, (2) best practices for financial institutions to protect their customers, and (3) related legislative and regulatory recommendations.

 

The No AI Fraud Act

To provide for individual property rights in likeness and voice.

The No Artificial Intelligence Fake Replicas And Unauthorized Duplications (No AI Fraud) Act of 2024 establishes legal definitions for an individual’s likeness and voice, along with definitions for audio manipulations such as digital depiction, personalized cloning service, and digital voice replica. Additionally, the bill outlines an individual’s property rights in their own likeness and voice, and provides legal remedies when that voice or likeness is used in an ‘unauthorized simulation’. To underpin these statutory designations, the bill opens by presenting contemporary cases of the misuse of prominent public figures’ likenesses, along with statistics on the issue from the Pew Research Center and the Department of Homeland Security.