Blog

Senate Judiciary Subcommittee on Intellectual Property Hearing Summary Entitled: The NO FAKES Act: Protecting Americans from Unauthorized Digital Replicas

Witnesses Included:

  • Lisa P. Ramsey, Professor of Law at University of San Diego School of Law
  • Graham Davies, President and Chief Executive Officer of Digital Media Association
  • Ben Sheffner, Senior Vice President and Associate General Counsel, Law and Public Policy, Motion Picture Association
  • Duncan Crabtree-Ireland, National Executive Director and Chief Negotiator, Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA)
  • Robert Kyncl, Chief Executive Officer, Warner Music Group
  • Tahliah Debrett Barnett (aka “FKA twigs”)

On April 30th, the Senate Judiciary Committee Subcommittee on Intellectual Property held a hearing to discuss the Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act and emerging issues related to deepfakes and their impact on politics, the entertainment industry, and society at large.

The bipartisan NO FAKES Act (sponsored by Senators Coons (D-DE), Blackburn (R-TN), Klobuchar (D-MN) and Tillis (R-NC)) would hold companies or individuals liable for producing unauthorized digital replicas of individuals. Additionally, the bill would make exceptions for some replicas based on First Amendment protections and would consider a platform's prior knowledge of whether the content was indeed a deepfake.

Deepfakes, and other forms of manipulated media, have become an issue due to their ability to mimic individuals' likenesses, including their voices and faces. The technology has been used to impersonate political figures, most notably in an infamous robocall "from" President Biden telling New Hampshire voters to stay at home during January's presidential primary, as well as in numerous cases of musical artists being impersonated by AI, including Drake, Tupac, and FKA twigs, who was present at Tuesday's hearing to testify on the issue. The Committee questioned witnesses on:

  • Experiences with deepfakes;
  • Technology considerations for misinformation mitigation;
  • The ethics of banning deepfakes;
  • First Amendment concerns surrounding limits on content creation;
  • Technology applications that could assist in AI identification;
  • The takedown rights provisions in the NO FAKES Act, including whether they should be available only to those who commercialize their likeness or to everyone, and whether they should apply post-mortem;
  • How the law would affect Section 230 protections for website owners and internet providers; and
  • The implementation of digital watermarks.

Ultimate Takeaway

The witness panel, which included creators, entertainment and music industry representatives, and academics, agreed that action needed to be taken to mitigate the creep of AI deepfakes into entertainment and society at large.

Regarding technical solutions for the identification and protection of data, witnesses coalesced around the idea of digital watermarks: a piece of code embedded in an uploaded piece of media that typically provides copyright information. Senator Blumenthal (D-CT) discussed a bipartisan legislative framework he and Senator Hawley (R-MO) announced that would create AI guardrails, including the deployment of watermarks, and that addresses questions of AI deepfakes' relationship with Section 230, a provision of the Communications Decency Act of 1996 that protects internet companies from liability for illegal third-party content. Blumenthal's framework asserts that AI content is not covered by Section 230 and that companies can be held liable for AI-related harm.

Unfortunately, there was no mention of utilizing algorithms to create a unique "hash" of a video file and then recording it on the blockchain, a tool that The Digital Chamber is advocating for to mitigate AI misinformation risks. In simple terms, when a comparison is needed to verify a video or image's authenticity, one could simply compare the hash of the file with the one stored on the blockchain. If there is a match, the authenticity is verified. If not, there may have been an alteration, possibly using AI. This is just one example of how blockchain can be employed to combat deepfakes.
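The comparison step described above can be sketched in a few lines of Python. This is a minimal illustration, not a production system: it assumes the originally recorded SHA-256 digest has already been retrieved from the blockchain (the on-chain storage and lookup are out of scope here), and the function names are hypothetical.

```python
import hashlib

def file_hash(path: str) -> str:
    """Compute a SHA-256 digest of a media file, reading in chunks
    so even large video files don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_authenticity(path: str, recorded_hash: str) -> bool:
    """Compare the file's current digest to the one recorded on-chain.
    A match means the file is byte-identical to the original;
    any alteration (including an AI edit) changes the digest."""
    return file_hash(path) == recorded_hash
```

Because cryptographic hashes change completely with even a one-byte edit, a mismatch reliably signals that the file differs from the version whose digest was anchored on the blockchain, though it cannot say what kind of alteration was made.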

The Digital Chamber will keep you updated on our efforts to put forth policy recommendations related to the nexus of blockchain and AI. For more information on blockchain and AI, read our blog here.