A new bill in the United States Senate aims to protect artists and journalists from having their work used to train AI models or to generate AI content without their consent.

The bill, called the Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act), was authored by Senate Commerce Committee Chair Maria Cantwell (D-WA), Senate AI Working Group member Martin Heinrich (D-NM), and Commerce Committee member Marsha Blackburn (R-TN).

“Artificial intelligence has given bad actors the ability to create deepfakes of every individual, including those in the creative community, to imitate their likeness without their consent and profit off of counterfeit content,” Senator Blackburn said in a press release announcing the bill’s introduction. “The COPIED Act takes an important step to better defend common targets like artists and performers against deepfakes and other inauthentic content.”

The bill comes at a time when many states are also looking at ways to combat the use of AI to create false information.

If passed, the bill would require companies to allow content owners to protect their work from being used to train AI models. It would also require the National Institute of Standards and Technology (NIST) to create guidelines and standards for adding content provenance information to content. Those standards would then be used to determine where the content came from and whether it was generated or altered by AI.

“The bipartisan COPIED Act I introduced with Senator Blackburn and Senator Heinrich will provide much-needed transparency around AI-generated content,” said Senator Cantwell.
“The COPIED Act will also put creators, including local journalists, artists, and musicians, back in control of their content with a provenance and watermark process that I think is very much needed.”

The bill would give individuals the right to sue for violations and would authorize the Federal Trade Commission (FTC) and state attorneys general to enforce its requirements. It would also make it illegal to remove, disable, or tamper with content provenance information.
Several groups have already endorsed the bill, including SAG-AFTRA, Nashville Songwriters Association International, Recording Academy, National Music Publishers’ Association, Recording Industry Association of America, News/Media Alliance, National Newspaper Association, America’s Newspapers, Rebuild Local News, Seattle Times, National Association of Broadcasters, Artist Rights Alliance, Human Artistry Campaign, Public Citizen, The Society of Composers & Lyricists, Songwriters Guild of America, and Music Creators North America.

“Protecting the life’s work and legacy of artists has never been more important as AI platforms copy and use recordings scraped off the internet at industrial scale and AI-generated deepfakes keep multiplying rapidly. RIAA strongly supports provenance requirements as a fundamental building block for accountability and enforcement of creators’ rights,” Mitch Glazier, Chairman and CEO of the Recording Industry Association of America, said in a statement.

“Leading tech companies refuse to share basic data about the creation and training of their models as they profit from copying and using unlicensed copyrighted material to generate synthetic recordings that unfairly compete with original works. We appreciate Senators Cantwell, Blackburn, and Heinrich’s leadership with the Content Origin Protection and Integrity from Edited and Deepfaked Media Act of 2024, which would grant much-needed visibility into AI development and pave the way for more ethical innovation and fair and transparent competition in the digital marketplace.”