A bipartisan group of United States legislators has proposed a new law to protect content creators, and media industry organizations have rallied behind it.

Sen. Maria Cantwell (D-Wash.) led the senators who proposed the Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act). Others backing it included Marsha Blackburn (R-Tenn.) and Martin Heinrich (D-N.M.), the latter a member of the Senate's AI Working Group.

The COPIED Act requires the National Institute of Standards and Technology (NIST) to develop new guidelines for the artificial intelligence (AI) sector, covering watermarking, the detection of AI-generated content and content provenance.

NIST's standards would allow Americans to easily identify content generated by AI. With the technology advancing rapidly, AI-generated content has become harder to detect, and this could have massive implications.

This includes impersonations and the use of deepfake audio and video to manipulate elections; in May, one study found that Trump's and Biden's voices could be cloned easily enough to dupe millions of voters in this year's U.S. elections. Microsoft and others have also warned that foreign nations could use AI to sway U.S. elections.

NIST's standards will also offer content creators tools to digitally prove ownership of their content. This is critical to the second provision of the COPIED Act: requiring AI firms to seek consent before using any content with provenance information to train their AI models.

“These measures give content owners—journalists, newspapers, artists, songwriters, and others—the ability to protect their work and set the terms of use for their content, including compensation,” the bill’s summary notes.

This provision will prevent OpenAI, Meta (NASDAQ: META), Google (NASDAQ: GOOGL), and other tech giants from using content to train AI without consulting or compensating its creators. This has been a major sticking point in the rise of AI, with some, like the New York Times and Game of Thrones author George R. R. Martin, suing OpenAI over the use of their content to train ChatGPT.

In their defense, AI developers have argued that this content is covered by the fair use doctrine, which permits limited use of copyrighted material without permission in the interest of freedom of expression. Several lawsuits currently before the courts focus on whether the doctrine applies to AI training.

But even if the courts side with AI companies, the COPIED Act would add an extra layer of protection for content creators, as these firms would still have to seek consent to use any content with provenance information.

Media industry backs COPIED Act

The COPIED Act joins dozens of other bills at the state and federal levels seeking to police the fast-rising AI industry, including the NO FAKES Act.

It has received the support of several media industry organizations, from SAG-AFTRA and the Recording Academy to America’s Newspapers and the National Association of Broadcasters (NAB).

“As AI-generated music continues to disrupt the legitimate market, it is essential that listeners know where their music is coming from. Artists and songwriters deserve protection against unauthorized imitations and this legislation is an important step towards that goal,” commented David Israelite, the CEO of the National Music Publishers’ Association.

The COPIED Act stands a good chance of sailing through the Senate, as it's being pushed by the influential Sen. Cantwell, who chairs the Senate Commerce Committee. Commenting on the bill, she said it would "provide much-needed transparency around AI-generated content."

“The COPIED Act will also put creators, including local journalists, artists and musicians, back in control of their content with a provenance and watermark process that I think is very much needed,” she added.

For artificial intelligence (AI) to work within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, allowing it to keep data safe while also guaranteeing the immutability of data. Check out CoinGeek's coverage of this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch: Digital currency regulation and the role of BSV blockchain
