
A bipartisan group of United States legislators has proposed a new law to protect content creators, and media industry organizations have lined up behind it.

Sen. Maria Cantwell (D-Wash.) led the senators who introduced the Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act). Other backers included Sens. Marsha Blackburn (R-Tenn.) and Martin Heinrich (D-N.M.), the latter a member of the Senate’s AI Working Group.

The COPIED Act requires the National Institute of Standards and Technology (NIST) to develop new guidelines for the artificial intelligence (AI) sector, covering watermarking, the detection of AI-generated content and content provenance.

NIST’s standards would allow Americans to easily identify AI-generated content. As generative tools advance, such content has become harder to detect, and that difficulty carries serious risks.

This includes impersonations and the use of deepfake audio and video to manipulate elections; in May, one study found that the ease of cloning Trump and Biden’s voices could dupe millions in this year’s U.S. elections. Microsoft and others have also warned that foreign nations could use AI to sway U.S. elections.

NIST’s standards would also give content creators tools to digitally prove ownership of their work. This underpins the second provision of the COPIED Act: requiring AI firms to obtain consent before using any content that carries provenance information to train their models.

“These measures give content owners—journalists, newspapers, artists, songwriters, and others—the ability to protect their work and set the terms of use for their content, including compensation,” the bill’s summary notes.
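To make the provenance idea concrete, here is a minimal, hypothetical sketch of how a creator might bind ownership metadata to a file so that later use of the content can be checked against it. The manifest fields, the shared-secret HMAC signature, and all names here are illustrative assumptions; they are not the scheme the COPIED Act or NIST would actually mandate (real provenance standards such as C2PA use public-key signatures and richer metadata).

```python
import hashlib
import hmac
import json
import time

# Placeholder signing secret held by the creator (assumption for this sketch).
CREATOR_KEY = b"creator-signing-key"


def make_provenance_manifest(content: bytes, creator: str) -> dict:
    """Build a manifest that binds the content hash to its creator and signs it."""
    manifest = {
        "creator": creator,
        "created_at": int(time.time()),
        "content_sha256": hashlib.sha256(content).hexdigest(),
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(CREATOR_KEY, payload, hashlib.sha256).hexdigest()
    return manifest


def verify_provenance(content: bytes, manifest: dict) -> bool:
    """Check that the content matches the manifest and the signature is intact."""
    claimed = {k: v for k, v in manifest.items() if k != "signature"}
    if claimed["content_sha256"] != hashlib.sha256(content).hexdigest():
        return False  # content was altered after the manifest was issued
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(CREATOR_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, manifest["signature"])


if __name__ == "__main__":
    article = b"Original reporting by a local newsroom."
    manifest = make_provenance_manifest(article, creator="Example Gazette")
    print(verify_provenance(article, manifest))          # True
    print(verify_provenance(article + b"!", manifest))   # False: content altered
```

Under a consent requirement like the one in the bill, an AI firm could be expected to check for such a manifest before adding a work to a training set; any item with provenance attached would need the owner's permission first.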

This provision would prevent OpenAI, Meta (NASDAQ: META), Google (NASDAQ: GOOGL) and other tech giants from using content to train AI without consulting or compensating its creators. This has been a major sticking point in the rise of AI, with the likes of the New York Times and Game of Thrones author George R. R. Martin suing OpenAI, the maker of ChatGPT, over the use of their content.

In their defense, AI developers have argued that such training is covered by the fair use doctrine, which permits limited use of copyrighted material without permission for purposes such as commentary, criticism, and research. Several lawsuits now before the courts center on whether fair use extends to AI training.

But even if the courts side with AI companies, the COPIED Act would ensure an extra layer of protection for content creators as these firms would still have to seek consent for any content with provenance information.

Media industry backs COPIED Act

The COPIED Act joins dozens of others at the state and federal level seeking to police the fast-rising AI industry, including the NO FAKES Act.

It has received the support of several media industry organizations, from SAG-AFTRA and the Recording Academy to America’s Newspapers and the National Association of Broadcasters (NAB).

“As AI-generated music continues to disrupt the legitimate market, it is essential that listeners know where their music is coming from. Artists and songwriters deserve protection against unauthorized imitations and this legislation is an important step towards that goal,” commented David Israelite, the CEO of the National Music Publishers’ Association.

The COPIED Act stands a good chance of clearing the Senate, as it is championed by the influential Sen. Cantwell, who chairs the Senate Commerce Committee. Commenting on the bill, she said it would “provide much-needed transparency around AI-generated content.”

“The COPIED Act will also put creators, including local journalists, artists and musicians, back in control of their content with a provenance and watermark process that I think is very much needed,” she added.

For artificial intelligence (AI) to work within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, allowing it to keep data safe while also guaranteeing the immutability of data. Check out CoinGeek’s coverage of this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch: Digital currency regulation and the role of BSV blockchain
