The White House Struck a Deal With A.I. Companies to Manage the Technology’s Risks. Artists Say It ‘Does Nothing’ to Protect Them

Last week, the White House announced that it had struck a deal with several artificial intelligence companies to “manage the risks” posed by the technology. Artists, however, say the deal “does nothing” to protect creatives.

The administration of President Joe Biden on Friday secured “voluntary commitments” from Amazon, Anthropic, Google, Inflection, Meta, Microsoft, and ChatGPT creator OpenAI to “help move toward safe, secure, and transparent development of A.I. technology.”

The deal calls on the companies to ensure their products are safe before launch and do not pose biosecurity or cybersecurity risks. The companies also agreed to address “societal risks,” including bias and discrimination, and to build public trust by using watermarks and other labels to identify audio and visual content generated with A.I. technology.

But the deal made no reference to the challenges artists fear in the age of A.I., such as job loss and the theft of intellectual property. Artnet News spoke with several artists and art-world stakeholders who have been involved in addressing such challenges.

“This document doesn’t say all that much,” said Mathew Dryhurst.

Dryhurst and fellow artist Holly Herndon have launched tools to help artists in the age of A.I. Those tools include Spawning, which lets people set permissions on how their style and likeness can be used, and HaveIBeenTrained, which allows artists to see if their work was used to train popular A.I. art models.

“It’s disappointing that the Biden administration’s deal with A.I. corporations does nothing to protect creators and does not even acknowledge the mass act of theft these generators are built on,” said Molly Crabapple, who published an open letter in May calling for publishers to restrict the use of A.I.-generated illustrations. She predicted that only the most elite illustrators will survive the shifts in the industry.

“At a moment when SAG-AFTRA and WGA are on strike, in part due to the existential risk these generators pose to all creative industries, this silence is glaring,” Crabapple added. “[Biden] clearly doesn’t view it as a priority. Artists like Karla Ortiz have been working very hard to educate Congress. There are major lawsuits, not just by her, but by celebrities like Sarah Silverman. The administration just isn’t listening.”

Silverman, a stand-up comedian and actress, has filed a lawsuit against OpenAI and Meta over claims that the companies violated her copyright by training their A.I. models, ChatGPT and LLaMA, on the text of her book. She was joined in the lawsuit by the authors Christopher Golden and Richard Kadrey.

“We represent American creators—including authors, visual artists, and programmers—whose work has been misused by A.I. companies as training data without consent, credit, or compensation,” Matthew Butterick, a lawyer representing Silverman, said in emailed comments to Artnet News.

Butterick noted that three of the seven A.I. companies involved in the White House announcement “are already defendants in litigation on this issue.”

“Notably, these companies have not yet committed to dataset transparency, which is critical to making A.I. fair and ethical for everyone—especially the human creators whose work makes these systems so valuable,” he said.

Ben Zhao, a professor of computer science at the University of Chicago, led the research team that developed Glaze, a tool that helps artists prevent A.I. platforms from stealing their artistic style by making subtle shifts to a work’s brushstrokes and palette.

Zhao said the Biden administration is well-intentioned but “oblivious to the real risks” already unfolding, such as the misappropriation of content without consent, compensation, or credit. He said such risks already devalue entire human industries, destroy the lives of artists, and disincentivize future generations of artists, writers, and musicians.

“The current discussion only includes stakeholders on one side of the struggle and is missing the voices of the people these A.I. models exploit for profit. Without the voices of artists and other creatives, there cannot be a well-informed or comprehensive understanding of what is at stake,” Zhao said.

The “voluntary” nature of the commitments renders them “meaningless,” he added, calling the provisions outlined in the deal “poorly defined goals” that involve technical problems which lack solutions or may be completely insolvable.

“Take the example of ‘watermarking’ A.I.-generated content. There are no robust solutions for watermarking generated content, either text or images, known today. Fragile watermarks are ineffective, and worse yet, convey a false sense of security,” Zhao said. “Robust watermarks are incredibly difficult to build, especially in an adversarial setting such as the proposed application scenarios. How hard will these A.I. companies work at ‘voluntarily’ building these difficult systems?”

Zhao said the Biden administration needs to secure further commitments for transparency APIs that will “evaluate and report on the accuracy of these mechanisms” as well as “real regulation with well-defined, transparent goals that are backed up with plans for testing, enforcement, and if necessary, penalties.”

“The assumption that big tech will do the ‘right’ thing despite the obvious financial disincentives is naïve,” he said.

Crabapple proposed that the Federal Trade Commission impose “algorithmic disgorgement,” a remedy that requires a company to destroy algorithms or models built on ill-gotten data, on any company that trained its models on copyrighted work.

She said the FTC has imposed disgorgement on several companies in the past, including Cambridge Analytica and Weight Watchers, when it was found their algorithms were trained on illegally obtained data.

She also called on the FTC to demand companies only train models on consensually obtained work going forward and “impose penalties for theft of copyrighted work by the companies in the future.”

The European Union’s laws on text and data mining and a prospective law on A.I. are “much clearer” about providing guidance on issues like data harvesting and model transparency, Dryhurst said. He called the ability to opt out of training models the “most plausible demand” artists can make.

“If a common sense opt-out is not standardized, I fear we will start to witness an increasing balkanization of the open web as artists and companies opt to construct walls and moats around their data,” he said. “We have already seen glimpses of this with Reddit and Twitter and I anticipate it will worsen.”

Dryhurst said the EU’s existing and proposed laws are “imperfect” but are the “most practical guidance as to how to address artist concerns” currently in place.

“Recent Senate Judiciary Committee meetings have shown promise in recognizing the value of the ability for creatives to opt out of A.I. training should they so desire,” Dryhurst said.

“I am personally skeptical that the U.S. will assume a harder policy stance than the EU regarding A.I. training, given the stakes at play. It is crucial to understand there is currently a race taking place between countries to provide the most hospitable conditions for A.I. companies, who stand to play a significant role in the economy moving forward.”

 
