Adobe Proposes Anti-Impersonation Law Based On AI, Expands Access To Firefly

by @lauriesullivan, July 12, 2023

Adobe Chief Trust Officer Dana Rao participated in the U.S. Senate Judiciary Subcommittee hearing on artificial intelligence (AI) and intellectual property and copyright on Wednesday, suggesting that an anti-impersonation right be made a federal requirement.

“From a copyright perspective, there are two core questions: Is the output image a copyright infringement of an image that was used to train the AI model, and is using a third-party image to train an AI model permissible under fair use?” Rao asked during his testimony.

The anti-impersonation right would apply to everyone, Rao said. An AI model can be trained on an artist's work and generate content in that artist's exact style, and the artist may never know.

“You want to focus on people who are intentionally impersonating someone to benefit from commercial work,” he said, noting that Adobe takes great measures to protect copyrights with Firefly, its AI system created to generate images from scratch. Firefly is designed to integrate with Adobe’s Creative Cloud applications, such as Photoshop, Illustrator, and InDesign.

Adobe today announced that Firefly has expanded globally and now supports more than 100 languages. Since the March launch of Firefly, users have generated more than 1 billion images.

And as generative AI becomes more prevalent, consumers want to know whether content was created or edited by AI. 

Adobe also provided an update today on the Content Authenticity Initiative, which has more than 1,500 members such as Omnicom, Universal Music Group and Stability AI.

Artists have been victims of deepfakes, defined as fake videos, images, or audio recordings that have been edited using algorithms and machine learning to replace a person in the original with someone else.

Given the sophistication of the technology, there should be rules governing improper use of voice and visual likeness, the panel agreed, including Karla Ortiz, a concept artist who has worked with studios such as Paragon Studios and Ubisoft.

Ortiz said, “I love every step of the process of being an artist,” adding that the industry she works in wants to make it clear “that we do not want to exploit each other. The models compete in our own market. It’s not something hypothetical. It’s happening now.”

Ortiz also said tech companies must find a way to identify the artists in a dataset, because otherwise, how would anyone know whom to compensate? Consent and compensation for artists, she said, should be the foundation of the law.
