Temitope is a writer with more than four years of experience writing across various niches. He has a special interest in the fintech and blockchain spaces and enjoys writing articles in those areas. He holds bachelor's and master's degrees in linguistics. When not writing, he trades forex and plays video games.
California legislator Ash Kalra has introduced Assembly Bill 459 (AB 459) with the primary aim of safeguarding actors, artists, and entertainers within the entertainment industry from potential artificial intelligence (AI) risks, particularly generative AI.
The bill’s central objective is to mandate the inclusion of consent clauses in employment contracts within the entertainment industry when considering the use of digital replicas or AI-generated content featuring artists or their likenesses.
Kalra, expressing deep concern about the unauthorized use of AI to create digital replicas of actors and artists, argues that generative AI poses a significant threat to professionals in the entertainment sector. Consequently, the bill proposes prohibiting the use of generative AI in the entertainment industry unless the parties involved establish a mutually agreed-upon contract. Essentially, actors and artists should have a say in how their digital selves or digital replicas are employed. He stated:
“Mandating informed consent and representation will help ensure workers are not unknowingly at risk of losing the right to their digital self, and with it, their careers and livelihoods.”
The legislative process for AB 459 in California involves the formation of a committee tasked with reviewing and discussing potential amendments to the bill. Subsequently, the bill is expected to be presented to the legislative chamber for a vote.
Understanding the Concerns Surrounding AI in Entertainment
Failure to regulate the use of actors’ digital selves could have profound consequences, potentially jeopardizing their livelihoods and careers. Actors’ digital likenesses could be manipulated in ways that compromise the integrity of their professional image. These considerations underscore the significance of the bill in protecting actors from losing control over their digital likenesses or being portrayed in ways they do not endorse.
Support for the bill extends to the Screen Actors Guild – American Federation of Television and Radio Artists (SAG-AFTRA), the labor union representing US media professionals. The union also advocates for consent-based laws to safeguard actors’ digital images. Duncan Crabtree-Ireland, its national executive director, expressed apprehension about AI copycats contributing to abusive and exploitative practices. He stressed the importance of legislation in protecting actors from such practices:
“We see protection against the unjust transfers of these rights to be an imperative against potential abusive or exploitative practices. We are deeply concerned by the proliferation of AI-created audio and video content without full consent, and this legislation is an important step to ending these dangerous practices.”
Previously, SAG-AFTRA engaged in strike actions in Hollywood over AI-related issues and demands for enhanced protections. The union has advocated for higher royalties on the use of members’ digital images and voice likenesses, while also expressing concerns about the unauthorized use of actors’ identities and voices by AI.