Former senior editor Caleb Long will be shedding light on proposed legislation aimed at limiting the harmful effects artificial intelligence and deepfakes can have on consumers and notable figures.

The recent and rapid development of AI technology has come with its fair share of issues. One of those issues is the use of “deepfake” technology. Lately, more and more celebrities have fallen victim to third parties using AI to imitate their voice, likeness, or both with uncanny realism, often for commercial gain and without the celebrities’ consent. Prominent instances of this problem include a viral song featuring convincing imitations of Drake’s and The Weeknd’s voices, and a dental plan ad featuring a deepfake of Tom Hanks’ likeness and voice. Taylor Swift, Selena Gomez, Tom Brady, Rihanna, and the late Robin Williams are others who have been imitated by AI, none of whom gave consent.

At least some members of Congress are concerned by these events. A bipartisan group of senators introduced a legislative draft last week that would impose liability on those who use AI to imitate someone’s likeness without that person’s authorization.

This issue is both relevant and interesting because it sits squarely in the public eye and affects the public in a very perceivable way. Using AI to make it seem like our favorite celebrities are saying and doing things they never actually said or did may seem like a neat concept, and AI has positive uses. But the reality is, the violation of people’s right of publicity and the absolutely unhinged potential for fraud this technology creates are not funny or cool. They’re concerning, and legislation is necessary. State legislation protecting individuals’ publicity rights exists, but it has gaps that deepfakes can slip through depending on the context and purpose of the imitation.

I would love to see Congress ban the practice of unauthorized AI imitation regardless of whether the purpose is commercial, and I suspect it eventually will.