California is cracking down on deepfakes for politics and porn

California is cracking down on deepfakes in both porn and politics. Governor Gavin Newsom recently signed two bills: AB 730, which makes it illegal to create or distribute altered video, audio, or still images of political candidates, and AB 602, which lets California residents take legal action against people who create pornographic material using their likeness without consent.

A combination of “deep learning” and “fake,” deepfake technology lets anyone create altered material in which photographic elements are convincingly superimposed onto other images. There is also a growing number of attempts to use related technologies to create fake audio that sounds as if it were spoken by a real person, such as a politician or celebrity. Although these tools can be used for innocent purposes (for instance, placing Sylvester Stallone’s face onto Arnold Schwarzenegger’s body in Terminator 2), people have rightly expressed concern about how the technology can be used maliciously. The uses covered by the two California bills could do everything from damaging reputations to, in the case of politically oriented deepfakes, potentially swaying elections.
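To make that superimposition idea concrete, here is a minimal sketch of the shared-encoder, dual-decoder autoencoder design popularized by early face-swap tools. Everything in it (layer sizes, 64×64 face crops, the placeholder training data) is an illustrative assumption, not any specific tool’s implementation.

```python
# Sketch of the classic face-swap autoencoder: one shared encoder, one
# decoder per identity. All sizes and data here are placeholder assumptions.
import torch
import torch.nn as nn

IMG = 64 * 64 * 3  # assumed 64x64 RGB face crops, flattened

def make_decoder():
    return nn.Sequential(
        nn.Linear(256, 1024), nn.ReLU(),
        nn.Linear(1024, IMG), nn.Sigmoid(),
    )

encoder = nn.Sequential(nn.Linear(IMG, 1024), nn.ReLU(), nn.Linear(1024, 256))
decoder_a = make_decoder()  # learns to render person A's face
decoder_b = make_decoder()  # learns to render person B's face

loss_fn = nn.MSELoss()
opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)

# Placeholder batches; real training would use thousands of face crops.
faces_a = torch.rand(8, IMG)
faces_b = torch.rand(8, IMG)

# Each identity is reconstructed through the SAME encoder but its OWN
# decoder, so the encoder learns identity-agnostic pose and expression.
for _ in range(100):
    opt.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) \
         + loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()

# The swap: encode a face of person A, decode with B's decoder, and
# person B's features appear in A's pose and expression.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a))
```

The key design choice is the shared encoder: because it must compress both people’s faces into one latent space, decoding that latent code with the other person’s decoder is what produces the swap.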

“In the context of elections, the ability to attribute speech or conduct to a candidate that is false — that never happened — makes deepfake technology a powerful and dangerous new tool in the arsenal of those who want to wage misinformation campaigns to confuse voters,” California Assemblyman Marc Berman, who authored AB 730, said in a statement.

The political bill applies only to deepfakes distributed within 60 days of an election. It also does not cover deepfakes that are obvious satire, which is protected by free speech laws. Last month, Texas made waves as the first state to pass a law making it a misdemeanor to create and share distorted videos of politicians within one month of an election.

Pornography is the other big problem when it comes to deepfakes. A study by cybersecurity firm Deeptrace claims that 96% of the 14,678 deepfake videos it identified online were pornographic, and that all of them used the likenesses of women, such as actresses and musicians.

Unfortunately, simply signing these bills into law won’t guarantee an end to deepfakes created by bad actors. That will take a combination of enforcement and more sophisticated tools for spotting deepfakes. But it’s a first step that will hopefully help head off an incredibly damaging use of sophisticated A.I. technology.
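As one hedged example of what such detection tools can look like under the hood, the sketch below fine-tunes a pretrained image classifier to label individual video frames as real or fake — a common approach in detection research. The model choice, label convention, and placeholder data are assumptions for illustration, not a production detector.

```python
# Sketch of a frame-level deepfake detector: fine-tune a pretrained CNN as
# a binary real/fake classifier. Data and labels below are placeholders.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # two classes: real, fake

loss_fn = nn.CrossEntropyLoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)

# Placeholder batch; a real pipeline would crop faces out of video frames
# and apply the pretrained weights' normalization transforms.
frames = torch.rand(4, 3, 224, 224)
labels = torch.tensor([0, 1, 0, 1])  # assumed convention: 0 = real, 1 = fake

model.train()
opt.zero_grad()
loss = loss_fn(model(frames), labels)
loss.backward()
opt.step()
```

A real system would aggregate per-frame scores across an entire video rather than trusting any single frame, but a frame-level classifier like this one sits at the core of many published detectors.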
