Political ‘deepfake’ videos banned in California before elections - San Francisco Chronicle

SACRAMENTO — Ahead of what is expected to be a contentious election next year, California has made it illegal to distribute deceptively edited videos and audio clips intended to damage a politician’s reputation or deceive someone into voting for or against a candidate.

On Thursday, Gov. Gavin Newsom signed AB730 without comment. The bill prohibits the distribution of manipulated clips known as “deepfakes” within 60 days of an election, and it gives candidates the right to sue to stop their spread and to seek financial damages, though there are no criminal penalties. The law, which includes exceptions for media organizations and exempts images and audio that disclose they have been manipulated, will expire in 2023.

Assemblyman Marc Berman, D-Palo Alto, introduced the legislation this summer out of concern that deepfake technology — which uses artificial intelligence to create images that can make someone appear to say or do something they did not — could be deployed on a mass scale to influence the 2020 presidential election.

He said the law would also cover more crudely edited clips intended to falsely portray a candidate, such as the viral Facebook video of a speech this year by House Speaker Nancy Pelosi, D-San Francisco, that was slowed down to make her seem drunk or otherwise impaired. Pelosi got into a feud with Facebook after the company refused to take down the video.

“Voters have a right to know when video, audio and images that they are being shown to try to influence their vote in an upcoming election have been manipulated and do not represent reality,” Berman said in a statement. “In the face of total inaction at the federal level, California must step up to protect our more than 20 million voters.”

Newsom also signed another Berman bill, AB602, which gives Californians the right to sue someone who creates a deepfake of them in a sexually explicit situation without their permission.