The US Senate has unanimously passed a bill aimed at protecting victims of nonconsensual deepfake porn, giving them a 10-year statute of limitations to sue.
Deepfake porn has been a growing problem, especially with the rise of AI. The technology has made deepfakes so convincing that it can be almost impossible to tell whether an image is real or fake. As a result, victims often suffer emotional harm and humiliation, not to mention damage to their reputation if the deepfake is widely circulated without people realizing it's fake.
According to Engadget, the US Senate aims to tackle the problem with the Disrupt Explicit Forged Images and Non-Consensual Edits Act (DEFIANCE Act). The bill would give victims 10 years to sue for damages.
The DEFIANCE Act would give victims the ability to sue not only those who create nonconsensual deepfake porn, but also those who distribute and possess such material. Initial damages could be up to $150,000, with that figure going up to $250,000 if the incident also includes stalking, harassment, or attempted sexual assault.
Senate Majority Leader Chuck Schumer praised the bill's passage in his floor remarks:
As we know, AI plays a bigger role in our lives than ever before, and while it has many benefits, it’s also easier than ever to create sexually explicit deep fakes without a person’s consent. It is a horrible attack on someone’s privacy and dignity to have these fake images of them circulating online without recourse.
And this isn’t just some fringe issue that happens to only a few people—it’s a widespread problem. These types of malicious and hurtful pictures can destroy lives. Nobody is immune, not even celebrities like Taylor Swift or Megan Thee Stallion. It’s a grotesque practice and victims of these deep fakes deserve justice. And this is one of the examples of the AI guardrails I often talk about. AI is a remarkable technology that can spur incredible innovation, but we must pass guardrails to prevent its worst abuses from causing people grave harm.
By passing this bill, we are telling victims of explicit nonconsensual deepfakes that we hear them and we are taking action.
The House will likely vote on a companion bill. If it passes the House, it will go to President Biden to be signed into law.