THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes: sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed Congress’ upper chamber on Tuesday. The bill has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), along with Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive deepfake pornography if they “knew or recklessly disregarded” the fact that the victim did not consent to those images.

  • JovialMicrobial@lemm.ee

    Look, I’m a professional artist. The general rule is you have to change something 15% to 30% (depending on location) for it not to violate copyright law. That’s why you see satirical depictions of brands in cartoons and such.

    This new law has to take into consideration art laws, defamation laws, revenge porn laws, slander laws, and the right for a person to own their likeness.

    It is absolutely necessary to rein this in before serious harm is done to someone. The point of writing a law to address this specific issue is that for the law to actually be effective, it must be written to address the specific problem this technology presents. I listed the other laws to show it’s consistent with ones we already have. There’s nothing wrong with adding another to protect people.

    As for the unskilled part, the point of that is a skilled person creating deepfake porn by hand, frame by frame, should get in as much trouble as an unskilled person using AI. The AI is just going to make it so more unethical people are making this crap… so more of it will exist. That’s a problem that needs addressing.

    You have a nice day now.