THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes, meaning sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress’ upper chamber on Tuesday. It has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.) in the Senate, and by Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive the deepfake pornography, if they “knew or recklessly disregarded” the fact that the victim did not consent to those images.

  • j4k3@lemmy.world · 4 months ago

    Things have already been moving towards nobody models. I think this will eventually have the consequence of nobodies becoming the new somebodies, as it will result in a lot of very well-developed nobodies and move the community into furthering their development instead of the deepfake stuff. You’ll eventually be watching Hollywood quality feature films full of nobodies. There is some malicious potential with deepfakes, but the vast majority are simply people learning the tools. This will alter that learning target and the memes.

    • aesthelete@lemmy.world · 4 months ago

      You’ll eventually be watching Hollywood quality feature films full of nobodies.

      With the content on modern streaming services, I’ve been doing this for a while already.

  • dohpaz42@lemmy.world · 4 months ago

    To improve rights to relief for individuals affected by non-consensual activities involving intimate digital forgeries, and for other purposes.

    Congress finds that:

    (1) Digital forgeries, often called deepfakes, are synthetic images and videos that look realistic. The technology to create digital forgeries is now ubiquitous and easy to use. Hundreds of apps are available that can quickly generate digital forgeries without the need for any technical expertise.

    (2) Digital forgeries can be wholly fictitious but can also manipulate images of real people to depict sexually intimate conduct that did not occur. For example, some digital forgeries will paste the face of an individual onto the body of a real or fictitious individual who is nude or who is engaging in sexual activity. Another example is a photograph of an individual that is manipulated to digitally remove the clothing of the individual so that the person appears to be nude.

    “(3) DIGITAL FORGERY.—

    “(A) IN GENERAL.—The term ‘digital forgery’ means any intimate visual depiction of an identifiable individual created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means, including by adapting, modifying, manipulating, or altering an authentic visual depiction, that, when viewed as a whole by a reasonable person, is indistinguishable from an authentic visual depiction of the individual.

    Source

  • ArbitraryValue@sh.itjust.works · 4 months ago

    distribute, or receive the deepfake pornography

    Does this make deepfake pornography more restricted than real sexual images either created or publicly released without consent?

    • drislands@lemmy.world · 4 months ago

      I think so. In a way, it makes sense – a lot of people are of the (shitty) opinion that if you take lewd pictures of yourself, it’s your fault if they’re disseminated. With lewd deepfakes, there’s less opportunity for victim blaming, and therefore wider support.

  • MagicShel@programming.dev · 4 months ago

    Is this just “AI porn bill” because that’s the most common way of doing it these days? I should expect the product is what’s being sanctioned and not the method.

    • dohpaz42@lemmy.world · 4 months ago

      Not just AI…

      …any intimate visual depiction of an identifiable individual created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means….

      This probably also covers using MS Paint.

        • niucllos@lemm.ee · 4 months ago

          As long as it can’t be mistaken for the actual person, it moves the stigma from them doing weird things to the artist doing weird things.

      • youngalfred@lemm.ee · 4 months ago

        From the text of the bill:

        The term ‘digital forgery’ means any intimate visual depiction of an identifiable individual created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means, including by adapting, modifying, manipulating, or altering an authentic visual depiction, that, when viewed as a whole by a reasonable person, is indistinguishable from an authentic visual depiction of the individual.

        • ArbitraryValue@sh.itjust.works · 4 months ago

          Thanks. So photorealistic paintings are still legal, although I suppose they’re not a big problem in practice. It’s still weird that the method of creation matters, although “any other technological means” is pretty broad. Are paintbrushes a technology? Does using a digital camera to photograph a painting count as creating a visual depiction?

          I’m vaguely worried about the first-amendment implications.

          • youngalfred@lemm.ee · 4 months ago

            I think it comes down to the last part: indistinguishable by a reasonable person from an authentic visual depiction. That’ll be up to courts to decide, but I think a painting would pretty obviously not be an authentic visual depiction.

      • dohpaz42@lemmy.world · 4 months ago

        This bill targets digital (computer made or altered) forgeries. Not hand-drawn sketches.

  • youngalfred@lemm.ee · 4 months ago

    Question from an outsider:
    Do all bills in the States have to have a fancy acronym?
    It looks like the Senate is the first step, is that right? Next is the House? It’s the opposite where I am.

    • Irremarkable@fedia.io · 4 months ago

      Not all bills do, but the majority of big ones you hear about do. It’s simply a marketing thing.

      And correct, it’ll move to the House, where if it passes it will move to the president’s desk. Considering it was unanimous in the Senate, I can’t see it having any issues in the House.

  • finley@lemm.ee · 4 months ago

    Ok, first of all, politicians need to stop it with these acronyms for every law they want to pass. It’s getting ridiculous. Just give the damned law a regular-ass name. It doesn’t have to be all special and catchy-sounding, damn.

    Second, I’m really surprised to hear of anything passing the senate unanimously, other than a bill expressing the love of silly acronyms. And weak campaign finance laws.

    Anyway, I’m glad that at least something is being done to address this, but I just know someone in the House is gonna fuck this up.

    • catloaf@lemm.ee · 4 months ago

      I looked it up: https://www.congress.gov/bill/118th-congress/senate-bill/3696/all-actions

      It passed by unanimous voice vote. That is, the president of the Senate (pro tempore, usually Patty Murray) asked for all in favor, and at least one person said “yea” and when asked for all opposed, nobody said “nay”. There wasn’t a roll call vote, so we don’t know how many people (or who) actually voted for it.

      Edit: it was probably on C-Span, so you can probably find a recording and get an idea of how many people were there, and how many yeas you hear, if you’re so inclined.

  • Asifall@lemmy.world · 4 months ago

    Not convinced on this one

    It seems like the bill is being pitched as protecting women who have fake nudes passed around their school, but the text of the bill seems more aimed at the Taylor Swift case:

    1. The bill only applies where there is an “intent to distribute”

    2. The bill talks about damages being calculated based on the profit of the defendant

    The bill also states that you can’t label the image as AI generated or rely on the context of publication to avoid running afoul of this law. That seems at odds with the First Amendment.

    • UnderpantsWeevil@lemmy.world · 4 months ago

      The bill only applies where there is an “intent to distribute”

      That’s a predicate for any law bound to the Commerce Clause. You need to demonstrate the regulation is being applied to interstate traffic. Anything else would be limited to state/municipal regulations.

      The bill talks about damages being calculated based on the profit of the defendant

      That’s arguably a better rule than the more traditional flat-fee penalties, as it curbs the impulse to treat violations as cost-of-business. A firm that makes $1B/year isn’t going to blink at a handful of $1000 judgements.

      The bill also states that you can’t label the image as AI generated or rely on the context of publication to avoid running afoul of this law.

      A revenge-porn law that can be evaded by asserting “This isn’t Taylor Swift, it’s Tay Swiff, and any resemblance to an existing celebrity is purely coincidental” would be toothless. We already apply these rules to traditional animated assets. You’d be liable for producing an animated short starring “Definitely Not Mickey Mouse” under the same reasoning.

      This doesn’t prevent you from creating a unique art asset. And certainly there’s a superabundance of original pornographic art models and porn models generated with the consent of the living model. The hitch here is obvious, though. You’re presumed to own your own likeness.

      My biggest complaint is that it only seems to apply to pornography. And I suspect we’ll see people challenge the application of the law by producing “parody” porn or “news commentary” porn. What the SCOTUS does with that remains to be seen.

      • Asifall@lemmy.world · 4 months ago

        That’s arguably a better rule than the more traditional flat-fee penalties, as it curbs the impulse to treat violations as cost-of-business. A firm that makes $1B/year isn’t going to blink at a handful of $1000 judgements.

        No argument there, but it reinforces my point that this law is written for Taylor Swift and not a random high schooler.

        You’d be liable for producing an animated short staring “Definitely Not Mickey Mouse” under the same reasoning.

        Except that there are fair use exceptions specifically meant to prevent copyright law from running afoul of the First Amendment. You can see the parody exception used in many episodes of South Park, for example, even specifically to depict Mickey Mouse. Either this bill allows for those types of uses, in which case it’s toothless anyway, or it’s much more restrictive of speech than existing copyright law.

        • UnderpantsWeevil@lemmy.world · 4 months ago

          written for Taylor Swift and not a random high schooler.

          In a sane world, class action lawsuits would balance these scales.

          there are fair use exceptions specifically to prevent copyright law from running afoul of the first amendment

          Why would revenge porn constitute fair use? This seems more akin to slander.

          • Asifall@lemmy.world · 4 months ago

            You keep referring to this as revenge porn which to me is a case where someone spreads nudes around as a way to punish their current or former partner. You could use AI to generate material to use as revenge porn, but I bet most AI nudes are not that.

            Think about a political comic showing a pro-corporate politician performing a sex act with Jeff Bezos. Clearly that would be protected speech. If you generate the same image with generative AI, though, then suddenly it’s illegal, even if you clearly label it as a parody. That’s the concern. Moreover, the slander/libel angle doesn’t make sense if you include a warning that the image is generated, because you are not making a false statement.

            To sum up why I think this bill is kinda weird and likely to be ineffective, it’s perfectly legal for me to generate and distribute a fake ai video of my neighbor shooting a puppy as long as I don’t present it as a real video. If I generate the same video but my neighbor’s dick is hanging out, straight to jail. It’s not consistent.

            • UnderpantsWeevil@lemmy.world · 4 months ago

              where someone spreads nudes around as a way to punish their current or former partner

              I would consider, as an example, a student who created a vulgar AI porn display of another student or teacher out of some sense of spite to be an example of “revenge porn”. Same with a coworker or boss trying to humiliate someone at the office.

              Think about a political comic showing a pro-corporate politician performing a sex act with Jeff bezos.

              That’s another good example. The Trump/Putin kissing mural is a great example of something that ends up being homophobic rather than partisan.

              it’s perfectly legal for me to generate and distribute a fake ai video of my neighbor shooting a puppy

              If you used it to slander your neighbor, it would not be legal.

              • Asifall@lemmy.world · 4 months ago

                That’s another good example. The Trump/Putin kissing mural is a great example of something that ends up being homophobic rather than partisan.

                So you think it should be illegal?

                If you used it to slander your neighbor, it would not be legal.

                You’re entirely ignoring my point: I’m not trying to pass the video off as real, therefore it’s not slander.

                • UnderpantsWeevil@lemmy.world · 4 months ago

                  So you think it should be illegal?

                  I think it’s an example of partisan language that ends up being blandly homophobic.

                  You’re entirely ignoring my point

                  Why would putting up a giant sign reading “My neighbor murders dogs for fun” be a tort but a mural to the same effect be protected?

  • NauticalNoodle@lemmy.ml · 4 months ago

    Am I the only one that still gets uncomfortable every time the government tries to regulate advanced technology?

    It’s not a Libertarian thing to me as much as it’s a “politicians don’t understand technology” thing.

    • Todd Bonzalez@lemm.ee · 4 months ago

      Arguing against laws that prohibit sexual exploitation with high tech tools, because of the nature of technology, would be like arguing against laws that prohibit rape because of the nature of human sexuality.

      The “it’s still going to happen” argument doesn’t matter, because the point of the law isn’t to eliminate something 100%; it is to create consequences for those who continue to do what the law prohibits.

      It’s not some slippery slope either, it is extremely easy not to make involuntary pornography of other people.

      • UnderpantsWeevil@lemmy.world · 4 months ago

        it is extremely easy not to make involuntary pornography of other people.

        Eh. The term is ill-defined, so I can see some ultra-orthodox right-wing judge trying to argue that, say, jokes about JD Vance fucking a couch constitute violations of the revenge porn law. I can see some baroque Scalia-style interpretation used to prohibit all forms of digitally transmitted pornography. I can also see some asshole trying to claim baby pictures on Facebook leave the company, or even the parent, liable for child pornography. Etc., etc.

        But a lot of this boils down to vindictive and despicable politicians trying to inflict harm on political opponents by any means necessary. The notion that we can’t have any kind of technology regulation because bad politicians and sadistic cops exist leaves us ceding the entire legislative process to the conservatives who we know are going to abuse the law.

        We shouldn’t be afraid to do the right thing now on the grounds that someone else might do the wrong thing tomorrow.