Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image-generation apps. This shows, once again, that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet, but are actively promoted to users by social media companies that are unable or unwilling to enforce their policies about who can buy ads on their platforms.

Parent company Meta’s Ad Library, which archives ads on its platforms along with who paid for them and where and when they ran, shows that the company has taken down several of these ads before. Even so, many ads that explicitly invited users to create nudes, and some of the accounts that bought them, remained up until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.

  • Nobody@lemmy.world · 7 months ago

    It’s all so incredibly gross. Using “AI” to undress someone you know is extremely fucked up. Please don’t do that.

      • Nobody@lemmy.world · 7 months ago

        Behold my meaty, majestic tentacles. This better not awaken anything in me…

    • MonkderDritte@feddit.de · 7 months ago

      Same vein as “you should not mentally undress the girl you fancy”. It’s just a tool that supports that. Not that I have used it.

      Don’t just upload someone else’s image without consent, though. That’s even illegal in most of Europe.

      • MxM111@kbin.social · 7 months ago

        Why should you not mentally undress the girl you fancy (or anyone else, what difference does it make)? Where is the harm in it?

    • ???@lemmy.world · 7 months ago

      Would it be any different if you learn how to sketch or photoshop and do it yourself?

      • Kedly@lemm.ee · 7 months ago

        You say that as if photoshopping someone naked isn’t fucking creepy as well.

        • stebo@lemmy.dbzer0.com · 7 months ago

          Creepy, maybe, but tons of people have done it. As long as they don’t share it, no harm is done.

          • Kedly@lemm.ee · 7 months ago

            I don’t think that many have, dude. Sure, if you’re talking total number and not percentage, but this planet has so many people that you could also claim that tons of people are pedophiles too.

            • ???@lemmy.world · 7 months ago

              Lol, you’d be surprised… isn’t this one of those things people would do in private but never admit in public (because of people like you getting all touchy and creeped out by it)?

              • Kedly@lemm.ee · 7 months ago

                You say this like we SHOULDN’T be creeped out that you are digitally undressing someone without their permission

        • ???@lemmy.world · 7 months ago

          Creepy to you, sure. But let me add this:

          Should it be illegal? No, and good luck enforcing that.

          • Kedly@lemm.ee · 7 months ago

            You’re at least right on the enforcement part, but I don’t think the illegality of it should be as hard of a no as you think it is.

          • ???@lemmy.world · 7 months ago

            I think anyone claiming otherwise would most likely be lying.

          • Kedly@lemm.ee · 7 months ago

            There is a massive difference between thoughts and action. I’m sure a significant portion of us have thought about murdering someone too, does that make actually going through with murder less bad?

            • Drewelite@lemmynsfw.com · 7 months ago

              This is a false equivalency. The correct analogy would be: if I think about murdering someone and then draw a picture of it or make a movie about murdering them, is that wrong?

        • ???@lemmy.world · 7 months ago

          I am not saying anyone should do it and don’t need some internet stranger to police me thankyouverymuch.

      • KidnappedByKitties@lemm.ee · 7 months ago

        Consent.

        You might be fine with having erotic materials made of your likeness, and maybe even of your partners, parents, and children. But shouldn’t they have the right not to be objectified as wank material?

        I partly agree with you though; it’s interesting that making an image is so much more troubling than having a fantasy of them. My thinking is that it is external and real, and thus more permanent: it could be saved, lost, hacked, sold, used for defamation, and/or simply shared.

        • InternetPerson@lemmings.world · 7 months ago

          To add to this:

          Imagine someone sneaking into your home and stealing your shoes, socks, and underwear just to get off on that, or giving them to someone who does.

          Wouldn’t that feel wrong? Wouldn’t you feel violated? It’s the same with such AI porn tools. You serve to satisfy the sexual desires of someone else and you are given no choice. Whether you want it or not, you are becoming part of their act. Becoming an unwilling participant in such a way can feel similarly violating.

          They are painting and using a picture of you which is not how you would like to represent yourself. You don’t have control over this and thus feel violated.

          This reminds me of that fetish where one person is basically acting like a submissive pet and gets treated like one by their “master”. They get aroused by doing that in public, one walking with the other on a leash like a dog on hands and knees. People around them become passive participants in that spectacle, and those often feel violated. Becoming, unwillingly and unasked, a participant, either active or passive, in the sexual act of someone else, and having little or no control over it, feels wrong and violating to a lot of people.
          In principle that even shares some similarities with rape.

          There are countries where you can’t just take pictures of someone without asking them beforehand. Also there are certain rules on how such a picture can be used. Those countries acknowledge and protect the individual’s right to their image.

          • scarilog@lemmy.world · 7 months ago

            Just to play devils advocate here, in both of these scenarios:

            Imagine someone sneaking into your home and stealing your shoes, socks, and underwear just to get off on that, or giving them to someone who does.

            This reminds me of that fetish where one person is basically acting like a submissive pet and gets treated like one by their “master”. They get aroused by doing that in public, one walking with the other on a leash like a dog on hands and knees. People around them become passive participants in that spectacle, and those often feel violated.

            The person has the knowledge that this is going on. In the situation with AI nudes, the actual person may never find out.

            Again, not to defend this at all; I think it’s creepy af. But I don’t think your arguments were particularly strong in supporting the AI nudes issue.

            • CleoTheWizard@lemmy.world · 6 months ago

              In every chat I find about this, I see people railing against AI tools like this but I have yet to hear an argument that makes much sense to me about it. I don’t care much either way but I want a grounded position.

              I care about harms to people and in general, people should be free to do what they want until it begins harming someone. And then we get to have a nuanced conversation about it.

              I’ve come up with a hypothetical. Let’s say that you write naughty stuff about someone in your diary. The diary is kept in a secure place and in private. Then, a burglar breaks in and steals your diary and mails that page to whomever you wrote it about. Are you, the writer, in the wrong?

              My argument would be no. You are expressing a desire in private and only through the malice of someone else was the harm done. And no, being “creepy” isn’t an argument either. The consent thing I can maybe see but again do you have a right not to be fantasized about? Not to be written about in private?

              I’m interested in people’s thoughts, because it bugs me not to have a good answer to this argument.

              • Resonosity@lemmy.world · 6 months ago

                Yeah it’s an interesting problem.

                If we go down the path of ideas in the mind, and the representations we create and visualize in our mind’s eye, then forbidding people from conceiving of others sexually leaves no principled justification for conceiving of people at all.

                If we try to seek for a justification, where is that line drawn? What is sexual, and what is general? How do we enforce this, or at least how do we catch people in the act and shame them into stopping their behavior, especially if we don’t possess the capability of telepathy?

                What is harm? Is it purely physical, or also psychological? Is there a degree of harm that should be allowed, or that is inescapable despite our best intentions?

                The angle that you point out regarding writing things down about people in private can also go different ways. I write things down about my friends because my memory sucks sometimes and I like to keep info in my back pocket for when birthdays, holidays, or special occasions come. What if I collected information about people that I don’t know? What if I studied academics who died in the past to learn about their lives, like Ben Franklin? What if I investigated my neighbors by pointing cameras at their houses, or installing network sniffers or other devices to try to collect information on them? Does the degree of familiarity with those people I collect information about matter, or is the act wrong in and of itself? And do my intentions justify my actions, or do the consequences of said actions justify them?

                Obviously I think it’s a good thing that we as a society try to discourage collecting information on people who don’t want that information collected, but there is a portion of our society specifically allowed to do this: the state. What makes their status deserving of this power? Can this power be used for ill and good purposes? Is there a level of cross collection that can promote trust and collaboration between the state and its public, or even amongst the public itself? I would say that there is a level where if someone or some group knows enough about me, it gets creepy.

                Anyways, lots of questions and no real answers! I’d be interested in learning more about this subject, and I apologize if I steered the convo away from sexual harassment and violation. Consent extends to all parts of our lives, but sexual consent does seem to be a bigger problem given the evidence of this post. Looking forward to learning more!

                • CleoTheWizard@lemmy.world · 6 months ago

                  I think we’ve just stumbled on an issue where the rubber meets the road as far as our philosophies about privacy and consent. I view consent as important mostly in areas that pertain to bodily autonomy right? So we give people the rights to use our likeness for profit or promotion or distribution. And what we’re giving people is a mental permission slip to utilize the idea of the body or the body itself for specific purposes.

                  However, I don’t think that these things really pertain to private matters. Because the consent issue only applies when there are potential effects on the other person. Like if I talk about celebrities and say that imagining a celebrity sexually does no damage because you don’t know them, I think most people would agree. And so if what we care about is harm, there is no potential for harm.

                  With surveillance matters, the consent does matter because we view breaching privacy as potential harm. The reason it doesn’t apply to AI nudes is that privacy is not being breached. The photos aren’t real. So it’s just a fantasy of a breach of privacy.

                  So for instance if you do know the person and involve them sexually without their consent, that’s blatantly wrong. But if you imagine them, that doesn’t involve them at all. Is it wrong to create material imaginations of someone sexually? I’d argue it’s only wrong if there is potential for harm and since the tech is already here, I actually view that potential for harm as decreasing in a way. The same is true nonsexually. Is it wrong to deepfake friends into viral videos and post them on twitter? Can be. Depends. But do it in private? I don’t see an issue.

                  The problem I see is the public stuff. People sharing it. And it’s already too late to stop most of the private stuff. Instead we should focus on stopping AI porn from being shared and posted and create higher punishments for ANYONE who does so. The impact of fake nudes and real nudes is very similar, so just take them similarly seriously.

              • KidnappedByKitties@lemm.ee · 6 months ago

                What I find interesting is that for me personally, writing the fantasy down (rather than referring to it) is against the norm, a.k.a. weird, but not wrong.

                Painting a painting of it is weird and iffy, hanging it in your home is not ok.

                It’s strange how it changes along that progression, but I can’t rightly say why.

            • InternetPerson@lemmings.world · 6 months ago

              The person has the knowledge that this is going on.

              Not necessarily, no. They might just think they’ve misplaced their socks. If you’ve lived in an apartment building with shared laundry spaces, it’s not so uncommon to lose some minor pieces of clothing. But just because they don’t get to know about it, that doesn’t make it less wrong, or mean it should be less illegal.

              In the situation with AI nudes, the actual person may never find out.

              Also in connection with my remarks before:
              A lot of our laws also apply even if no one is knowingly damaged (yet). (This may of course depend on the legislation of wherever you live.)
              Merely intending to commit a crime can sometimes be reason enough to bring someone to court.
              We can argue about how much sense that makes, but as things stand, we as a society have decided that doing certain things should be illegal even if the damage has not manifested yet. And I see many good reasons to handle such AI porn tools the same way.

          • devfuuu@lemmy.world · 7 months ago

            Traumatizing rape victims with nonconsensual imagery of them naked and doing sexual things with others, and sharing it, is totally not going to fuck up society even more and lead to a bunch of suicides! /s

            Ai is the future. The future is dark.

            • Kedly@lemm.ee · 7 months ago

              tbf, the past and present are pretty dark as well

            • InternetPerson@lemmings.world · 6 months ago

              That’s why we need strong legislation. Most countries worldwide are missing crucial time frames for making such laws. At least some are catching up, as the EU did recently with its first AI Act.

        • Thorny_Insight@lemm.ee · 7 months ago

          it is external, real, and thus more permanent

          Though just like your thoughts, the AI is imagining the nude parts as well, because it doesn’t actually know what they look like. So it’s not actually a nude picture of the person; it’s that person’s face on an entirely fictional body.

          • KidnappedByKitties@lemm.ee · 7 months ago

            But the issue is not with the AI tool, it’s with the human wielding it for their own purposes which we find questionable.

      • Belastend@lemmy.world · 7 months ago

        An ex-friend of mine Photoshopped nudes of another friend, for private consumption. But then someone found that folder. And suddenly someone has to live with the thought that these nudes, created without their consent, were used as spank-bank material. It’s pretty gross, and it ended the friendship between the two.

        • Scrollone@feddit.it · 7 months ago

          You can still be wank material with just your Facebook pictures.

          Nobody can stop anybody from wanking on your images, AI or not.

          Related: Louis CK

          • Belastend@lemmy.world · 7 months ago

            That’s already weird enough, but there is a meaningful difference between nude pictures and clothed pictures. If you wanna whack one to my FB pics of me looking at a horse, ok, weird. Don’t fucking create actual nude pictures of me.

        • ???@lemmy.world · 7 months ago

          And if you have to say that, you’re already sounding like some judgy jerk.

        • MxM111@kbin.social · 7 months ago

          The fact that you do not even ask such questions shows that you are narrow-minded. That mentality leads to people thinking “homosexuality is bad” without ever asking why, and never having a chance of changing their mind.

          • ???@lemmy.world · 7 months ago

            They cannot articulate why. Some people just get shocked at “shocking” stuff… maybe some societal reaction.

            I do not see any issue in using this for personal consumption. Yes, I am a woman. And yes, people can have my fucking AI-generated nudes, as long as they never publish them online and never tell me about it.

            The problem with these apps is that they enable people to make these at scale and publish them freely wherever. That is where the danger lies. Not in people jerking off to a picture of my fucking cunt alone in a bedroom.

      • sugar_in_your_tea@sh.itjust.works · 7 months ago

        It’s creepy and can lead to obsession, which can lead to actual harm for the individual.

        I don’t think it should be illegal, but it is creepy and you shouldn’t do it. Also, sharing those AI images/videos could be illegal, depending on how they’re represented (e.g. it could constitute libel or fraud).

        • InternetPerson@lemmings.world · 7 months ago

          I disagree. I think it should be illegal. (And stay that way in countries where it’s already illegal.) For several reasons. For example, you should have control over what happens with your images. Also, it feels violating to become unwillingly and unasked part of the sexual act of someone else.

          • sugar_in_your_tea@sh.itjust.works · 7 months ago

            That sounds problematic though. If someone takes a picture and you’re in it, how do they get your consent to distribute that picture? Or are they obligated to cut out everyone but those who consent? What does that mean for news orgs?

            That seems unnecessarily restrictive on the individual.

            At least in the US (and probably lots of other places), any pictures taken where there isn’t a reasonable expectation of privacy (e.g. in public) are subject to fair use. This generally means I can use it for personal use pretty much unrestricted, and I can use it publicly in a limited capacity (e.g. with proper attribution and not misrepresented).

            Yes, it’s creepy and you’re justified in feeling violated if you find out about it, but that doesn’t mean it should be illegal unless you’re actually harmed. And that barrier is pretty high to protect peoples’ rights to fair use. Without fair use, life would suck a lot more than someone doing creepy things in their own home with pictures of you.

            So yeah, don’t do creepy things with pictures of other people; that’s just common courtesy. But I don’t think it should be illegal, because the implications of the laws needed to get there are worse than the creepy behavior of a small minority of people.

            • Couldbealeotard@lemmy.world · 7 months ago

              Can you provide an example of when a photo has been taken that breaches the expectation of privacy that has been published under fair use? The only reason I could think that would work is if it’s in the public interest, which would never really apply to AI/deepfake nudes of unsuspecting victims.

              • sugar_in_your_tea@sh.itjust.works · 7 months ago

                I’m not really sure how to answer that. Fair use is a legal term that limits the “expectation of privacy” (among other things), so by definition, if a court finds it to be fair use, it has also found that it’s not a breach of the reasonable expectation of privacy legal standard. At least that’s my understanding of the law.

                So my best effort here is tabloids. They don’t serve the public interest (they serve the interested public), and they violate what I consider a reasonable expectation of privacy standard, with my subjective interpretation of fair use. But I disagree with the courts quite a bit, so I’m not a reliable standard to go by, apparently.

                • Couldbealeotard@lemmy.world · 7 months ago

                  Fair use laws relate to intellectual property, privacy laws relate to an expectation of privacy.

                  I’m asking when has fair use successfully defended a breach of privacy.

                  Tabloids sometimes do breach privacy laws, and they get fined for it.

                    • sugar_in_your_tea@sh.itjust.works · 7 months ago

                    Right, they’re orthogonal concepts. If something is protected by fair use laws, then privacy laws don’t apply. If privacy laws apply, then it’s not fair use.

                      The proper discussion in this area is around libel law. That’s where tabloids are usually sued, not for fair use or “privacy violations.” For a libel suit to succeed, the plaintiff must prove that the defendant made false statements that caused actual harm to the plaintiff’s reputation. There are a bunch of lawsuits going on right now examining deep fakes and similarly allegedly libelous uses of an individual’s likeness. For a specific example, look at the Taylor Swift lawsuit around deep fake porn.

                      But the crux of the matter is that you don’t have a right to your likeness, generally speaking, and fair use law protects creepy use of legally acquired representations of your likeness.

      • misc@lemmy.sdf.org · 7 months ago

        Would you like it if someone were to make and wank to pictures like these of your kids, wife, or parents? The fact that you have to ask speaks volumes about you, though.

        • foggenbooty@lemmy.world · 7 months ago

          There are plenty of things I might not like that aren’t illegal.

          I’m interested in the thought experiment this has brought up, but I don’t want us to get caught in a reactionary fervor because of AI.

          AI will make this easier to do, but people have been clipping magazines, and celebrities have had Photoshop fakes created, for as long as both mediums have existed. This isn’t new, but it is being commoditized.

          My take is that these pictures shouldn’t be illegal to own or create, but they should be illegal to profit off of and distribute, meaning the tools specifically designed and marketed for this would be banned. If someone wants to tinker at home with their computer, you’ll never be able to ban that, and you’ll never be able to ban sexual fantasy.

          • misc@lemmy.sdf.org · 7 months ago

            I think it should be illegal, even Photoshops of celebs. They too are human and have emotions.

            • CaptainEffort@sh.itjust.works · 7 months ago

              I get that it’s creepy but that’s a dark path you want to walk down. Think about how that would have to be enforced.

              • misc@lemmy.sdf.org · 7 months ago

                I would say like how CP is enforced, as some of these AI fakes might even involve kids.

                • CaptainEffort@sh.itjust.works · 7 months ago

                  That’s a great example though, because truthfully digital cp is incredibly difficult to enforce, and some of the laws that have been proposed to make it easier have been incredibly controversial due to how violating they are to people’s privacy.

        • devfuuu@lemmy.world · 7 months ago

          The fact that people don’t realize how these things can be used for bad ends and weaponized is insane. It shows they are clearly not part of the vulnerable groups, and have the privilege of never having dealt with it.

          The future is amazing! Everyone with these apps going to the parks and making nudes of some kids. Or bullying, which totally doesn’t already happen in fucked-up ways with all the power of the internet.