A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident in which a teenage boy allegedly used artificial intelligence to create and distribute similar images of other students, also teen girls, who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

  • calypsopub@lemmy.world · 11 months ago

    So as a grown woman, I’m not getting why teenage girls should give any of this oxygen. Some idiot takes my head and pastes it on porn. So what? That’s more embarrassing for HIM than for me. How pathetic that these incels are so unable to have a relationship with an actual girl. Whatever, dudes. Any boy who does this should be laughed off campus. Girls need to take their power and use it collectively to shame and humiliate these guys.

    I do think anyone who spreads these images should be prosecuted as a child pornographer and listed as a sex offender. Make an example out of a few and the rest won’t dare to share it outside their sick incels club.

    • WoahWoah@lemmy.world · 11 months ago

      That’s fine and well. Except they are videos, and it is very difficult to prove they aren’t you. And the internet is forever.

      High school today isn’t like high school when you went.

      Agreed on your last paragraph.

      • Margot Robbie@lemmy.world · 11 months ago

        Then nude leak scandals will quickly become a thing of the past, because now every nude video or picture can be assumed to be AI-generated and treated as fake until proven otherwise.

        That’s the silver lining of this entire ordeal.

        Again, this is a content distribution problem more than an AI problem; the liability should fall on those who willingly host this deepfake content rather than on AI image generators.

        • finestnothing@lemmy.world · 11 months ago

          That would be great in a perfect world, but unfortunately public perception is significantly more important than facts when it comes to stuff like this. People accused of heinous crimes can and do lose friends and jobs and have their lives ruined even if they prove that they are completely innocent.

          Plus, something I’ve already seen happen: someone says a nude is fake and is then told they have to prove it’s fake to get people to believe them… which is very hard without sharing an actual nude that shows something unique about their body.

          • derpgon@programming.dev · 11 months ago

            The rest of the human body has more unique traits than the nude parts: freckles, birthmarks, scars, tattoos. Those are traits that are not possible to replicate unless the person specifically knows about them.

            Now that I think about it, we all probably need a tattoo. That should clear anyone instantly.

            • Llewellyn@lemm.ee · 11 months ago

              You can ask an AI to draw a blurred version of the tattoo, or to mask the tattooed area with, I don’t know, a piece of clothing or something.

            • WoahWoah@lemmy.world · 11 months ago

              Yes, I’m sure a hiring manager is going to involve themselves that deeply in the pornographic video your face pops up in.

              HR probably wouldn’t even allow a conversation about it. That person just never gets called back.

              And then the worst part is the jobs that DO hire you. Now you have to question why they are hiring you. Did they not see the fake porn video? Or did they see it?

              The entire thing is damaging and ugly.

              • derpgon@programming.dev · 11 months ago

                If you are already an employee, then they will want to keep you and look into the matter.

                If you are not an employee yet - is HR really looking up porn of everyone?

                  • derpgon@programming.dev · 11 months ago

                    I am pretty sure people who do porn use pseudonyms anyway. If HR thinks people use their real names and spread their porn on the internet, they are dumb for not realizing it’s fake. HR being HR, as always.

        • zbyte64@lemmy.blahaj.zone · 11 months ago

          Seems we’re partially applying market dynamics of supply and demand. Simply assuming the “surplus” supply of deep fakes will decrease their value ignores the fact that the demand is still there. Instead what we get is new value opportunities in the arms race of validating and distributing deep fakes.

        • toonicycle@lemmy.world · 11 months ago

          I mean they obviously shouldn’t have to, but if nude photos of you got leaked in your community, people would start judging you negatively, especially if you’re a young woman. Also in these cases where they aren’t adults it would be considered cp.

    • ILikeBoobies@lemmy.ca · 11 months ago

      So they do it and share it around to slut shame you

      You try to find a job and they find porn of you

      It’s a lot worse than you’re making it out to be when it’s not you that gets to make that decision

      • DogMuffins@discuss.tchncs.de · 11 months ago

        IMO the days of searching for porn of prospective employees are over. With the advent of AI generated porn, what would be the point of that?

        • Couldbealeotard@lemmy.world · 11 months ago

          There are so many recent articles linked on Lemmy about people losing their jobs over making porn. People are losing jobs over porn now more than ever.

          • DogMuffins@discuss.tchncs.de · 11 months ago

            Seriously? Maybe we don’t read the same stuff but that’s not something I’ve noticed.

            I just can’t imagine how that’s possible. I wish someone would fire me over porn so I could sue them for unfair dismissal as well as defamation and/or libel.

    • ExLisper@linux.community · 11 months ago

      I don’t think the problem is that the girls are ashamed of the fake porn. The problem is not even that other kids will believe it. The problem is that kids will use it to mock, bully and ostracise them. It’s not being shared as “OMG, you’re so hot, I made a fake sex tape with you, marry me.” It’s being shared as “you’re a slut who does porn, everyone thinks you’re a bitch, go kill yourself.”

      • calypsopub@lemmy.world · 11 months ago

        I see your point. In that way it’s just like any other bullying, though more personal. Unfortunately, society hasn’t done a good job of coming up with workable solutions for bullying. In this case, dragging the culprit behind the bleachers and letting the girls take turns kicking him in the nuts would be my go-to, but you can’t do that sort of thing anymore.

        • zbyte64@lemmy.blahaj.zone · 11 months ago

          Your response highlights how victims need the power of community to respond appropriately, and how society excuses some forms of violence (involuntary porn) but not others (women getting retribution).

      • Basil@lemmings.world · 11 months ago

        “So as a grown woman”

        Right? Literally not what’s being discussed. Obviously they’ll be more mature and reasonable about it. Teenagers won’t be

      • calypsopub@lemmy.world · 11 months ago

        I wasn’t very representative even when I WAS a teenager. I was bullied quite a bit, though.

        • atzanteol@sh.itjust.works · 11 months ago

          And can you imagine those bullies creating realistic porn of you and sharing it with everyone at school? You may have been strong enough to endure that - but it’s pretty unrealistic to expect everyone to be able to do so. And it’s not a moral failing if somebody is unable to. This is the sort of thing that leads to suicides.

    • foo@programming.dev · 11 months ago

      What if the deep fake was so real it was hard to tell? What if it was highly invasive and humiliating? Can you see the problem?

      • DogMuffins@discuss.tchncs.de · 11 months ago

        I think that the point this comment is trying to make is that because it has become so easy to make these images, their existence is not very meaningful. All deep fakes are very realistic. You can’t tell fakes from originals.

        Like as an adult, if I saw an “offensive” image of a co-worker, my first assumption would be that it’s probably AI generated, my first thought would be “which asshole made this image” rather than “I can’t believe my co-worker did [whatever thing]”.

      • calypsopub@lemmy.world · 11 months ago

        Not really. The more extreme it is, the more easily people will believe you when you say it’s a deep fake. Everyone who matters (friends and family) will know it’s not you. The more this sort of thing becomes commonplace, the more people will simply shake their heads and move on.

        • mrsgreenpotato@discuss.tchncs.de · 11 months ago

          People kill themselves over much more mundane things than this. I think you overestimate teenagers, unfortunately; not everyone can handle it as lightly as you would. Telling people to just “shake it off” simply will not work most of the time.

          • calypsopub@lemmy.world · 11 months ago

            Sadly, you have a point. Somebody with good support at home and a circle of friends can weather this sort of thing, but others may feel helpless or hopeless. There needs to be an effective place to turn to for kids who are being bullied. Unfortunately that doesn’t seem to exist.

        • ParsnipWitch@feddit.de · 11 months ago

          That depends on how a specific person is seen and treated by their surroundings.

          A teenage girl who is already a victim of harassment or bullying, for example, will be treated very differently when humiliating images of her surface in her peer group, compared with someone who is well liked in school.

          People who do this have to be judged much more harshly. This can’t become the next item on the list of common sexual harassment experiences every girl and woman “has to” go through.