A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students – also teen girls – who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

  • AVincentInSpace

    …who by definition is AI generated and does not, in fact, exist?

    • Basil@lemmings.world

      What? But they literally do exist, and they’re hurting from it. Did you even read the post?

    • Nyanix@lemmy.ca

      While you’re correct, many of these generators retain the source image and generate only the masked sections, so the person in the image is still the real person with effectively photoshopped nudity, which would still qualify as child pornography. That is an interesting point you make, though.

    • drislands@lemmy.world

      The article is about real children being used as the basis for AI-generated porn. This isn’t about entirely fabricated images.

    • DogMuffins@discuss.tchncs.de

      Of course they exist. If the AI-generated image “depicts” a person, a victim in this case, then that person “by definition” exists.

      Your argument evaporates when you consider that all digital images are interpreted and encoded by complex mathematical algorithms. By that definition, all digital images are “fake” and the people depicted in them do not exist. Try explaining that to your 9-year-old daughter.

          • AVincentInSpace

            That image was generated by AI.

            So do people in images that are purely AI-generated exist, or not?

            • DogMuffins@discuss.tchncs.de

              This is so tedious. If you have a point, then make it. Stop asking inane questions.

              So do people in images that are purely AI-generated exist, or not?

              This question is based on a false premise, as though the technology used to create an image is relevant to what it depicts.

              • If Michelangelo paints the likeness of a model, does the model in the image exist?
              • If a child draws a stick-figure likeness of their dad, does the dad in the image exist?
              • If you take a photo on your phone, and it uses complex mathematical algorithms to compress and later render the image, do the people in that image exist?
              • If you run a filter over that image on your phone, does that person still exist?

              Of course, in all cases the depicted person exists for all intents and purposes. You can argue that a painting is just an arrangement of pigments on canvas and you would be correct, but to everyone else it’s still a picture of a specific person.

              If you use a computer to generate an image that “looks like” a schoolmate doing some thing, then the argument that the person in the picture does not exist because the image was generated by AI is moot, because for all intents and purposes it’s a “picture of” that schoolmate doing that thing.

                • DogMuffins@discuss.tchncs.de

                  For the love of everything holy. This is not how grown ups discuss things. Make your point and stop asking dumb questions.

                  As you well know, no one is directly harmed by the simple act of someone viewing AI-generated porn that does not depict a real person.

                  That said, the law in my jurisdiction does not distinguish between real and not. If an image (even hentai) depicts sexual abuse of a minor, then it’s CSAM. How do you know whether the depicted person is a minor? That’s a question for a jury. I’m sure there are arguments against this position, but its merits are obvious: you don’t need to quibble over whether an image depicts a real person or not; if it’s CSAM, it’s illegal.

                  • AVincentInSpace

                    Then why did you say that there was no difference realism-wise between an image generated by AI and an image generated by a camera?

        • Taco@lemmy.zip

          You fucking dunce. You did not read the article. People have been taking real pictures of real children and using AI to remove their clothes. The real person is still in the image.