A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students, also teen girls, who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, more than in every previous year combined.

    • @DogMuffins@discuss.tchncs.de
      17 months ago

      For the love of everything holy. This is not how grown-ups discuss things. Make your point and stop asking dumb questions.

      As you well know, no one is directly harmed by the simple act of someone viewing AI-generated porn that does not depict a real person.

      That said, the law in my jurisdiction does not distinguish between real and fictional depictions. If an image (even hentai) depicts sexual abuse of a minor, then it’s CSAM. How do you know whether the depicted person is a minor? That’s a question for a jury. I’m sure there are arguments against this position, but its merits are obvious: you don’t need to quibble over whether an image depicts a real person. If it’s CSAM, it’s illegal.