A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite limitations in the data and the difficulty of measuring deterrence directly, the chatbot showed promise in discouraging illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes that similar measures will be adopted across other platforms to create a safer internet environment.
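
The mechanism the trial used is, at its core, a search-term intercept: each query is matched against a curated list of flagged terms, and on a hit the platform suppresses results and shows the warning and chatbot instead. Below is a minimal sketch of that flow, assuming a simple denylist; the placeholder terms, function names, and response structure are illustrative assumptions, not Pornhub’s actual implementation.

```python
# Minimal sketch of a deterrence intercept on a search endpoint.
# FLAGGED_TERMS, HELP_MESSAGE, and run_search() are illustrative
# placeholders, not the real system.

FLAGGED_TERMS = {"flagged-term-1", "flagged-term-2"}  # curated denylist

HELP_MESSAGE = (
    "Searching for this kind of material is illegal. "
    "Confidential, anonymous support is available."
)

def run_search(query: str) -> list[str]:
    """Placeholder for the platform's real search backend."""
    return []

def handle_search(query: str) -> dict:
    """Return normal results, or a deterrence warning plus a chatbot trigger."""
    tokens = set(query.lower().split())
    if tokens & FLAGGED_TERMS:
        # Suppress all results and surface the warning/chatbot UI instead.
        return {"results": [], "warning": HELP_MESSAGE, "show_chatbot": True}
    return {"results": run_search(query), "warning": None, "show_chatbot": False}

if __name__ == "__main__":
    print(handle_search("flagged-term-1"))   # warning + chatbot
    print(handle_search("ordinary search"))  # normal results
```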

  • _cnt0@sh.itjust.works · 8 months ago

    Non-paywall link: https://web.archive.org/web/20240305000347/https://www.wired.com/story/pornhub-chatbot-csam-help/

    There’s this lingering implication that there is CSAM on Pornhub. Why bother with “searches for CSAM” if they don’t return CSAM results? And what exactly constitutes a “search for CSAM”? The article and the one it links to are incredibly opaque about that. Why target the consumer and not the source? This feels kind of backwards, like language policing that doesn’t really address the problem. What do they expect to happen if they prohibit specific words/language? That people searching for CSAM will just give up? Do they expect anything more than users changing the language they use, leading to a permanent cat-and-mouse game? I guess I share the sentiments that motivated them to do this, but it feels so incredibly pointless.

    • TheBlackLounge@lemm.ee · 8 months ago

      Lolicon is not illegal, and neither is giving your video a title that implies CSAM.

      That raises the question: what about pedophiles who intentionally seek out simulated CP to avoid hurting children?

        • CaptainEffort@sh.itjust.works · 8 months ago

          Which is, imo, pretty dumb. If it gives these people an outlet that literally hurts no one, I say they should be allowed to use it. Without it, they’ll just go to more extreme lengths to get what they need, and may end up in places where actual, real-life children are being abused, or worse.

          So while it’s still disgusting and I’d rather not think about it, if nobody’s being hurt then it’s none of my business. Let them get out their urges in a safe way that doesn’t affect anybody else.

          • afraid_of_zombies@lemmy.world · 8 months ago

            I imagine the concern is that it would look identical to the real thing, which blurs the lines. Kinda like how governments really hate it when toy makers make toy guns look too real, and why I have to tell airport security that I’d like my bag searched now, since there are homemade-looking electronic devices in it.

            I guess in theory some government could make a certification system, where legal simulated CP carries a digital watermark or something. But you know that would involve a government paying someone to review child porn for a living; kinda hard to sell that to the taxpayers, or to fill that role. Maybe the private sector would be willing to do it, but that is a big ask.

            I am not sure whether I agree with you or disagree. Maybe all of us would be better off if there were a legal and harmless way for pedos to get what they want. Or maybe it is bad to encourage it at all, even in a safe way; maybe consuming that stuff makes them more likely to seek out real children.

            It definitely isn’t a great situation; it would be great if the condition could be cured some day.
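
            The certification idea above is, at its core, a digital-signature scheme: a reviewing authority signs approved files, and anyone holding its public key can verify them before allowing distribution. A minimal sketch of that core using Ed25519 from Python’s `cryptography` package; note that a real watermark would have to be embedded robustly in the media itself (which this does not attempt), and every name here is hypothetical.

            ```python
            # Hypothetical sketch of the certification idea: an authority signs
            # approved files; platforms verify against its public key.
            # Requires: pip install cryptography
            from cryptography.exceptions import InvalidSignature
            from cryptography.hazmat.primitives.asymmetric.ed25519 import (
                Ed25519PrivateKey,
                Ed25519PublicKey,
            )

            def certify(file_bytes: bytes, authority_key: Ed25519PrivateKey) -> bytes:
                """The certifying authority signs the file's contents."""
                return authority_key.sign(file_bytes)

            def is_certified(file_bytes: bytes, signature: bytes,
                             public_key: Ed25519PublicKey) -> bool:
                """A platform checks the signature against the authority's public key."""
                try:
                    public_key.verify(signature, file_bytes)
                    return True
                except InvalidSignature:
                    return False

            # Usage: the authority signs once; any platform verifies with the public key.
            key = Ed25519PrivateKey.generate()
            sig = certify(b"approved file contents", key)
            assert is_certified(b"approved file contents", sig, key.public_key())
            assert not is_certified(b"tampered file contents", sig, key.public_key())
            ```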

            • YarHarSuperstar@lemmy.world · 8 months ago

              This covered a lot of my concerns and thoughts on the topic. I want these people to be able to seek help, and possibly even have a legal outlet that is not harming anyone, i.e. not even someone who has to view that shit for a living (so maybe we get AI to do it? IDK). It’s complicated, but I believe it’s similar to an addiction in some ways and should be treated as a health issue, assuming they haven’t hurt anyone and want help. This is coming from someone with health issues of my own, including addiction, and someone who is very empathetic and sympathetic to the struggles of anyone just trying to live better.

              • afraid_of_zombies@lemmy.world · 8 months ago

                I can’t even imagine the amount of money it would take for someone to pay me to watch and critique child porn for a living. I have literally been paid to fish a dead squirrel that was making the whole place stink out from underneath a trailer in July, and I would pick doing that professionally over watching that filth.

      • Clbull@lemmy.world · 8 months ago

        Depends on the jurisdiction. Indecent illustrations and ‘pseudo-photographs’ depicting minors are definitely illegal in the UK (Coroners and Justice Act 2009). Several US states are also updating their laws to clamp down on this.

        I’m also aware that it’s illegal in Switzerland because a certain infamous rule 34 artist fled his home country to evade justice for that very reason.

      • _cnt0@sh.itjust.works · 8 months ago

        As if anything on the internet weren’t tracked. If need be, people will resort to physically exchanging storage media.

        • Blueberrydreamer@lemmynsfw.com · 8 months ago

          But having that tracking shown to you has a very powerful psychological effect.

            It’s pretty well established that increasing the penalties for crimes does next to nothing to prevent those crimes. What does reduce crime rates is making people believe they are less likely to ‘get away with it’, for instance by publicizing how offenders were caught.

            Being confronted with your own searches is an immediate reminder that you are doing something illegal, and that you are not doing so unnoticed. That’s wildly different from abstractly knowing that you’re probably being tracked somewhere, by somebody, among billions of other people.

          • _cnt0@sh.itjust.works · 8 months ago

            And where is the quantification and qualification for that? Spoiler: it’s not in the article(s), and it’s not one Google search away. Does Nintendo succeed in stopping piracy with its show trials? If you have a look around here, it looks more like people are doubling down.

            • Blueberrydreamer@lemmynsfw.com · 8 months ago

              I mean, I know Google has been shitty lately, but Wikipedia isn’t hard to find: https://en.m.wikipedia.org/wiki/Deterrence_(penology)

              I’d wager Nintendo has put some fear into a few folks considering developing emulators, but that’s the only comparison to be made here. The lack of any real consequences for individuals downloading ROMs is why so many are happy to publicly proclaim their piracy.

              Now, I bet if Megaupload added an AI that checked user uploads for copyrighted titles and gave everyone trying to upload them a warning about possible jail time, we’d see a hell of a lot fewer ROMs and movies on Mega.
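
              The hypothetical upload check described above doesn’t even need AI in its simplest form: compare each upload’s hash against a list of known infringing files and return the warning on a match. The hash set and names below are illustrative; a real system would use perceptual hashing to catch re-encoded copies, which exact hashes miss.

              ```python
              # Illustrative sketch of the hypothetical warn-on-upload flow.
              # Exact SHA-256 matching only; re-encoded copies would evade it.
              import hashlib

              # Hashes of known copyrighted files (placeholder entry).
              KNOWN_INFRINGING_SHA256 = {
                  hashlib.sha256(b"some known pirated release").hexdigest(),
              }

              WARNING = (
                  "This file matches known copyrighted material. Uploading it may "
                  "expose you to legal penalties, including fines or jail time."
              )

              def check_upload(file_bytes: bytes) -> str | None:
                  """Return a warning if the upload matches a known file, else None."""
                  digest = hashlib.sha256(file_bytes).hexdigest()
                  return WARNING if digest in KNOWN_INFRINGING_SHA256 else None

              # Usage:
              print(check_upload(b"some known pirated release"))  # warning text
              print(check_upload(b"home video"))                  # None
              ```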

              • _cnt0@sh.itjust.works · 8 months ago

                > Now, I bet if Megaupload added an AI that checked user uploads for copyrighted titles and gave everyone trying to upload them a warning about possible jail time, we’d see a hell of a lot fewer ROMs and movies on Mega.

                It would simply make Megaupload obsolete. Sharing platforms come and go. If one distribution channel stops working, people will use (or create) another.

                • Blueberrydreamer@lemmynsfw.com · 8 months ago

                  Obviously most of Mega’s traffic is piracy; they have no interest in doing that. The point is that it’s an actual comparison, instead of the nonsense you brought up.

                  Of course no individual site is going to singlehandedly stop criminal acts. Glad you agree it would be exactly as effective as I suggested.

    • Jojo@lemm.ee · 8 months ago

      > Why target the consumer and not the source?

      If for no other reason than it doesn’t have to be either/or. If you can meaningfully reduce demand for a “product” as noxious as CSAM, you should expect the rate of production to slow. There are certainly efforts in place to prevent that production from ever being done, and to prevent it from being shared/hosted once it is, but I don’t think attempting to reduce demand in this way is going to hurt.

      • _cnt0@sh.itjust.works · 8 months ago

        Does it reduce the demand, though? Where are the measurements attesting to that? If history has shown one thing, it is that criminalizing things creates criminals. Did Prohibition stop people from making, trading, or consuming alcohol? How does this have any meaningful impact on the abuse of children? The article(s) completely fail to elaborate on that end. I’m missing the statistics/science here. What are the measuring instruments for assessing any form of success? Just that searches were blocked and people were shown some links?

        TL;DR: is this something with an actual positive impact, or just an exercise in virtue signaling and a waste of time and money? Blind “fixes” are rarely useful.

    • afraid_of_zombies@lemmy.world · 8 months ago

      Maybe liability, or pretending to help? That way they can claim later on: “we care about people struggling with this issue, which is why, when they search for terms related to it, we offer the help they need.” Kinda like how if you search for certain terms on Google, a suicide hotline pops up on top.

      Ok Google, just because I looked up some stuff about being sad in winter doesn’t mean I am planning to put a gun in my mouth.

      • _cnt0@sh.itjust.works · 8 months ago

        Yah, this feels more like a legal protection measure and virtue signaling. There’s absolutely no assessment of the efficiency, or even the efficacy, of the measures: at least not in the article or the ones it links to, and I couldn’t find anything substantial on it elsewhere.