• @30p87@feddit.de
      10 months ago

      How would one implement CSAM protection? You’d need an actual ML model to check for it, and I don’t think there are trained models available. And who would want to train such a model, and how? Also, running an ML model would be quite expensive in energy and hardware.

      • @NightAuthor@beehaw.org
        10 months ago

        There are models for detecting adult material; idk how well they’d work on CSAM though. Additionally, there are hash-matching systems for identifying known images; idk if they’re available to the public, but I know Apple has one.

        Idk, but we gotta figure out something
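
The hash-matching idea mentioned above can be sketched in a few lines. This is only a toy illustration of the general concept (compute a perceptual hash of an image, then compare it against a list of hashes of known material within some bit-distance tolerance); it is not PhotoDNA or Apple’s NeuralHash, and the "average hash" used here is far too weak for real moderation — real systems use much more robust, adversarially hardened hashes.

```python
def average_hash(pixels):
    """Toy 64-bit perceptual hash of an 8x8 grayscale grid:
    each bit is 1 where the pixel is above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(pixels, known_hashes, threshold=5):
    """Flag an image if its hash is within `threshold` bits of any
    hash in the known-material list."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= threshold for k in known_hashes)

# Usage: an 8x8 grayscale gradient and a slightly perturbed copy.
img = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
known = {average_hash(img)}

altered = [row[:] for row in img]
altered[0][0] += 3  # small perturbation, e.g. recompression noise
print(matches_known(altered, known))  # → True (near-duplicate still matches)
```

The point of the bit-distance threshold is that hash matching survives small edits (recompression, minor cropping) without needing any ML at inference time — which is why it’s so much cheaper to run than a classifier, but also why it only catches *known* images.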