• lichtmetzger@feddit.de · 125 points · 1 year ago (edited)

    On feddit.de, lemmy.world is only temporarily defederated because of CSAM until a patch is merged into Lemmy that prevents images from being downloaded to your own instance.

    So I’ll just be patient and wait. It’s understandable that the admins don’t want to run into trouble with law enforcement.

    • Gormadt@lemmy.blahaj.zone · 32 points · 1 year ago

      Makes quite a bit of sense

      Depending on jurisdiction it can be pretty hairy if your instance downloads it

      IANAL, but I’m pretty sure that in the US you have a “duty to report”, and you can have legal protections if you end up receiving it and then reporting it

      But IANAL so I’d recommend looking into it with an actual lawyer if you run a website that hosts content

    • cadekat · 22 points · 1 year ago

      Won’t that lead to some horrible hug-of-death type scenarios if a post from a small instance gets popular on a huge one?

      • CoderKat@lemm.ee · 41 points · 1 year ago

        Yes, but arguably it was never very scalable for federated software to store large media. It gets utterly massive quickly. Third-party image/video hosts that specialize in hosting those things can do a better job. And honestly, that’s the kinda data that is just better suited for centralization. Many people can afford to spin up a server that mostly just stores text and handles basic interactions. Large images or streaming video get expensive fast, especially if the site were ever to get even remotely close to Reddit levels.
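
        To put rough numbers on “expensive fast” (every figure below is assumed, just to illustrate the scale gap between media and text):

        ```python
        # Back-of-envelope, all figures assumed: media vs. text storage growth.
        images_per_day = 5_000        # assumed cached-image rate
        avg_image_mb = 2              # assumed average size
        media_gb_per_year = images_per_day * avg_image_mb * 365 / 1_000
        print(f"media: ~{media_gb_per_year:,.0f} GB/year")   # ~3,650 GB

        comments_per_day = 50_000     # assumed; text is tiny by comparison
        avg_comment_kb = 1
        text_gb_per_year = comments_per_day * avg_comment_kb * 365 / 1_000_000
        print(f"text:  ~{text_gb_per_year:.0f} GB/year")     # ~18 GB
        ```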

        • cadekat · 8 points · 1 year ago

          If you’re only responsible for caching for your own users, you don’t unduly burden smaller instances.

        • 30p87@feddit.de · 3 points (1 downvote) · 1 year ago

          How would one actually implement CSAM protection? You’d need real ML to check for it, and I don’t think there are trained models available. And then you’d have to find someone willing to train such a model, somehow. Also, running an ML model would be quite expensive in energy and hardware.

          • NightAuthor@beehaw.org · 2 points · 1 year ago

            There are models for detecting adult material; idk how well they’d work on CSAM, though. Additionally, there are hash-matching systems that identify known images; idk if they’re available to the public, but I know Apple has one.

            Idk, but we gotta figure out something
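
            The hash-matching idea is roughly this, as a minimal sketch. The blocklist here is hypothetical; real systems like Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding, and access to their hash lists is restricted:

            ```python
            import hashlib

            # Hypothetical blocklist of hex digests of known images. A plain
            # SHA-256 only catches byte-identical copies; real systems use
            # perceptual hashes and vetted, access-controlled hash lists.
            KNOWN_BAD_HASHES: set[str] = set()  # would be loaded from a vetted source

            def should_reject(image_bytes: bytes) -> bool:
                """True if the upload's hash matches a known-bad entry."""
                return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_HASHES
            ```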

      • kate@lemmy.uhhoh.com · 3 points · 1 year ago

        Maybe a system where the files federate after 3 upvotes from outside the original instance?

        • parlaptie@feddit.de · 8 points · 1 year ago

          That’d still be exploitable. You could just run 3 of your own instances. Coming up with a system that stops malicious users and can’t itself be gamed would be tricky.

          • cadekat · 2 points · 1 year ago

            Caching only if some number of your own users upvote might work.
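
            Something like this, as a sketch of the gating logic. The threshold and the Post/Voter types are made up for illustration; this isn’t anything Lemmy actually implements:

            ```python
            from dataclasses import dataclass, field

            @dataclass
            class Voter:
                name: str
                instance: str  # home instance of the voting user

            @dataclass
            class Post:
                media_url: str
                upvoters: list[Voter] = field(default_factory=list)

            LOCAL_UPVOTE_THRESHOLD = 3  # arbitrary, echoing the "3 upvotes" upthread

            def should_cache_media(post: Post, local_instance: str) -> bool:
                """Cache remote media only after enough of *our own* users upvote,
                so a ring of attacker-run instances can't force the download."""
                local_votes = sum(v.instance == local_instance for v in post.upvoters)
                return local_votes >= LOCAL_UPVOTE_THRESHOLD
            ```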

    • Cyanogenmon@lemmy.world · 8 points · 1 year ago

      This is what I’m waiting for before I host my own as well. I’d rather not have to worry so much about constantly having to moderate CSAM off my instance.

      • Monkey With A Shell@lemmy.socdojo.com · 7 points · 1 year ago (edited)

        Not to shill, but I just found the other day that Cloudflare has a CSAM scanning and reporting engine built into their proxies. In theory it gives them a window into the data stream, since they decrypt and re-encrypt traffic and could snatch a password hash, but 2FA makes that useless after a minute. Basically, it scans anything that gets put in the cache, reports it, notifies you to pull it down, and automatically puts up a 451 block on the link.
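
        For reference, the 451 block is just HTTP status 451 (“Unavailable For Legal Reasons”, RFC 7725). A minimal sketch of serving it for flagged paths, with a made-up blocklist (not Cloudflare’s actual mechanism):

        ```python
        from http.server import BaseHTTPRequestHandler, HTTPServer

        BLOCKED_PATHS = {"/pictrs/image/flagged.jpg"}  # made-up flagged link

        class Handler(BaseHTTPRequestHandler):
            def do_GET(self):
                if self.path in BLOCKED_PATHS:
                    self.send_response(451)  # Unavailable For Legal Reasons
                    self.end_headers()
                    self.wfile.write(b"Removed pending legal review\n")
                else:
                    self.send_response(200)
                    self.end_headers()
                    self.wfile.write(b"ok\n")

        if __name__ == "__main__":
            HTTPServer(("", 8080), Handler).serve_forever()
        ```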

    • OnU@lemm.ee · 3 points · 1 year ago

      Feddit is defederated from so many instances that it’s practically unusable for me.