Sorry for the short post, I’m not able to make it nice with full context at the moment, but I want to quickly get this announcement out to prevent confusion:

Unfortunately, people are uploading child sexual abuse images on some instances (apparently as a form of attack against Lemmy). I am taking steps to prevent such content from making it onto lemm.ee servers. As one preventative measure, I am disabling all image uploads on lemm.ee until further notice - this is to ensure that lemm.ee cannot be used as a gateway to spread CSAM into the network.

It will not be possible to upload any new avatars or banners while this limit is in effect.

I’m really sorry for the disruption, it’s a necessary trade-off for now until we figure out the way forward.

  • ScrollinMyDayAway@lemm.ee · 1 year ago

    This is sick. Kudos to mods for dealing with this garbage. I hope the posters are all hunted down and punished.

  • TheAndrewBrown@lemm.ee · 1 year ago

    I think this is a great move until we have something rock solid to prevent this. There are tons of image hosting sites you can use (most of which have the resources to already try to prevent this stuff) so it shouldn’t really cause much inconvenience.

  • Cris@lemm.ee · 1 year ago

    I know there are automated tools that exist for detecting CSAM. Given the challenges the fediverse has had with this issue, it really feels like it'd be worthwhile for the folks developing platforms like Lemmy and Mastodon to start thinking about how to integrate those tools into their platforms to better support moderators and folks running instances.
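The tooling this comment refers to usually relies on perceptual hashing: uploads are hashed and compared against databases of known-bad material (PhotoDNA is the best-known example). Below is a minimal, purely illustrative Python sketch of the idea, using a toy "average hash" over made-up pixel grids - not a real detection system:

```python
# Illustrative sketch of perceptual ("average") hashing, the family of
# techniques detection tools build on. The 2x2 pixel grids are made up;
# real systems use vetted hash databases, not toy values like these.

def average_hash(pixels):
    """Compute a simple average hash from a grayscale pixel grid.

    Each bit is 1 if the pixel is brighter than the mean, else 0.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Two nearly identical toy images hash to nearby values; a small
# hamming distance to a known-bad hash would flag the upload.
img1 = [[10, 200], [200, 10]]
img2 = [[12, 198], [201, 9]]
print(hamming(average_hash(img1), average_hash(img2)))  # → 0
```

Real deployments use far more robust hash functions and curated databases maintained by organizations like NCMEC; the point is only that near-duplicate images produce near-identical hashes, so matching can be automated at upload time.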

  • Io Sapsai 🌱@lemm.ee · 1 year ago

    This is really sad and disgusting. It affects the whole platform but especially smaller instances that can’t keep up. Despite being a lemm.ee user, I was particularly upset about thegarden.land shutting down because of that spam. It had my favourite gardening community on here.

    I really hope this gets sorted out, and the spammers end up where they belong.

    • pomodoro_longbreak@sh.itjust.works · 1 year ago

      I’d really love to start a small instance just to play host to a couple of niche interests I don’t see around yet, but yeah, hearing about this fucked up behavior is making me hold off.

      It has a real chilling effect on users, which is so unfortunate for a platform that is mostly made up of well-meaning people.

  • Anonymousllama@lemmy.world · 1 year ago

    Perfectly fine. People can upload images elsewhere and then just link to them. Most image upload sites will have all those protections in place already. A good stopgap until Lemmy gets those mod tools

    • infinipurple@lemm.ee · 1 year ago

      Honestly, some people are just the worst. Why on earth anyone would waste their time doing something so vile is absolutely beyond me…

      • HelloHotel@lemm.ee · 1 year ago

        If someone enjoys the twisted pain inflicted on children, then showing off their plunder to inflict pain that makes most adults reach for the eye-bleach is, to them, well-executed revenge on the people they don’t like.

  • randint@lemm.ee · 1 year ago

    It’s honestly sad that some well-intentioned laws can be used to attack online platforms.

    • Throwaway@lemm.ee · 1 year ago

      I kinda wonder though: how would you go about making a law against CP that doesn’t hurt small sites like lemm.ee?

      • PM_Your_Nudes_Please@lemmy.world · 1 year ago

        The issue is that you really can’t. The laws are written specifically to prevent plausible deniability. Because pedos would be able to go “lol a troll sent it to me” and create some doubt in a jury. Remember that (at least in America) the threshold for conviction is supposed to be “beyond a reasonable doubt.” So if laws were focused on intent, all the pedos would need to do is create reasonable doubt, by arguing that they never intended to view/own the CSAM.

        This was particularly popular in the Napster/Limewire days, when trolls would upload CSAM under innocuous titles, so people looking for the newest episode of their favorite show would find CSAM instead. You could literally find CSAM titled things like “Friends S10E9” because trolls were going for the shock factor of an innocent person opening a video only for it to end up being hardcore CSAM. Lots of actual pedos tried using the “I downloaded it by accident” defense.

        So instead, the laws are written to close that loophole. It doesn’t matter why you have the CSAM. All that matters is you have it. The feds/courts won’t give a fuck if it was due to you seeking it out or if it was due to a bad actor sending it to you.

          • PM_Your_Nudes_Please@lemmy.world · 1 year ago

            And that’s pretty much where we are now. Bad actors creating bot accounts on multiple instances, to spam the larger (most popular) instances with CSAM.

          • ZodiacSF1969@sh.itjust.works · 1 year ago

            I think they have oversimplified the situation to the point that it is wrong.

            1. Arguably, Lemmy instance providers (depending on where they live) are protected in the same way Facebook or other content hosts are. So long as you are acting in good faith you are protected against any illegal content your users upload. This does mean you need to remove illegal content as you become aware of it, you can’t just ignore what your users are doing.

            2. There have been cases where although a user technically ‘possessed’ CSAM, it was shown that they did so unknowingly via thumbnails or it being cached. The police do investigate where it came from. It’s not as simple as just sending it to someone and you can have them convicted.

        • ZodiacSF1969@sh.itjust.works · 1 year ago

          Lemmy instances are likely already protected in many countries legally so long as they act in good faith, ie actively moderate.

    • hemko@lemmy.dbzer0.com · 1 year ago

      Fuck the legal part, I wouldn’t want to stay on platform infested with cp. Thank you so much for all the awesome people combating this <3

      • HelloHotel@lemm.ee · 1 year ago

        It’s a bug in somebody’s markdown parsing.

        Your URL was somehow HTML escaped.

        Correct: …?width=640&height=480
        Incorrect: …?width=640&amp;height=480
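The broken form is a single layer of HTML entity escaping applied to the query string. As an illustration (the URL below is a made-up example), Python's standard library can undo it:

```python
from html import unescape

# A URL whose "&" was HTML-escaped to "&amp;" by a buggy renderer.
broken = "https://example.com/image.png?width=640&amp;height=480"
fixed = unescape(broken)
print(fixed)  # → https://example.com/image.png?width=640&height=480
```

The real fix belongs in whichever markdown renderer escaped the URL before handing it to the link, but unescaping recovers a usable link in the meantime.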

        • Sotuanduso@lemm.ee · 1 year ago

          I uh… don’t know what you mean there. I was just pointing out that the image I posted is hosted externally, so it doesn’t mean I found a bypass to the disabled uploads. It displays fine on the website.

          • HelloHotel@lemm.ee · 1 year ago

            Some software somewhere has a bug in it and it broke your link; I’m sorry if I failed to communicate that.

  • Holodeck_Moriarty@lemm.ee · 1 year ago

    This might be a good thread to ask:

    Does anyone know if any of the Lemmy apps support direct imgur uploads for Lemmy?

    I remember RIF used to do that for reddit back in the day before reddit supported direct image hosting.

    • lagomorphlecture@lemm.ee · 1 year ago

      I’m going to go out on a limb and say they and all the other instances that were hit with this attack probably did. Which authorities, I don’t know. If this instance is hosted in Estonia then probably Estonian authorities, but it’s probably being hosted on the cloud so is it REALLY hosted in Estonia? There are a ton of American and EU users so hopefully the FBI and whatever the EU equivalent is. But honestly cybercrimes can get confusing because of the nature of people and hosting being spread out all over the world and it can be hard to even figure out who to report to.

      • infinipurple@lemm.ee · 1 year ago

        Europol in Europe. But you can report it to your national cybercrime division and they can refer it to the appropriate authority if necessary.

    • coffee@lemm.ee · 1 year ago

      I don’t think they made it onto this server; with the 100 kB upload limit in place, that was already a rather low risk. It’s a preventive measure. So far lemmy.world was the one deliberately targeted.

    • redballooon@lemm.ee · 1 year ago

      There’s no need to invoke conspiracy. This is entirely possible for a single person to do, and an individual’s motivations may be very petty even if the consequences are widely visible.

      One misguided teenager on a power trip who enjoys how much disruption he can cause is enough for such an effect.

  • comfortablyglum@sh.itjust.works · 1 year ago

    Thank you for the efforts you are making. This is a serious situation; more than just dealing with bad actors, you are viewing traumatic images.

    Please, for your sanity and well being, prioritize your self care. Things like this linger in the psyche much longer than you would expect.

  • GenBlob@lemm.ee · 1 year ago

    That’s fucking disgusting. Take any measures you can to prevent that shit from being on the site.

  • iByteABit [he/him]@lemm.ee · 1 year ago

    This is a very good decision. I’ve worried about this problem from the moment I first learned about the Fediverse. Research must definitely be done to find CSAM detection tools that integrate with Lemmy; perhaps we could make a separate bridge repo that integrates a tool like that easily into the codebase.
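As a rough sketch of what such a bridge's moderation hook might look like (the hash values and the 5-bit threshold below are made-up placeholders; a real integration would query a vetted hash database):

```python
# Hypothetical moderation hook: reject an upload whose perceptual hash
# is close to any known-bad hash. All values here are placeholders.

def hamming(a: int, b: int) -> int:
    """Count differing bits between two perceptual hashes."""
    return bin(a ^ b).count("1")

def is_blocked(upload_hash: int, blocklist: list[int], threshold: int = 5) -> bool:
    """Flag an upload if its hash is within `threshold` bits of a known-bad hash."""
    return any(hamming(upload_hash, bad) <= threshold for bad in blocklist)

blocklist = [0b1111000011110000]  # placeholder known-bad hash
print(is_blocked(0b1111000011110001, blocklist))  # near match  → True
print(is_blocked(0b0000111100001111, blocklist))  # unrelated   → False
```

A hook like this would run server-side before the image store accepts the file, so flagged uploads never reach federation.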

    I hope every disgusting creature that uploads that shit gets locked up