Sorry for the short post, I’m not able to make it nice with full context at the moment, but I want to quickly get this announcement out to prevent confusion:
Unfortunately, people are uploading child sexual abuse images on some instances (apparently as a form of attack against Lemmy). I am taking some steps to prevent such content from making it onto lemm.ee servers. As one preventative measure, I am disabling all image uploads on lemm.ee until further notice - this is to ensure that lemm.ee cannot be used as a gateway to spread CSAM into the network.
It will not be possible to upload any new avatars or banners while this limit is in effect.
I’m really sorry for the disruption, it’s a necessary trade-off for now until we figure out the way forward.
A user posted a Python tool they had already been working on that automates the detection and deletion of potential CSAM on Lemmy servers, which admins could use until better tools come along. Unfortunately, I don’t remember who posted it or where I saw it in my travels yesterday.
Was it this post by db0?
https://lemmy.ml/post/4027478
Yep, that’s the one. Thanks!
You’re welcome
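For admins wondering what that kind of automation looks like in broad strokes, here is a minimal sketch of the general idea - not db0’s actual tool. It walks an upload directory and quarantines anything a classifier flags. The paths and the `looks_like_csam()` hook are hypothetical placeholders; a real deployment would point at its own pict-rs storage and plug in an actual detection model or scanning service. Quarantining rather than deleting outright is deliberate, so flagged files can still be reviewed and reported.

```python
#!/usr/bin/env python3
"""Minimal sketch: scan uploaded images and quarantine anything a
classifier flags. Not db0's tool -- just an illustration of the idea."""

from pathlib import Path
import shutil

# Hypothetical paths: adjust to wherever your pict-rs instance stores files.
UPLOAD_DIR = Path("/var/lib/pictrs/files")
QUARANTINE_DIR = Path("/var/lib/pictrs/quarantine")

IMAGE_SUFFIXES = {".jpg", ".jpeg", ".png", ".gif", ".webp"}


def looks_like_csam(image_path: Path) -> bool:
    """Hypothetical classifier hook. A real tool would call a detection
    model or an external scanning API here; this stub flags nothing."""
    return False


def scan_uploads() -> None:
    QUARANTINE_DIR.mkdir(parents=True, exist_ok=True)
    for image_path in UPLOAD_DIR.rglob("*"):
        if not image_path.is_file():
            continue
        if image_path.suffix.lower() not in IMAGE_SUFFIXES:
            continue
        if looks_like_csam(image_path):
            # Move the file out of the web-served directory rather than
            # deleting it outright, so admins can review/report it.
            shutil.move(str(image_path), QUARANTINE_DIR / image_path.name)
            print(f"quarantined: {image_path.name}")


if __name__ == "__main__":
    scan_uploads()
```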
Google has an API for it. While I am not a fan of Google, it is a widely used API.
Sounds like a useful back-pocket fallback/emergency tool - something for when your primary is failing or you need more help.
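Purely as an illustration of how an external scanning service could slot into that fallback role (for example behind the `looks_like_csam()` stub in the sketch above), here is a rough sketch. The endpoint URL, parameters, and response shape are invented placeholders, not Google’s real API surface - their Content Safety offering is access-gated and has its own interface.

```python
"""Rough sketch of calling an external image-scanning service as a
fallback check. The endpoint and response fields are invented
placeholders, not any real provider's API."""

import requests  # pip install requests


def scan_image_remote(image_bytes: bytes, api_key: str) -> bool:
    """Send image bytes to a hypothetical scanning endpoint and return
    True if the service flags the image."""
    resp = requests.post(
        "https://example.invalid/v1/scan-image",  # placeholder URL
        headers={"Authorization": f"Bearer {api_key}"},
        files={"image": ("upload", image_bytes)},
        timeout=30,
    )
    resp.raise_for_status()
    # Hypothetical response shape: {"flagged": true/false}
    return bool(resp.json().get("flagged", False))
```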