‘It scars you for life’: Workers sue Meta claiming viewing brutal videos caused psychological trauma::More than 20% of the staff Meta hired to check the violent content of Facebook and Instagram are on sick leave due to psychological trauma.
deleted by creator
It’s in the article.
It’s trivial to circumvent automatic detection
The EU now has a rule that all reports of content must be reviewed and verified for illegal material, including misinformation. They can’t automatically block reported content, because then people would weaponise reports. At best they can automatically block videos and images whose hashes have previously been verified as illegal, but exact hashes are trivial to circumvent. I think they’ve started using perceptual hashes, but those are far from perfect either.
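To illustrate why exact-hash blocklists are so easy to defeat: a cryptographic hash changes completely when even a single byte of the file changes. A minimal sketch (the byte strings are made-up placeholders, not real media, and this is not Meta's actual pipeline):

```python
import hashlib

# An exact-hash blocklist stores digests of known-illegal files
# and blocks any upload whose digest matches.
original = b"...stand-in for the bytes of a flagged video..."
tweaked = original + b"\x00"  # re-encode, pad, or flip one byte

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(tweaked).hexdigest()

# A one-byte change yields a completely different digest,
# so the altered copy sails past the blocklist.
print(h1 == h2)  # False
```

Perceptual hashes try to fix this by hashing what the image *looks like* rather than its raw bytes, so small edits produce similar hashes; the trade-off is false positives and negatives, which is why they are far from perfect.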
I believe they use similar moderation for the US to proactively head off potentially similar legislation to the EU.
Something like 3 billion people actively use Facebook each month, so there must be tens of millions of reports every day. I can only imagine the level of planning, staffing, and tooling required to handle that volume.
They need an AI to curate that kind of content then.
AI is far from perfect, and automated moderation alone is unlikely to satisfy the DSA’s review requirements.