‘It scars you for life’: Workers sue Meta claiming viewing brutal videos caused psychological trauma::More than 20% of the staff Meta hired to check the violent content of Facebook and Instagram are on sick leave due to psychological trauma.
Sounds like the perfect job for AI
I feel sorry for whichever researchers are in charge of training and fine tuning those models… ouch
Maybe they still have the content that got removed because of that, you might be able to train an AI just on that. That way they don’t need to manually check it, it’s already been done after all.
Exactly, and even if the content uploader disagrees and requests human oversight, that's just one image that needs to be checked rather than all of them. AI may even be able to blur the most brutal and extreme parts of footage and create written transcripts of the audio. You don't need 4K resolution and audible screaming to understand that someone is being murdered or raped.
True, didn't think about that. Blurring and transcripts are also way more reliable, so it should work correctly almost every time.
They used AI to flag the images, but a human still had to search through them.
I am of the kind that is very wary with what should or should not be an AI’s job, and you know what, in this very particular case, I think I agree.
At least as a first filter, anyway.
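The "first filter" idea above is basically a triage step: let a classifier handle the confident cases automatically, and only route the ambiguous middle band to a human. A minimal sketch of that routing logic, where the score function and thresholds are purely hypothetical stand-ins for a real moderation model:

```python
# Hypothetical triage for an "AI as first filter" moderation pipeline.
# A classifier assigns each upload a violence score in [0, 1]; only
# uncertain items ever reach a human reviewer. Thresholds are made up
# for illustration.

REMOVE_THRESHOLD = 0.95  # confident enough to auto-remove
KEEP_THRESHOLD = 0.05    # confident enough to auto-approve

def triage(score: float) -> str:
    """Route one item based on the classifier's score."""
    if score >= REMOVE_THRESHOLD:
        return "auto-remove"    # no human ever sees it
    if score <= KEEP_THRESHOLD:
        return "auto-keep"
    return "human-review"       # only the ambiguous slice

# Most traffic falls in the confident bands and never reaches a person.
decisions = [triage(s) for s in (0.99, 0.01, 0.50)]
```

The point isn't the thresholds themselves; it's that human exposure drops from "everything flagged" to "only the cases the model can't decide."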
Huge industries are emerging in this field right now, for everything from this type of social media moderation to fighting CSAM more effectively, so humans don't have to be the front line for that kind of material. This is one area where I can really, really get behind AI and see a valid use case that isn't just marketing hype like so many others. I know there's some great stuff happening, just based on my own field of employment and being close to a few things in the works this year.