I'm not surprised at your response, but the Guardian should have had someone a bit more knowledgeable talk to their writer before they went off on this particular rant.
Taking hashes of files to find/block/delete them is perfectly legitimate and has been in use in industry for ages.
The writer has simply layered their own ignorance on top and turned it into a huge drama.
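For anyone curious, the standard industry version of this is nothing exotic: compute a cryptographic digest of each file and compare it against a blocklist of known-bad digests. A rough Python sketch (the blocklist entry is just a placeholder, the SHA-256 of an empty file):

```python
import hashlib

# Placeholder blocklist; a real service would load digests of known-bad
# files. This example entry is just the SHA-256 of an empty file.
BLOCKLIST = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_sha256(path: str, chunk_size: int = 1 << 20) -> str:
    """Digest a file in chunks so large uploads never sit fully in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def is_blocked(path: str) -> bool:
    # Exact-match lookup: any single-byte change to the file produces a
    # completely different digest, which is this scheme's main weakness.
    return file_sha256(path) in BLOCKLIST
```

The weakness noted in the comment is the point: exact-match digests are trivially evaded by re-saving or cropping, which is why image-level matching comes up below.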
I don't think it's the file-hashing technology that people are concerned about.
Surely if someone gets a camera or phone out, it's game over?

Only if all men are monsters.

Only if all men are monsters. Is that what you are saying?
There are a few topics to talk about here. The first one is not to put yourself in this position to begin with, especially women.

Victims of image-based sexual abuse do not 'put themselves in this position' - it is the men misusing their images that put them in that position. HTH.
They do, but it's not accurate. Image recognition: it would hash features within the image, not the whole file. Cropped images would still trigger an alert.
But I don't think it's a workable solution.
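To illustrate the idea in the comment above: a perceptual hash encodes what the image looks like rather than its raw bytes, so near-duplicates land near each other in hash space. Here's a minimal difference-hash (dHash) sketch in Python; the matching threshold is a tunable assumption, not a published figure:

```python
from PIL import Image  # pip install Pillow

def dhash(path: str, hash_size: int = 8) -> int:
    """Difference hash: encodes brightness gradients, not raw bytes."""
    # Grayscale + shrink: re-encoding, resizing and small edits wash out.
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    px = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = px[row * (hash_size + 1) + col]
            right = px[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

def looks_like(a_path: str, b_path: str, threshold: int = 10) -> bool:
    # Near-duplicate test: flag when the 64-bit hashes differ in few bits.
    return hamming(dhash(a_path), dhash(b_path)) <= threshold
```

To be fair to both sides of the argument: a simple dHash like this survives rescaling and recompression well, but not aggressive crops; surviving crops takes the feature-level matching the comment describes (PhotoDNA-style systems aim in that direction), and even those have limits.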
I'd like to see FB employing AI to identify nudity.
If nudity is detected, the submitter should be challenged to confirm their contact details, and maybe even pay a fee and/or deposit, returnable only if the image passes moderation (this allows some nudity for educational/artistic reasons).
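A minimal sketch of that proposed flow, where every name and threshold is an assumption (nudity_score is a stand-in for whatever classifier the platform would actually run):

```python
from dataclasses import dataclass

@dataclass
class Submission:
    image: bytes
    contact_confirmed: bool = False
    deposit_paid: bool = False

def nudity_score(image: bytes) -> float:
    """Hypothetical classifier; a real model would return P(nudity)."""
    return 0.0  # placeholder

def handle(sub: Submission, threshold: float = 0.8) -> str:
    # threshold is assumed; in practice it is tuned against false positives
    if nudity_score(sub.image) < threshold:
        return "published"
    # Flagged: challenge the submitter before anything goes live.
    if not sub.contact_confirmed:
        return "held: confirm contact details"
    if not sub.deposit_paid:
        return "held: pay refundable deposit"
    # Human review decides; the deposit is returned only on a pass,
    # which leaves room for educational/artistic nudity.
    return "queued for moderation"
```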