Quote:
Originally Posted by SpicyM
This is just one of the reasons why I consider it stupid to upload personal/family photos to social networks or share any sensitive data through email or communication apps.
Thanks for sharing. I think what they use is something that detects that one specific type of case, dealing with sexual abuse of children. Which is fine, but obviously prone to errors. That's a very specific situation. I don't think they are actually reading (scanning) emails and forwarding the contents to law enforcement, as you made it sound. But they do follow the CSAM law, which is a different story and acceptable:
"“We follow US law in defining what constitutes CSAM and use a combination of hash matching technology and artificial intelligence to identify it and remove it from our platforms,”
*I don't know what the CSAM law is, but I'm guessing the obvious.
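For anyone curious about the "hash matching" mentioned in that quote, here's a minimal sketch of the general idea: hash the file and check it against a list of hashes of known material. This is just an illustration, not Google's actual system; the hash values and function names below are made up, and real systems use perceptual hashes (PhotoDNA-style) rather than a plain cryptographic hash, so they can still match resized or re-encoded copies.

```python
import hashlib

# Hypothetical set of hashes of known flagged images (placeholder values).
# Real matching uses perceptual hashes that survive resizing/re-encoding;
# a cryptographic hash like SHA-256 only catches byte-for-byte copies.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of_file(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_match(path: str) -> bool:
    """True if the file's hash appears in the known-hash set."""
    return sha256_of_file(path) in KNOWN_BAD_HASHES
```

Point being: a check like this never "reads" the content of your photos or emails, it only compares fingerprints against a pre-existing list. The AI part they mention is the error-prone bit, since it tries to flag material that isn't already on such a list.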