Birdwatch Note Rating
2025-02-24 14:53:55 UTC - NOT_HELPFUL
Rated by Participant: 420F9C491151DD3A894C2018B9C6516DB65FFA0C5791F075692F2C7708A7C508
Original Note:
Claims suggesting that all companies, including Apple, are legally required to actively scan user data for CSAM are misleading. Apple's approach focuses on maintaining user privacy while complying with reporting obligations when CSAM is discovered through other means. https://www.apple.com/child-safety/?utm_source=chatgpt.com