Apple confirms CSAM detection only applies to photos, defends its method against other solutions

Apple continues to offer clarity around the CSAM (child sexual abuse material) detection feature it announced last week. In addition to publishing a detailed frequently asked questions document earlier today, Apple has now confirmed that CSAM detection applies only to photos stored in iCloud Photos, not to videos.

The company also continues to defend its implementation of CSAM detection as more privacy-friendly and privacy-preserving than the approaches used by other companies.


The post Apple confirms CSAM detection only applies to photos, defends its method against other solutions appeared first on 9to5Mac.
