Apple says its iCloud scanning will rely on multiple child safety groups to address privacy fears

Illustration by Alex Castro / The Verge

Apple has filled in more details around its upcoming plans to scan iCloud Photos for child sexual abuse material (CSAM) via users’ iPhones and iPads. The company released a new paper delving into the safeguards it hopes will increase user trust in the initiative. That includes a rule to only flag images found in multiple child safety databases with different government affiliations — theoretically stopping one country from adding non-CSAM content to the system.
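To make the "multiple databases with different government affiliations" rule concrete, here is a minimal, hypothetical sketch in Python. It is not Apple's actual implementation (which uses NeuralHash and private set intersection rather than plain hash sets); the database names, hash values, and the `min_jurisdictions` parameter are all illustrative assumptions.

```python
from typing import Dict, Set

# Hypothetical hash lists, keyed by the government affiliation of the
# child safety group that supplied them. The hash strings are placeholders.
hash_lists: Dict[str, Set[str]] = {
    "US": {"hashA", "hashB", "hashC"},
    "EU": {"hashB", "hashC", "hashD"},
}

def eligible_hashes(lists: Dict[str, Set[str]], min_jurisdictions: int = 2) -> Set[str]:
    """Return only hashes vouched for by databases from at least
    `min_jurisdictions` distinct government affiliations, so no single
    country can unilaterally insert non-CSAM content into the match set."""
    counts: Dict[str, int] = {}
    for jurisdiction_hashes in lists.values():
        for h in jurisdiction_hashes:
            counts[h] = counts.get(h, 0) + 1
    return {h for h, n in counts.items() if n >= min_jurisdictions}

# Only hashB and hashC appear in lists from two different jurisdictions,
# so only they would be matchable in this toy model.
print(eligible_hashes(hash_lists))  # {'hashB', 'hashC'}
```

In this toy model, a hash contributed by only one jurisdiction never enters the matchable set, which is the property Apple says should prevent a single government from expanding the system beyond CSAM.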

Apple’s upcoming iOS and iPadOS releases will automatically match US-based iCloud Photos accounts against known CSAM from a list of image hashes compiled by child safety groups. While many companies scan cloud storage services remotely, Apple’s device-based strategy has drawn…

