Researchers produce collision in Apple’s child-abuse hashing system

Illustration by Alex Castro / The Verge

Researchers have produced a collision in iOS’s built-in hash function, raising new concerns about the integrity of Apple’s CSAM-scanning system. The flaw affects the hashing system, called NeuralHash, which allows Apple to check for exact matches of known child-abuse imagery without possessing any of the images or gleaning any information about non-matching pictures.
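NeuralHash itself is not public, but the matching flow it supports is easy to picture. Below is a minimal sketch that uses a crude "average hash" as a stand-in for the neural network; the function names and the known-hash set are hypothetical, and only the hash-then-compare flow is the point.

```python
# Toy sketch of hash-based matching. A crude "average hash" stands in
# for Apple's NeuralHash (which is a neural network, not this); the
# flow -- hash the photo, compare against a set of known hashes --
# is the part being illustrated.
from PIL import Image

def average_hash(path, size=8):
    """Downscale to size x size grayscale, then emit one bit per pixel:
    1 if the pixel is brighter than the image mean, else 0."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return int("".join("1" if p > mean else "0" for p in pixels), 2)

# Hypothetical database of hashes of known images (values made up).
known_hashes = {0x5E3A1C7F00FF13B7}

def matches_known(path):
    # The scanner only learns whether the hash is in the set; it never
    # needs the original images, which is the property described above.
    return average_hash(path) in known_hashes
```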

On Tuesday, a GitHub user called Asuhariet Ygvar posted code for a reconstructed Python version of NeuralHash, which they claimed to have reverse-engineered from previous versions of iOS. The GitHub post also includes instructions on how to extract the NeuralHash model files from a current macOS or iOS build.
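With a working model in hand, researchers could hunt for collisions: distinct images that produce the same hash. The toy hash above makes the underlying point cheap to demonstrate; real attacks on NeuralHash instead perturb an image until the network emits a chosen target hash. Everything below is illustrative, not the actual attack.

```python
# Continuing the toy example: two images with different pixel data but
# identical average-hash output, i.e., a collision. This only shows why
# hash equality never implies image equality; it is not the NeuralHash
# attack itself.
from PIL import Image

def half_split(bright, dark, size=8):
    """An image whose top half is `bright` and bottom half is `dark`."""
    img = Image.new("L", (size, size))
    img.putdata([bright] * (size * size // 2) + [dark] * (size * size // 2))
    return img

def average_hash_img(img):
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return int("".join("1" if p > mean else "0" for p in pixels), 2)

a = half_split(200, 50)  # mid-gray over dark gray
b = half_split(255, 0)   # pure white over pure black

assert list(a.getdata()) != list(b.getdata())      # different images...
assert average_hash_img(a) == average_hash_img(b)  # ...but the same hash
```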

“Early tests show that it can tolerate image resizing and compression,…

