Send your Apple fanboy friends to prison with this one simple trick

I break Apple's NeuralHash algorithm, used to detect CSAM in iCloud photos. I show how to craft arbitrary hash collisions from any source/target image pair using an adversarial example attack. This can serve many purposes, such as evading detection or forging false positives that trigger manual reviews.
0:00 - Intro
1:30 - Forced Hash Collisions via Adversarial Attacks
2:30 - My Successful Attack
5:40 - Results
7:15 - Discussion
DISCLAIMER: This video is for demonstration and educational purposes only. It is not an endorsement of illegal activity or of circumventing the law.