NeuralHash is BROKEN | How to evade Apple's detection and forge hash collisions (Video)

Send your Apple fanboy friends to prison with this one simple trick :wink: We break Apple's NeuralHash algorithm, used to detect CSAM in iCloud photos. I show how to craft arbitrary hash collisions for any source/target image pair using an adversarial example attack. This can be used to evade detection or to forge false positives that trigger manual reviews.
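To give a rough idea of how such an adversarial attack works (this is a toy sketch, not the code from the repo below): treat the hash network as a differentiable function, then gradient-descend on the source image so its binarized output matches the target's hash, while a penalty term keeps the perturbation small. The "network" here is a hypothetical stand-in, a fixed random linear map with sign binarization, in place of the real extracted CNN; names, sizes, and the hinge loss are my own illustrative choices.

```python
import numpy as np

# Hypothetical stand-in for the extracted NeuralHash model: a fixed
# linear map whose outputs are thresholded into hash bits. The real
# model is a CNN whose output is projected and binarized.
rng = np.random.default_rng(0)
DIM, BITS = 64, 16          # toy sizes, not NeuralHash's real dimensions
W = rng.standard_normal((BITS, DIM))

def hash_bits(x):
    """Binarize the network output into a bit vector."""
    return (W @ x > 0).astype(int)

def collide(source, target, steps=500, lr=0.05, lam=0.05):
    """Perturb `source` until its hash equals `target`'s hash.

    Minimizes a hinge loss pushing each logit past margin 1 in the
    target bit's sign direction, plus an L2 penalty (weight `lam`)
    that keeps the result close to the original source.
    """
    t = np.where(hash_bits(target) > 0, 1.0, -1.0)   # desired logit signs
    x = source.copy()
    for _ in range(steps):
        logits = W @ x
        active = (t * logits < 1.0).astype(float)    # bits still violating margin
        grad = -(W.T @ (t * active)) + lam * (x - source)
        x -= lr * grad
    return x

src = rng.standard_normal(DIM)
tgt = rng.standard_normal(DIM)
adv = collide(src, tgt)
# adv now hashes identically to tgt while staying near src
```

Against the real model the same recipe applies, except the gradient comes from backpropagation through the extracted network (e.g. via PyTorch), and the binarization step is replaced by a smooth surrogate such as tanh during optimization.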

OUTLINE:
0:00 - Intro
1:30 - Forced Hash Collisions via Adversarial Attacks
2:30 - My Successful Attack
5:40 - Results
7:15 - Discussion

DISCLAIMER: This is for demonstration and educational purposes only. It is not an endorsement of illegal activity or of circumventing the law.

Code: https://github.com/yk/neural_hash_col
Extract Model: https://github.com/AsuharietYgvar/App
My Video on NeuralHash: https://youtu.be/z15JLtAuwVI