Technology

Apple's child abuse detection software may be vulnerable to attack

By Matthew Sparkes

20 August 2021

Apple has plans to detect images of child sexual abuse on some of its devices

Yuichiro Chino/Getty Images

Apple’s soon-to-be-launched algorithm to detect images of child sexual abuse on iPhones and iPads may incorrectly flag people as being in possession of illegal images, warn researchers.

NeuralHash will be launched in the US with an update to iOS and iPadOS later this year. The tool will compare a hash – a short string of characters that an algorithm derives from an image, acting as a digital fingerprint – of every image uploaded to the cloud with a database of hashes for known images of child sexual abuse. Matches should…
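As a rough illustration of the matching step described above – not Apple's actual NeuralHash implementation – the sketch below compares a perceptual-style image hash against a set of known hashes, treating any sufficiently close match as a flag. The hash values, database and distance threshold are hypothetical placeholders.

```python
# Hypothetical sketch of hash-database matching; not Apple's NeuralHash.
# A perceptual hash maps an image to a short bit string so that visually
# similar images produce identical or nearly identical hashes.

KNOWN_HASHES = {            # stand-in for the database of hashes of known abuse images
    0b1011_0010_1100_0111,
    0b0100_1110_0001_1010,
}

HAMMING_THRESHOLD = 2       # hypothetical: how many differing bits still count as a match


def hamming_distance(a: int, b: int) -> int:
    """Number of bit positions in which two hashes differ."""
    return bin(a ^ b).count("1")


def is_flagged(image_hash: int) -> bool:
    """Return True if the image's hash is close enough to any known hash."""
    return any(
        hamming_distance(image_hash, known) <= HAMMING_THRESHOLD
        for known in KNOWN_HASHES
    )


# An uploaded image whose hash differs from a known hash by one bit is still
# flagged. This tolerance for near-matches is also what researchers warn could
# be exploited to engineer collisions and trigger false flags.
print(is_flagged(0b1011_0010_1100_0110))  # True
```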
