While some industry watchers applauded Apple's efforts to take on child exploitation, they also worried that the tech giant might be creating a system that could be abused by totalitarian regimes. The Financial Times earlier reported Apple's plans.

Apple said the system is automated and 'designed with user privacy in mind,' with scans performed on the device before images are backed up to iCloud. If the program is convinced it has identified abusive imagery, it can share those photos with representatives from Apple, who'll act from there.

Apple will also warn parents and children when they might be sending or receiving a sexually explicit photo through its Messages app, either by hiding the photo behind a warning that it may be 'sensitive' or by adding an informational pop-up.

But the most dramatic effort, Apple said, is to identify child sexual abuse material on the devices themselves, with a new technology that will detect these images in Apple's Photos app with the help of databases provided by the National Center for Missing and Exploited Children.
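Apple hasn't published the internals of that matching step, but the general shape of on-device hash matching is simple: each photo is reduced to a compact fingerprint and checked against a database of fingerprints derived from known abuse imagery, so the comparison happens without the photos leaving the device. The Swift sketch below shows only that shape; the function names, the use of SHA-256 (Apple's system reportedly uses a perceptual hash that tolerates resizing and recompression) and the placeholder database entry are illustrative assumptions, not Apple's implementation.

```swift
import Foundation
import CryptoKit

// Illustrative stand-in for a perceptual hash. A real matching system
// uses a hash that survives resizing and recompression; a cryptographic
// digest like SHA-256 changes completely if a single byte changes.
func imageFingerprint(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Placeholder database of known-image fingerprints. In Apple's described
// design, the real list derives from hashes supplied by the National
// Center for Missing and Exploited Children. The value below is just the
// SHA-256 of empty data, so the demo's stand-in photo takes the match branch.
let knownFingerprints: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

// Hypothetical check run on-device before a photo is queued for iCloud backup.
func shouldFlagForReview(_ imageData: Data) -> Bool {
    knownFingerprints.contains(imageFingerprint(imageData))
}

let photo = Data() // stand-in for image bytes read from the photo library
if shouldFlagForReview(photo) {
    print("Match found: photo flagged for human review")
} else {
    print("No match: photo proceeds to backup unflagged")
}
```

Tying the check to the iCloud backup path, as Apple described it, means only photos headed off the device are ever compared, which is the basis of the 'designed with user privacy in mind' claim.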
Apple said it'll update Siri and search features to provide parents and children with information to help them seek support in 'unsafe situations.' The program will also 'intervene' when users try to search for child abuse-related topics.