Apple Posts Child Safety FAQ To Quell Concerns Over iCloud Photo Scanning And Privacy
Late last week, we covered Apple’s new plan to screen text messages for explicit images and to scan files uploaded to iCloud for child sexual abuse material. The announcement quickly drew far more attention and concern than Apple likely wanted. Now, the Cupertino-based company has published a frequently asked questions document in hopes of quelling those concerns.
On August 5th, Apple announced that it would begin scanning images sent to children for sexually explicit material. Such photos would be blurred, and a warning about the image’s content would be shown to the child. If the child decided to view the content anyway, their parent would be alerted, provided the feature was enabled. The process works much the same in reverse if the child tries to send an explicit image. Moreover, photos being uploaded to iCloud would be scanned by an on-device process that compares each image’s unique fingerprint against a list of fingerprints of known child sexual abuse material (CSAM). If there were a match, the image would undergo manual review and, if warranted, be reported.
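The fingerprint-matching step described above can be sketched in a few lines. This is a loose illustration only: Apple’s actual system uses a perceptual hash (NeuralHash) with a match threshold and cryptographic safety vouchers, none of which are modeled here. The `fingerprint` function, the sample hash list, and `check_upload` are all hypothetical placeholders.

```python
# Illustrative sketch of on-device fingerprint matching against a known hash list.
# NOT Apple's implementation: a real system uses a perceptual hash robust to
# resizing and re-encoding, not a cryptographic hash like SHA-256.
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Placeholder fingerprint; stands in for a perceptual hash such as NeuralHash.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of known CSAM (here, one dummy entry).
KNOWN_FINGERPRINTS = {fingerprint(b"known-flagged-sample")}

def check_upload(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint matches the known list,
    i.e. the upload would be flagged for manual review."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS
```

Note that because a cryptographic hash changes completely with any edit to the file, this sketch only matches byte-identical images; the perceptual-hash approach exists precisely so that trivially altered copies still match.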

Even if this sounds reassuring, the system still warrants an independent review; taking Apple’s word at face value is not a great idea. That said, it would hardly be in Apple’s business interest to break the privacy protections its customers expect. Either way, let us know what you make of Apple’s FAQ responses in the comments below.