Items tagged with CSAM

Apple announced this morning that it would delay a planned rollout of new protections aimed at minimizing the spread of Child Sexual Abuse Material (CSAM). The move comes after the company faced high-profile backlash over the initiatives, with Apple executives even admitting that the initial messaging was bungled. "Last month, we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material," said Apple in a statement. "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input... Read more...
Apple is deep into damage-control mode as it works to clarify the messaging around its plan to protect children from online predators. Earlier this month, the company announced several features to stop the spread of Child Sexual Abuse Material (CSAM). While well-intentioned, the move sparked considerable backlash from the online community over customer privacy. One such feature covers children's communications through iMessage: Apple said it uses on-device machine learning to determine whether images sent or received on a child's device are sexually explicit. When the feature is enabled, parents would receive a notification if such an image trips Apple's filter. For children under the age of 17, CSAM... Read more...
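To make that flow concrete, here is a minimal Swift sketch of the decision logic the teaser describes: the image is classified entirely on the device, and a parent notification fires only when the feature is enabled and the filter triggers. The type names, the classifier stub, and the confidence threshold are all hypothetical placeholders for this illustration, not Apple's actual API or model.

```swift
import Foundation

// Hypothetical verdict returned by an on-device image classifier.
// Placeholder for whatever model the device would ship with.
struct ImageScanResult {
    let isSexuallyExplicit: Bool
    let confidence: Double
}

// Hypothetical classifier stub. On a real device this would run a
// local ML model, so the image itself never leaves the phone.
func classifyOnDevice(_ imageData: Data) -> ImageScanResult {
    // ...model inference would happen here...
    return ImageScanResult(isSexuallyExplicit: false, confidence: 0.0)
}

// Sketch of the notification flow described above: scan locally,
// and alert the parent only when the feature is on and the filter fires.
func handleIncomingImage(_ imageData: Data,
                         featureEnabled: Bool,
                         notifyParent: () -> Void) {
    guard featureEnabled else { return }

    let result = classifyOnDevice(imageData)

    // The 0.9 threshold is an assumption for this sketch; Apple has not
    // published the filter's actual decision criteria.
    if result.isSexuallyExplicit && result.confidence > 0.9 {
        notifyParent()
    }
}
```

The privacy-relevant design point, as Apple framed it, is that the classification step runs on the child's device rather than on a server; only the notification event, not the photo, would reach the parent.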