Friday, 13 August 2021

Apple Says at Least 30 iCloud Photos Matching Child Abuse Material Will Flag Accounts

Apple has further detailed that its child safety mechanism will flag an account for human review only after at least 30 photos match known Child Sexual Abuse Material (CSAM), and only when the matching hashes have been supplied by child-safety organisations operating in at least two separate countries.
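For illustration only, the following Swift sketch shows the kind of threshold rule described above. The names (MatchedPhoto, shouldFlagForHumanReview) and the structure are assumptions made for this sketch; Apple's announced system is built on on-device NeuralHash matching and threshold secret sharing, which this simplified example does not model.

import Foundation

// Hypothetical sketch of the threshold rule described in the article.
// These types and names are illustrative assumptions, not Apple's API.

struct MatchedPhoto {
    let photoID: String
    // Countries whose CSAM hash lists contained this photo's hash.
    let matchingCountries: Set<String>
}

/// Returns true only when at least `photoThreshold` photos each matched
/// hash lists from at least `countryThreshold` separate countries.
func shouldFlagForHumanReview(
    matches: [MatchedPhoto],
    photoThreshold: Int = 30,
    countryThreshold: Int = 2
) -> Bool {
    let qualifyingMatches = matches.filter {
        $0.matchingCountries.count >= countryThreshold
    }
    return qualifyingMatches.count >= photoThreshold
}

// Example: 29 qualifying matches stay below the 30-photo threshold.
let sample = (0..<29).map {
    MatchedPhoto(photoID: "photo-\($0)", matchingCountries: ["US", "UK"])
}
print(shouldFlagForHumanReview(matches: sample)) // prints "false"

The two-country requirement in the sketch mirrors the reported safeguard that a single organisation's hash list cannot, on its own, cause an account to be flagged.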

