Apple is being sued over its decision to abandon a plan to scan iCloud photos for child sexual abuse material (CSAM), a system the company dropped after citing security and privacy concerns.