Apple's CSAM plan scans for child sexual abuse material

Apple is known as one of the companies that care about users' privacy and digital security. From the iPhone hacking problem caused by NSO Group software to the company's new system for examining users' photos, Apple has been through a lot.


The company recently revealed its CSAM plan, under which the photos on all users' phones will be scanned for child sexual content. CSAM is an acronym for Child Sexual Abuse Material.
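To make the idea concrete: Apple's published approach matches perceptual hashes of on-device photos against a database of hashes of known CSAM, and only flags an account after a threshold number of matches. The sketch below is a deliberately simplified toy, assuming an ordinary cryptographic hash and a hypothetical hash database; Apple's real system uses a perceptual hash ("NeuralHash") and cryptographic private set intersection, neither of which is shown here.

```python
import hashlib

# Hypothetical database of hashes of known images (illustrative only).
KNOWN_HASHES: set[str] = set()

# Assumption for illustration: an account is flagged only after several
# matches, mirroring the threshold idea in Apple's published design.
MATCH_THRESHOLD = 3

def image_hash(image_bytes: bytes) -> str:
    """Toy stand-in for a perceptual hash: an exact SHA-256 digest."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(images: list[bytes], known_hashes: set[str]) -> int:
    """Count how many of the given images hash into the known set."""
    return sum(1 for img in images if image_hash(img) in known_hashes)

def should_flag(images: list[bytes], known_hashes: set[str],
                threshold: int = MATCH_THRESHOLD) -> bool:
    """Flag only when the number of matches reaches the threshold."""
    return count_matches(images, known_hashes) >= threshold
```

Note that a real perceptual hash also matches visually similar (e.g. resized or recompressed) images, which an exact digest like SHA-256 cannot do; that difference is at the heart of the false-positive debate around the plan.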


The update was announced quietly on the company's official website, but that did not stop the news from spreading, and opinions about it quickly diverged.


Apple's CSAM plan and its relationship to digital rights


Apple's move may well serve children's safety, but doing so by checking every user's photos on their phones and on iCloud will not please all users.


Honest users may find their photos checked by Apple periodically regardless of their wishes, and Apple may have created a big problem for itself unnecessarily with this decision: users may lose confidence in the company, which would be a huge loss for it.


These events come at a time when iPhones can be hacked through software from the Israeli company NSO, which made the timing of the CSAM announcement especially sensitive: Apple is now fully examining users' photos while remaining unable to prevent NSO's hacks.


For now, the matter can be considered settled, because reversing the new system after its announcement is difficult for more than one reason, and governments have welcomed the system and expressed their support for it.


In light of all these events, calls have begun for every user's digital rights to be legislated clearly and effectively. Among these rights is freedom from tracking and surveillance, and in one way or another, the new CSAM plan will put all iPhone users under surveillance.


Tracking and surveillance have already caused many problems, including Facebook's Cambridge Analytica scandal, NSO's ability to breach users' security and privacy, and now the CSAM scheme, which hides poison in honey.


For his part, CEO Tim Cook has remained completely silent, neither commenting on the company's new plan nor responding to the criticism Apple has faced since the official announcement.
