Apple employees are worried about new photo scanning technology


Thousands of iPhone users and dozens of organizations have spoken out against news that Apple will scan user photos for child sexual abuse material (CSAM). Now it has emerged that the topic worries Apple employees as well.


Apple employees have raised concerns about the new CSAM detection feature in an internal channel of the corporate Slack messenger. Some of these employees, who asked not to be identified, said they fear the feature could be used by authoritarian governments to surveil citizens, censor content, and make arrests.


Another employee added that the topic has generated extensive discussion, with the internal channel accumulating more than 800 posts. A group of employees created a dedicated thread to discuss the new feature, though some believe Slack is not the right venue for such conversations. The source adds that many employees support the feature, believing it will "help deal with illegal content."


Apple began using Slack more widely after moving employees to remote work. Despite Apple's strict confidentiality guidelines, details of discussions on the company's internal Slack have made their way into the public. Apple has asked employees not to use Slack to discuss labor issues and other sensitive topics, but this has not stopped them from voicing dissatisfaction with the company.


In addition to Slack, some employees have also shared their views on company policy on Twitter.
