Apple suspected of underreporting CSAM on its platforms

by Jerry

A British child-protection charity says that, compared with its competitors, Apple lags far behind in addressing the issue.

Apple faces serious accusations
Apple, one of the world's largest technology companies, is allegedly underreporting the prevalence of child sexual abuse material (CSAM) on its platforms. The accusation comes from the NSPCC (National Society for the Prevention of Cruelty to Children), a British child-protection charity.

Damning figures
Any American technology company that detects suspected CSAM on its platforms is required to report it to NCMEC (http://www.ncmec.org/), an organization that forwards cases to the appropriate law enforcement agencies around the globe. In 2023, Apple reported merely 267 instances of CSAM worldwide, compared with roughly 1.47 million potential cases reported by Google and 30.6 million reports from Meta's platforms over the same period. TikTok, Snapchat, and Sony Interactive Entertainment (PlayStation) also flagged far more suspicious situations than Apple.

A questionable defence
In its defence, Apple said that services such as iMessage, FaceTime and iCloud use end-to-end encryption, so the company cannot view the content of the messages users share. However, this point is questionable: “WhatsApp also has this type of encryption, but nonetheless reported nearly 1.4 million suspected cases to NCMEC in 2023,” experts counter.

Strengthening of protection measures?
In 2021, before the controversy arose, Apple announced a CSAM detection system that would scan images on users' devices before they were uploaded to iCloud and match them against a database of known CSAM images. However, following protests from privacy and digital rights organizations, which saw the plan as an invasion of privacy, the California-based firm first stalled the project and then scrapped it altogether in 2022. “Kids can remain safe while using these services without companies mining their data,” the company noted in a statement to Wired in August 2022.

Apple has not responded to the NSPCC's allegations, and has only restated its goal of protecting users' “security and privacy”.
