Apple Scraps Plans To Check iCloud Photos For Evidence of Child Abuse

Apple has formally dropped one of its most contentious ideas ever: checking iCloud photos for evidence of child sexual abuse.

Apple first revealed the plan in the summer of 2021: on-device scanning, a new iOS feature that would silently comb through each user’s photos for signs of child sexual abuse material (CSAM).

The capability was designed to alert human reviewers, who would then likely notify the authorities, should the scanner find any indications of CSAM.

Privacy and security experts reacted angrily to the proposal right away, arguing that the scanning tool could eventually be repurposed to search for other types of content.

Critics claimed that merely building such scanning capability into iOS was a slippery slope toward broader surveillance abuses, and many believed the technology could quickly become a backdoor for law enforcement.

Apple initially pushed back against these complaints, but the company ultimately relented and announced that it would “postpone” the feature’s rollout.

It now appears that time will never arrive. Alongside a slew of new iCloud security measures announced on Wednesday, the company said it will not be proceeding with its plans for on-device scanning.

Apple made it plain in a statement shared with Wired that it had changed course.

The statement reads as follows:

After extensive consultation with experts to gather feedback on the child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021.

We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data.

And we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.

