If your strongest flex for having an iPhone is its privacy policies, you’ll want to reconsider after this fall’s update. The iOS 15 and macOS Monterey updates are set to arrive later this year, but they won’t exactly be music to the ears of Apple’s privacy-loving customers. According to a detailed report made public last Thursday, the update’s focal point is the sensitive subject of child sexual abuse content: the latest Apple privacy update will scan devices for explicit material without the user’s consent. As benevolent as it may seem on the surface, it could open the door to endless privacy concerns.
The update will allow iOS to scan devices for sexually explicit child abuse content. In simpler terms, it will flag child sexual abuse material by matching image hashes against the National Center for Missing and Exploited Children (NCMEC) database. Scanning pictures shared over the cloud is nothing new in the tech world; Facebook and other social networks have been doing it for years. What sparks immediate attention here is that Apple will also scan photos stored locally on the device. And the privacy concern is not misplaced, since such machine-learning systems are a slippery slope waiting to be abused. Consider the Israeli Pegasus software, which was intended to detect terrorist activity but ended up monitoring over 50,000 cellphones belonging to activists, journalists, and even government officials. Similarly, all it would take to misuse the Apple privacy update is expanding the matching system to scan devices for politically sensitive content.
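To make the hash-matching idea concrete, here is a minimal sketch. Everything in it is illustrative: Apple’s actual system uses a perceptual hash called NeuralHash rather than a cryptographic hash, and the database name and helper functions below are hypothetical, not Apple’s API.

```python
import hashlib

# Hypothetical stand-in for the NCMEC-provided hash list. Apple's actual
# system uses perceptual "NeuralHash" values so that resized or re-encoded
# copies of a known image still match; plain SHA-256, used here only for
# illustration, matches byte-identical files alone.
KNOWN_CSAM_HASHES: set = set()  # populated from the provider's database

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash function."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_library(images: dict) -> list:
    """Return the paths of images whose hash appears in the known database."""
    return [path for path, data in images.items()
            if image_hash(data) in KNOWN_CSAM_HASHES]

# Example: flagged = scan_library({"/Photos/IMG_0001.jpg": b"...image bytes..."})
```

Apple’s announced design layers more machinery on top of this basic idea: according to its technical summary, matching happens against a blinded version of the database, and an account is only flagged for human review after a threshold number of matches.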
The other part of the privacy update concerns minors sharing sensitive content. Before a minor sends any sexually explicit image, the device will display a warning. The user can still choose to proceed, but if they do, their parents will be notified automatically. Jillian York, an American activist and author, protested to the Guardian, saying, “This strikes me as assumptive of two things. One, that adults can be trusted with these images and two, that every other culture has the same ideas about what constitutes nudity and sexuality as the US does.”
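A minimal sketch of the warn-then-notify flow described above, under stated assumptions: the function names are invented for illustration, and the under-13 cutoff is an assumption of this sketch rather than a confirmed detail of Apple’s implementation, which ties parental notification to Family Sharing child accounts.

```python
from enum import Enum, auto

class Choice(Enum):
    SEND = auto()
    CANCEL = auto()

def handle_explicit_image(account_age: int, choice: Choice) -> str:
    """Warn first, honor the user's choice, and notify parents for
    child accounts. The under-13 cutoff is an assumption for this sketch."""
    print("Warning: this image may contain sensitive content.")
    if choice is Choice.CANCEL:
        return "image withheld"
    if account_age < 13:
        notify_parents()  # hypothetical helper standing in for the parental alert
    return "image sent"

def notify_parents() -> None:
    # Placeholder: in a real system this would alert the linked
    # parent account in Family Sharing.
    print("Parent notification sent.")
```

York’s objection targets exactly the two branches in this flow: the assumption that the notified adult is trustworthy, and the assumption that a single classifier’s notion of “sexually explicit” holds across cultures.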