
Apple’s Privacy Mythology Doesn’t Match Reality


In 2021, Apple cast itself as the world’s privacy superhero. Its leadership insists that “privacy has been central to our work … from the very beginning” and that it is a “fundamental human right.” Its new advertising even boasts that privacy and the iPhone are the same thing. Last spring, the release of a software update (iOS 14.5) proved something important: when people don’t have to fight for control of their information, they choose privacy. The update let users opt out of having apps track their activity across the internet, and now only about 25 percent of users grant that permission, where previously nearly 75 percent of users’ information was available for ad targeting. As Apple prepares to add more privacy protections in iOS 15, due for release next month, it continues to brand itself as a force potentially capable of slowing the growth of Facebook, the paragon of surveillance capitalism. Unfortunately, Apple’s privacy promises don’t show the whole picture.

One of the company’s most troubling privacy practices may also be one of its most profitable: iCloud. For years, the cloud-based storage service has helped lock hundreds of millions of Apple customers into its ecosystem, serving as an effortless, unseen extension of your hard drive that backs up photos, movies, and other files. Unfortunately, iCloud makes it almost as easy for the police to access those same files.

In the past, Apple has insisted that it will not build a back door, saying it would weaken the security of its devices. But on older devices, the door is already built. According to Apple’s law enforcement manual, anyone running iOS 7 or earlier is out of luck if they fall into the crosshairs of police or ICE: with a simple warrant, Apple will unlock the phone. This may seem par for the Silicon Valley course, but most tech giants’ CEOs have not claimed that complying with such warrants would “endanger the data security of hundreds of millions of law-abiding people … a dangerous precedent that threatens the civil liberties of all.” The unlocking service is presumably possible because of security vulnerabilities that were fixed in later versions of the operating system.

Since 2015, Apple has drawn the ire of the FBI and the Justice Department for building devices with “too much security,” and each new security enhancement deepens the standoff. But Apple’s promise of privacy hides a dirty little secret that almost everyone misses: the back door has been there all along. Whether it is iPhone data from Apple’s latest devices or the iMessage data the company constantly touts as “end-to-end encrypted,” all of it is vulnerable when backed up to iCloud.

Apple’s simple design decision to hold the iCloud encryption keys itself has created complex consequences. The company doesn’t do this with your iPhone (despite government requests). It doesn’t do this with iMessage. Some of the advantages of making an exception for iCloud are clear: if Apple didn’t have the keys, users who forgot their passwords would be out of luck, since a truly secure cloud storage provider would be in no better position than a random attacker to reset your password. And yet, by retaining that power, Apple also retains the awesome ability to hand over a complete iCloud backup when ordered to.
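To make that trade-off concrete, here is a minimal Python sketch of the provider-held-key model. None of this is Apple’s actual code, and the `key_escrow` table is purely hypothetical; the point is only that a service that keeps a copy of each key can rescue a locked-out user, and can just as easily decrypt a backup in response to a court order.

```python
# Sketch of provider-held key escrow -- NOT Apple's implementation.
# Requires the third-party `cryptography` package: pip install cryptography
from cryptography.fernet import Fernet

# Hypothetical server-side escrow table: user ID -> encryption key.
key_escrow: dict[str, bytes] = {}

def store_backup(user: str, data: bytes) -> bytes:
    """Encrypt a user's backup, keeping a copy of the key server-side."""
    key = Fernet.generate_key()
    key_escrow[user] = key                 # the provider retains the key
    return Fernet(key).encrypt(data)

def recover_for_user(user: str, ciphertext: bytes) -> bytes:
    """Account recovery is easy: the provider already holds the key."""
    return Fernet(key_escrow[user]).decrypt(ciphertext)

def comply_with_order(user: str, ciphertext: bytes) -> bytes:
    """The same capability lets the provider decrypt for law enforcement."""
    return Fernet(key_escrow[user]).decrypt(ciphertext)

blob = store_backup("alice", b"photos, messages, location history")
print(comply_with_order("alice", blob))    # decrypts without Alice's consent
```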

iCloud data goes beyond photos and files to include location data, such as from “Find My” or AirTags, Apple’s controversial new tracking devices. With a single court order, all of your Apple devices can be turned against you and transformed into a surveillance system. Apple could fix this, of course, and other companies already offer secure file-storage platforms. The Swiss company Tresorit provides true end-to-end encryption for its cloud service: users see their uploaded files in real time, synced across multiple devices, but it is the users, not Tresorit, who hold the encryption keys. This means that users who forget their password also lose their files. Any provider that retains the power to recover or change your password, however, also retains the power to hand your information over to the police.
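By contrast, here is a sketch of the user-held-key model described above (again an illustrative Python example, not Tresorit’s code): the key is derived from the user’s password on the device, so the provider stores only ciphertext and can neither reset the password nor comply with a decryption order, and a forgotten password means the files are gone.

```python
# Sketch of client-side, user-held-key encryption -- illustrative only.
# Requires the third-party `cryptography` package: pip install cryptography
import base64
import hashlib
import os

from cryptography.fernet import Fernet

def key_from_password(password: str, salt: bytes) -> bytes:
    """Derive a 32-byte key from the password via PBKDF2-HMAC-SHA256."""
    raw = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return base64.urlsafe_b64encode(raw)   # Fernet expects base64 keys

salt = os.urandom(16)        # stored with the ciphertext; it is not secret
key = key_from_password("correct horse battery staple", salt)
ciphertext = Fernet(key).encrypt(b"my files")

# The provider sees only `salt` and `ciphertext`. Without the password it
# cannot decrypt -- and neither can a user who forgets that password.
assert Fernet(key).decrypt(ciphertext) == b"my files"
```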

The threat is growing. Under a new set of content-moderation tools, Apple will scan iCloud uploads and iMessage communications for suspected child sexual abuse material (CSAM). Where the company once searched only the photos uploaded to iCloud for suspected CSAM, the new tools can turn photos and texts you send or receive against you. Combating CSAM is a noble goal, but the consequences could be disastrous for those wrongly accused when the AI fails, and the toll could be deadly even when the software works as intended. As Harvard Law School’s Kendra Albert pointed out on Twitter, these “features are going to get queer kids kicked out of their homes, beaten, or worse.” Software launched in the name of “child safety” could be a deadly threat to LGBTQ+ children with homophobic and transphobic parents. Just as worryingly, the tools used to track CSAM today could be trained to flag political and religious content tomorrow.
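For a rough sense of how database-driven scanning works, here is a deliberately simplified Python sketch. Apple’s announced system uses a perceptual hash (NeuralHash) and private set intersection rather than the plain SHA-256 lookup below, and the database entries here are placeholders; the point is only that what gets flagged is determined entirely by whatever hashes the database holds.

```python
# Oversimplified sketch of hash-database scanning -- not Apple's NeuralHash.
import hashlib

# Hypothetical database of hashes of known prohibited images (placeholders).
KNOWN_HASHES = {"<placeholder digest 1>", "<placeholder digest 2>"}

def is_flagged(image_bytes: bytes) -> bool:
    """Flag an upload if its digest appears in the database."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

# A cryptographic hash misses even trivially altered copies, which is why
# real scanners use perceptual hashes -- and why whoever curates the
# database decides what the scanner looks for.
print(is_flagged(b"example upload"))   # False with the placeholder database
```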


