Apple’s Photo Scanning Plan Sparks Outcry From Policy Groups

More than 90 policy groups in the U.S. and around the world have signed an open letter asking Apple to abandon its plan to have Apple devices scan photos for child sexual abuse material (CSAM).

“The undersigned organizations committed to civil rights, human rights, and digital rights around the world are writing to urge Apple to abandon the plans it announced on August 5, 2021 to build surveillance capabilities into iPhones, iPads, and other Apple products,” the letter to Apple CEO Tim Cook says. “Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”

The Center for Democracy &amp; Technology (CDT), which announced the letter, quoted Sharon Bradford Franklin, director of the CDT Security and Surveillance Project, as saying, “We can expect governments will take advantage of the surveillance capability Apple is building into iPhones, iPads, and computers. They will demand that Apple scan for and block images of human rights abuses, political protests, and other content that should be protected as free expression, which forms the backbone of a free and democratic society.”

The open letter was signed by groups on six continents (Africa, Asia, Australia, Europe, North America, and South America). U.S.-based signatories include the American Civil Liberties Union, the Electronic Frontier Foundation, Fight for the Future, the LGBT Technology Partnership and Institute, New America’s Open Technology Institute, STOP (Surveillance Technology Oversight Project), and the Sex Workers Project of the Urban Justice Center. Other signatories are based in Argentina, Belgium, Brazil, Canada, Colombia, the Dominican Republic, Germany, Ghana, Guatemala, Honduras, Hong Kong, India, Japan, Kenya, Mexico, Nepal, the Netherlands, Nigeria, Pakistan, Panama, Paraguay, Peru, Senegal, Spain, Tanzania, and the United Kingdom. The full list of signatories is available here.

Scanning of photos on iCloud and in Messages

Apple announced two weeks ago that devices with iCloud Photos enabled will scan images before they are uploaded to iCloud. Because an iPhone uploads every photo to iCloud shortly after it is taken, the scanning would happen almost immediately for any user who has iCloud Photos enabled.

Apple said its technology “analyzes an image and converts it to a unique number specific to that image” and flags a photo when its hash is identical or nearly identical to the hash of any image in a database of known CSAM. An account can be reported to the National Center for Missing and Exploited Children (NCMEC) once about 30 CSAM photos are detected, a threshold Apple set so that there is “less than a one in 1 trillion chance per year of incorrectly flagging a given account.” The threshold may change in the future in order to maintain that one-in-1-trillion false positive rate.
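To make the matching logic concrete, here is a rough Python sketch of threshold-based hash matching against a database of known image hashes. It is only an illustration under stated assumptions: the function names are hypothetical, SHA-256 stands in for Apple’s perceptual hash, and the on-device cryptographic protocol Apple describes is not modeled.

```python
import hashlib

# Hypothetical sketch only: NOT Apple's actual system. SHA-256 stands in for a
# perceptual hash, and the plain set lookup stands in for Apple's on-device
# matching protocol.

KNOWN_CSAM_HASHES: set[str] = set()  # hashes supplied by child-safety organizations
MATCH_THRESHOLD = 30                 # roughly the threshold described before an account is reviewed

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; a real perceptual hash maps visually
    similar images to the same or nearby values, unlike SHA-256."""
    return hashlib.sha256(image_bytes).hexdigest()

def account_exceeds_threshold(images: list[bytes]) -> bool:
    """Return True only once the number of matched images crosses the threshold."""
    matches = sum(1 for img in images if image_hash(img) in KNOWN_CSAM_HASHES)
    return matches >= MATCH_THRESHOLD
```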

Apple is also adding a tool to the Messages app that will “analyze image attachments and determine if a photo is sexually explicit” without giving Apple access to the messages. The system will be optional for parents, and if it is turned on, it will “warn children and their parents when receiving or sending sexually explicit photos.”
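As described, the Messages feature amounts to an opt-in, on-device check before any warning or notification is shown. The sketch below is a hypothetical approximation of that flow, not Apple’s implementation; the classifier, the settings object, and the event names are all assumed.

```python
from dataclasses import dataclass

# Hypothetical sketch of an opt-in, on-device flow; not Apple's implementation.

@dataclass
class FamilySettings:
    communication_safety_enabled: bool = False  # opt-in, controlled by the parent

def classify_explicit(image_bytes: bytes) -> bool:
    """Assumed placeholder for the on-device image classifier."""
    return False  # a real classifier would run a local ML model on the image

def handle_image_attachment(image_bytes: bytes, settings: FamilySettings) -> list[str]:
    """Decide what happens on the device; the image itself is not sent anywhere here."""
    events: list[str] = []
    if settings.communication_safety_enabled and classify_explicit(image_bytes):
        events.append("warn_child")     # the image is blurred and the child sees a warning
        events.append("notify_parent")  # parents are notified per the opt-in setting
    return events
```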

Apple said the new systems will be released later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, and they will initially be available in the US only.

Both scanning systems concern the open letter’s signers. Regarding the Messages scanner that parents can enable, the letter said:

Algorithms designed to detect sexually explicit material are notoriously unreliable. They are prone to mistakenly flag art, health information, educational resources, advocacy messages, and other imagery. Children’s rights to send and receive such information are protected in the UN Convention on the Rights of the Child. Moreover, the system Apple has developed assumes that the “parent” and “child” accounts involved actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship. That may not always be the case; an abusive adult may be the organizer of the account, and the consequences of parental notification could threaten the child’s safety and well-being. LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk. As a result of this change, iMessages will no longer provide confidentiality and privacy to those users through an end-to-end encrypted messaging system in which only the sender and intended recipients have access to the information sent. Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts and to detect images that are objectionable for reasons other than being sexually explicit.
