
Apple defends new child abuse detection technology against privacy criticism


Following this week’s announcement, some experts believe Apple will soon announce that iCloud will be end-to-end encrypted. If iCloud is encrypted, the company can still identify child abuse material, provide evidence to law enforcement, and suspend offenders’ accounts, which could ease political pressure on Apple executives.

It would not alleviate all the pressure: most of the same governments that want Apple to do more about child abuse also want more action on content related to terrorism and other crimes. But child abuse is a real and major problem, and one that most large tech companies have failed to address.

“Apple’s approach to privacy is better than any other I know of,” says David Forsyth, professor of computer science at the University of Illinois Urbana-Champaign, who reviewed Apple’s system. “I think this system will significantly increase the likelihood that people who own or traffic in [CSAM] are found; that should help protect children. Harmless users should lose little to no privacy, because visual derivatives are revealed only if there are enough matches to known CSAM images, and only for the images that match. The accuracy of the matching system, combined with the threshold, makes it very unlikely that pictures that are not known CSAM images will be revealed.”
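The threshold mechanism Forsyth describes can be illustrated with a toy sketch. This is an illustration only, not Apple’s actual implementation: the real system uses a perceptual hash (NeuralHash) and cryptographic threshold secret sharing so that nothing is revealed server-side until the threshold is crossed, and Apple has not published the real threshold value. The hash values and threshold below are invented for demonstration.

```python
# Toy sketch of threshold-based hash matching (illustration only).
# In the real system, matching happens against perceptual hashes and
# the count is enforced cryptographically; here we just show the
# counting logic behind "nothing is flagged below the threshold".

MATCH_THRESHOLD = 30  # hypothetical value; the real threshold is unpublished

def library_exceeds_threshold(image_hashes, known_hashes,
                              threshold=MATCH_THRESHOLD):
    """Return True only if the number of matches meets the threshold.

    Below the threshold, no individual match is surfaced for review,
    which is why a handful of coincidental matches reveals nothing.
    """
    matches = sum(1 for h in image_hashes if h in known_hashes)
    return matches >= threshold

# Example: 5 matching hashes in a library stays below the threshold.
known = {f"hash{i}" for i in range(100)}
library = [f"hash{i}" for i in range(5)] + ["benign1", "benign2"]
print(library_exceeds_threshold(library, known))  # False: 5 < 30
```

The point of the threshold, as the quote notes, is that occasional false matches by the perceptual hash do not expose anyone; only a sustained collection of matches crosses the line.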

What about WhatsApp?

Every big tech company faces the grim reality of child abuse material on its platform. None has approached it the way Apple has.

Like iMessage, WhatsApp is an end-to-end encrypted messaging platform with billions of users. Like any platform of that size, it faces a major abuse problem.

“I read the information Apple put out yesterday and I’m concerned,” WhatsApp head Will Cathcart tweeted on Friday. “I think this is the wrong approach and a setback for people’s privacy all over the world. People have asked if we’ll adopt this system for WhatsApp. The answer is no.”

WhatsApp includes reporting tools that let any user flag abusive content to the company. Though those tools are far from perfect, they led WhatsApp to report more than 400,000 cases to NCMEC last year.

“This is an Apple-built and -operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control,” Cathcart said in his tweets. “Countries where iPhones are sold will have different definitions of what is acceptable. Will this system be used in China? What content will they consider illegal there, and how will we ever know? How will they handle requests from governments around the world to add other types of content to the list?”

In a statement to reporters, Apple stressed that this new scanning technology is so far being released only in the United States. But the company argued that it has a track record of fighting for privacy and expects to continue doing so. In that way, much of this comes down to trusting Apple.

The company argued that the new systems cannot be misappropriated by government action, and repeatedly stressed that opting out is as easy as turning off iCloud backups.

Despite being one of the most popular messaging platforms on Earth, iMessage has long been criticized for lacking the reporting capabilities that are now standard across the social internet. As a result, Apple has historically reported only a tiny fraction of the cases to NCMEC that companies like Facebook do.

Instead of adopting that solution, Apple has built something entirely different, and the end results raise open, troubling questions for privacy hawks. For others, it is a welcome radical change.

“Apple’s expanded protection for children is a game changer,” NCMEC President John Clark said in a statement. “The reality is that privacy and child protection can go hand in hand.”

High stakes

An optimist would say that enabling full encryption of iCloud accounts while still detecting child abuse material is a win for both abuse prevention and privacy, and perhaps a deft political move that blunts anti-encryption rhetoric from American, European, Indian, and Chinese officials.

A realist would worry about what the world’s most powerful countries will want next. It is a virtual guarantee that Apple will get (and probably already has gotten) calls from capitals as government officials begin to imagine the surveillance possibilities of this scanning technology. Political pressure is one thing; regulation and authoritarian control are another. But that threat is not new, nor is it specific to this system. Given the company’s quiet but profitable dealings with China, Apple has a lot of work to do to convince users of its ability to stand up to draconian governments.

All of the above can be true. What comes next will ultimately define Apple’s new technology. If this feature is weaponized by governments to expand surveillance, the company will not be living up to its privacy promises.


