
Apple says researchers can vet its child safety features. But it’s suing a startup that does just that.


Apple could hand over the code for review, though that is something it says it won’t do. Researchers could also try to reverse-engineer the feature in a “static” way, that is, without running actual programs in a live environment.

The truth, though, is that all of these options share at least one major problem: they don’t let you look at the code running directly on an up-to-date iPhone to see how it actually works in the wild. Instead, these methods rely on trusting that Apple is being open and honest, and that it has written the code without significant errors or oversights.

Another option would be to give members of Apple’s security research device program access to the system so they could verify the company’s statements. But that group, made up of researchers outside Apple, is a very limited program with many rules about what participants can say or do, so it doesn’t necessarily solve the problem of trust.

That leaves only two options for researchers who want to examine something like this on an iPhone. First, hackers can jailbreak old iPhones using a zero-day vulnerability. This is difficult and expensive, and it can be shut down by a security patch.

“Apple has spent a lot of money trying to keep people from jailbreaking their phones,” Thiel explained. “They’ve specifically hired people from the jailbreaking community to make jailbreaking more difficult.”

Or a researcher can use a virtual iPhone with Apple’s security features disabled. In practice, that means Corellium.

There are still limits to what security researchers will be able to observe, but a researcher would be able to detect whether scanning is being extended beyond photos shared to iCloud.

However, if material other than child abuse imagery were ever added to the databases, that would be invisible to researchers. To address that concern, Apple says it will require that two child protection organizations in different jurisdictions both have the same CSAM image in their databases before it is included. But it has provided few details about how this would work, who would run the databases, which jurisdictions would be involved, and what the ultimate sources of the databases would be.
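The safeguard Apple describes boils down to an overlap rule: an image hash is only eligible for the on-device database if two independent organizations, in separate jurisdictions, both vouch for it. As a rough sketch of that idea (the organization names and hash values below are invented, and the real system operates on blinded perceptual-hash sets rather than plain strings), the rule is essentially a set intersection:

```swift
// Toy illustration only: real CSAM hash databases are encrypted, blinded, and far
// larger, and the providers named here are placeholders, not the actual organizations.
let jurisdictionAHashes: Set<String> = ["1a2b3c", "4d5e6f", "7a8b9c"]
let jurisdictionBHashes: Set<String> = ["4d5e6f", "7a8b9c", "0f1e2d"]

// Only hashes supplied by both organizations, in separate jurisdictions,
// would make it into the database shipped to devices.
let deployableHashes = jurisdictionAHashes.intersection(jurisdictionBHashes)

print(deployableHashes) // ["4d5e6f", "7a8b9c"], in some order
```

In this sketch, a hash is only deployable when both organizations independently supply it, which is the property Apple argues would prevent any single government or organization from slipping non-CSAM content into the list. The open questions in the article, such as who runs the databases and which jurisdictions count as independent, are exactly the details this toy version leaves out.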

Thiel noted that the child abuse material problem Apple is trying to address is real.

“It’s not a theoretical concern,” Thiel says. “It’s not something that people raise merely as an excuse to implement surveillance. It’s a real problem that is widespread and needs to be addressed. The solution is not to do away with mechanisms of this kind; it’s to make them as impervious as possible to future abuse.”

But, says Corellium’s Tait, Apple is trying to be both locked down and transparent at the same time.

“Apple is trying to have its cake and eat it too,” says Tait, a former information security specialist at GCHQ, the British intelligence service.

“With their left hand, they make jailbreaking hard and sue companies like Corellium to prevent them from existing. Now with their right hand, they say, ‘Oh, we’ve built this really complicated system, and it turns out some people don’t trust that Apple has done it honestly, but that’s okay, because any security researcher can go ahead and verify it for themselves.’”

“I’m sitting here thinking, what do you mean they can just do this? You’ve designed your system so that they can’t. The only reason people can do things like that is in spite of you, not because of you.”

Apple did not respond to a request for comment.
