
Apple says researchers can test its child safety features. It's suing a startup that does just that.


Apple could release the code for review, though it says that is not something it plans to do. Researchers could also try to reverse engineer the feature “statically,” that is, without running the actual software in a live environment.

In practice, though, all of these options share at least one problem: none of them lets researchers watch the code run in real time on an up-to-date iPhone to see how it actually behaves in the wild. Instead, they still depend not only on Apple’s openness and honesty, but also on Apple having written the code without any significant bugs or oversights.

Another option would be to let members of Apple’s Security Research Device Program verify the company’s claims. But that program admits only a small, exclusive group of outside researchers, and it comes with so many rules about what participants can say and do that it doesn’t necessarily solve the trust problem.

That really leaves only two options for researchers who want to look inside an iPhone for this kind of work. First, they can jailbreak older iPhones using a zero-day vulnerability, which is tricky, expensive, and can be shut down by a security patch.

“Apple has spent a lot of money trying to keep people from hacking their phones,” Thiel explains. “They specifically hired people from the hacking community to make hacking more difficult.”

Or a researcher can use a virtualized iPhone on which Apple’s security protections can be switched off. In practice, that means Corellium.

There are still limits on what any security researcher could observe this way, but a researcher could at least determine whether the scanning reaches beyond photos that are being uploaded to iCloud.

If non-CSAM material were slipped into the hash databases, however, researchers would have no way to notice. Apple says it guards against this by requiring that two separate child protection organizations in different jurisdictions both have the same CSAM image in their databases before its hash is included. But the company has offered few details about how that would work, who would manage the databases, which jurisdictions would be involved, and what the ultimate sources of the database entries would be.
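To make that requirement concrete, here is a minimal sketch, in Python, of the kind of cross-check Apple describes: a hash only becomes eligible for the on-device database if it is reported independently by at least two child protection organizations. The organization names and hash values below are hypothetical placeholders; Apple’s real pipeline uses NeuralHash perceptual hashes and a blinded, encrypted database, none of which is reproduced here.

    # Hypothetical illustration only: real CSAM databases are not plain hex
    # strings, and Apple blinds the database before it ships on devices.
    def eligible_hashes(org_a_hashes, org_b_hashes):
        """Keep only hashes reported independently by both organizations."""
        return set(org_a_hashes) & set(org_b_hashes)

    # Made-up example data from two fictional organizations in different
    # jurisdictions; only the hash they both report would be included.
    org_a = {"a3f19bc2", "77de0140", "c9b85e11"}
    org_b = {"77de0140", "e4015f22"}

    print(eligible_hashes(org_a, org_b))  # {'77de0140'}

The intent of such an intersection, as Apple describes it, is that no single organization or jurisdiction could unilaterally add an image that would be flagged on users’ devices.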

Thiel points out that the child sexual abuse material problem Apple is trying to solve is real.

“This is not a theoretical problem,” Thiel says. “It’s not something people just use as an excuse to carry out surveillance. It’s an urgent, widespread problem that needs to be addressed. The solution is not to get rid of mechanisms like this, it’s to make them as resistant to future abuse as possible.”

But, according to Corellium’s Tait, Apple is trying to be both closed and transparent at the same time.

“Apple is trying to have its cake and eat it too,” said Tait, a former information security specialist at the UK intelligence agency GCHQ.

“With their left hand, they make jailbreaking harder and sue companies like Corellium to keep them from doing it. Then with their right hand, they say, ‘Oh, we’ve built this really complicated system, and it turns out some people don’t trust that Apple has done it honestly. But that’s okay, because any security researcher can go ahead and prove it to themselves.’”

“And I’m sitting here thinking, you mean they can just do that? You’ve designed your system so that they can’t. The only reason people can do these things is in spite of you, not because of you.”

Apple did not respond to a request for comment.
