Following the announcement this week, some experts believe Apple will soon announce that iCloud will be encrypted. If iCloud is encrypted, but the company can still identify child abuse material, pass evidence to law enforcement and suspend the perpetrator, this could partly ease political pressure on Apple executives.
It won’t ease all the pressure, though: most of the same governments that want Apple to do more about child abuse also want more action on content related to terrorism and other crimes. But child abuse is a real and serious problem that big tech companies have so far failed to address.
“Apple’s privacy approach is better than any other that I know of,” says David Forsyth, chair of the computer science department at the University of Illinois Urbana-Champaign, who has studied the Apple system. “In my opinion, this system is likely to significantly increase the likelihood that people who own or sell [CSAM] are found; this should help protect children. Innocuous users should experience minimal or no privacy loss, because visual derivatives are revealed only if there are enough matches with CSAM images, and only for images that match known CSAM images. The accuracy of the matching system, combined with the threshold, makes it very unlikely to reveal images that are not known CSAM images.”
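The threshold mechanism Forsyth describes can be illustrated with a deliberately simplified sketch. The hash values, database, and threshold below are invented for illustration; Apple's real system uses a perceptual hash (NeuralHash) and a cryptographic private set intersection protocol, neither of which is modeled here:

```python
# Simplified illustration of threshold-gated matching (NOT Apple's actual
# protocol): each uploaded image is reduced to a hash and compared against a
# database of known-CSAM hashes, but matches become visible for review only
# once their count crosses a threshold.

KNOWN_HASHES = {"h1", "h2", "h3"}  # stand-in for the known-CSAM hash database
THRESHOLD = 2                      # matches required before anything is revealed

def scan_account(image_hashes, known=KNOWN_HASHES, threshold=THRESHOLD):
    """Return the matched hashes only if the match count meets the threshold."""
    matches = [h for h in image_hashes if h in known]
    if len(matches) >= threshold:
        return matches  # threshold met: matches are revealed for review
    return []           # below threshold: nothing is revealed

print(scan_account(["h1", "x", "y"]))   # one match, below threshold -> []
print(scan_account(["h1", "h2", "x"]))  # two matches -> ['h1', 'h2']
```

The point of the threshold is that a single accidental collision reveals nothing; only an accumulation of matches against the known database triggers any disclosure.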
What about WhatsApp?
Every major tech company faces the terrifying reality of child abuse material on its platform. None has approached it the way Apple has.
Like iMessage, WhatsApp is an end-to-end encrypted messaging platform with billions of users. Like any platform of that size, it faces a serious abuse problem.
“I read the information Apple put out yesterday and I’m concerned,” WhatsApp chief Will Cathcart tweeted on Friday. “I think this is the wrong approach and a setback for people’s privacy all over the world. People have asked if we’ll adopt this system for WhatsApp. The answer is no.”
WhatsApp includes reporting capabilities so that any user can flag abusive content to WhatsApp. While that process is far from perfect, WhatsApp reported more than 400,000 cases to NCMEC last year.
“This is an Apple-built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control,” Cathcart said in his tweets. “Countries where iPhones are sold will have different definitions on what is acceptable. Will this system be used in China? What content will they consider illegal there, and how will we ever know? How will they manage requests from governments all around the world to add other types of content to the list for scanning?”
In a briefing with reporters, Apple emphasized that the new scanning technology is being released only in the United States for now. But the company pointed to its history of fighting for privacy and said it expects to continue doing so. Much of it comes down to trusting Apple.
The company has argued that the new systems cannot easily be repurposed by government action, and has repeatedly stressed that opting out is as simple as turning off iCloud backups.
Despite being one of the most popular messaging platforms in the world, iMessage has long been criticized for lacking the kind of reporting capabilities that are now common across social media. As a result, Apple has historically reported only a fraction of the cases to NCMEC that companies like Facebook do.
Instead of adopting that approach, Apple has built something entirely different, and the end results are an open and worrying question for privacy hawks. For others, it is a welcome, radical change.
“Apple’s Extended Child Protection is a game changer,” NCMEC President John Clarke said in a statement. “The reality is that privacy and child protection can coexist.”
High stakes
An optimist would say that enabling full encryption of iCloud accounts while still detecting child abuse material is a win for both the fight against abuse and for privacy, and perhaps even a deft political move that blunts anti-encryption rhetoric from American, European, Indian, and Chinese officials.
A realist will worry about what comes next from the world’s most powerful countries. It is a virtual guarantee that Apple will receive, and probably already has received, calls from capitals as government officials begin to grasp the surveillance possibilities of this scanning technology. Political pressure is one thing; regulation and authoritarian control are another. And that threat is neither new nor specific to this system. As a company that has made quiet but lucrative compromises with China, Apple has a lot of work to do to convince users of its ability to stand up to draconian governments.
All of the above may be true. What happens next will ultimately define Apple’s new technology. If governments weaponize this capability to expand surveillance, the company will clearly have failed to deliver on its privacy promises.
…