Apple delays automatic CSAM scan on devices after “feedback” from everyone on Earth


Apple said on Friday it intends to delay the introduction of its plan to requisition customers’ own devices to scan their iCloud-bound photos for illegal images of child exploitation, a concession to the backlash that followed the initiative.

“Previously, we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and to help limit the spread of child sexual abuse material,” the company said in a statement posted on its child safety webpage.

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

We have decided to take additional time over the coming months to collect input and make improvements

Apple last month announced its child safety initiative: adding a nudity-detection algorithm to its Messages chat client to provide a way to control the sharing of explicit images, and running code on customers’ iDevices to detect known child sexual abuse material among photos destined for iCloud storage.
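
The “known content” check is a match against a list of hashes of previously identified images, rather than an attempt to judge new photos on their own. The sketch below is a deliberately simplified, hypothetical illustration of that general idea; Apple’s actual design relies on a perceptual hash (NeuralHash) combined with private set intersection and threshold secret sharing, none of which is modelled here, and the hash list shown is invented for illustration.

```swift
import Foundation
import CryptoKit

// Hypothetical list of hex-encoded SHA-256 digests of known-bad images,
// standing in for the vendor-supplied hash database. Purely illustrative.
let knownBadHashes: Set<String> = [
    "3e23e8160039594a33894f6564e1b1348bbd7a0088d42c4acb73eeaed59c009d"
]

// Returns true if the photo's digest appears in the known-hash list.
// A real system would use a perceptual hash so that resized or
// re-encoded copies still match; an exact cryptographic hash does not.
func matchesKnownList(photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownBadHashes.contains(hex)
}

// In this sketch, only photos queued for cloud upload are checked.
let queuedForUpload: [Data] = []  // placeholder for iCloud-bound photos
let flagged = queuedForUpload.filter { matchesKnownList(photoData: $0) }
print("Flagged \(flagged.count) of \(queuedForUpload.count) queued photos")
```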

These features were slated to debut in the public releases of iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, expected later this month or next. But faced with objections from more than 90 advocacy organizations, Apple chose to put the rollout on hold.

ACLU lawyer Jennifer Granick, via Twitter, called the delay a win for civil liberties. “It’s great that Apple is considering engaging with independent privacy and security experts before announcing its genius plans,” she said. “They should start with end-to-end encryption for iCloud backups.”

Matthew Green, associate professor of computer science at the Johns Hopkins Information Security Institute, urged Apple via Twitter to engage with the tech and political communities, and speak to the public, before deploying new technology.

“It’s not a fancy new Touch Bar,” he said. “It’s a privacy compromise that affects 1 billion users.”

Apple’s stated goal of keeping children safe and preventing the distribution of illegal child sexual abuse material (CSAM) enjoys broad support. Its approach does not. Its explicit-photo intervention in the Messages app has been described as more of a danger to children than a benefit.

And its decision to perform CSAM scanning using device owners’ own computing resources and customer-owned hardware has been widely characterized as an erosion of property rights and a backdoor that will be used for government surveillance and control.

As NSA whistleblower Edward Snowden put it, “Apple plans to erase the line between the devices that work for you and the devices that work for them.”

Apple’s plan also contradicts its own privacy marketing. The Electronic Frontier Foundation, one of dozens of organizations that have expressed concerns about Apple’s plans, highlighted the company’s turnaround, citing the text of its CES 2019 billboard: “What happens on your iPhone, stays on your iPhone.”

“Now that Apple has built [a backdoor], they will come,” wrote EFF deputy executive director Kurt Opsahl in a post last month. “With good intentions, Apple has paved the road to mandated security weakness around the world, enabling and reinforcing the argument that, if the intentions are good enough, it’s acceptable to scan through your personal life and private communications.”

In a statement emailed to The Register, Evan Greer, director of Fight for the Future, condemned Apple’s “spyPhone” proposal.

“Apple’s plan to scan photos and messages on the device is one of the most dangerous proposals from any tech company in modern history,” she said. “Technologically, this is equivalent to installing malware on millions of devices, malware that can easily be abused to cause enormous damage.”

Apple’s plan to scan photos and messages on the device is one of the most dangerous proposals of any tech company in modern history

Apple, rather than actually engaging with the security community and the public, has released a list of frequently asked questions and answers to address the concern that censorious governments will demand access to the CSAM-scanning system to look for politically objectionable images.

“Could governments force Apple to add non-CSAM images to the hash list?” the company asked itself, then replied: “No. Apple would refuse such requests, and our system has been designed to prevent that from happening.”

Apple, however, has not turned down the Chinese government’s demands regarding VPNs or censorship. Nor did it turn down the Russian government’s demands arising from its 2019 law requiring preinstalled Russian apps.

Tech companies consistently declare that they comply with all local laws. So if China, Russia, or the United States passed a law requiring on-device scanning to be adapted to address “national security concerns” or some other plausible cause, Apple’s choice would be to comply or suffer the consequences; it would no longer be able to say, “We cannot perform on-device scanning.” ®


