Policy groups urge Apple to drop iMessage scanning plans and checks for abuse images

More than 90 rights and policy groups around the world published an open letter on Thursday urging Apple to abandon plans to scan children’s messages for nudity and the phones of adults for images of child sexual abuse.

“While these capabilities are intended to protect children and reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the groups wrote in the letter, first reported by Reuters.

The biggest campaign to date over an encryption issue at a single company was organized by the Center for Democracy and Technology (CDT), a non-profit organization based in the United States.

Some overseas signatories in particular are concerned about the impact of the changes in countries with different legal systems, including some already hosting heated disputes over encryption and privacy.

“It’s so disappointing and upsetting that Apple is doing this, because they have been a staunch ally in defending encryption in the past,” said Sharon Bradford Franklin, co-director of the CDT’s Security and Surveillance Project.

An Apple spokesman said the company had addressed privacy and security concerns in a document released on Friday explaining why the complex architecture of the scanning software should resist attempts to subvert it.

The signatories included several groups in Brazil, where courts have repeatedly blocked Facebook’s WhatsApp for failing to decrypt messages in criminal investigations, and where the senate has passed a bill that would require the traceability of messages, which would require somehow marking their contents. A similar law was passed in India this year.

“Our main concern is the consequence of this mechanism, how it could be extended to other situations and other companies,” said Flavio Wagner, president of the independent Brazil chapter of the Internet Society, which signed the letter. “This represents a serious weakening of encryption.”

Other signatories were in India, Mexico, Germany, Argentina, Ghana and Tanzania.

Surprised by the earlier outcry following its announcement two weeks ago, Apple has offered a series of explanations and documents to argue that the risks of false detections are low.

Apple said it would refuse demands to expand the image-detection system beyond photos of children flagged by clearinghouses in multiple jurisdictions, though it has not said it would exit a market rather than comply with a court order.

While most of the objections so far have been about on-device scanning, the coalition’s letter also faults a change to iMessage in family accounts, which would attempt to identify and blur nudity in children’s messages, letting them view it only if parents are notified.

The signatories said the measure could endanger children in intolerant homes or those seeking educational material. More broadly, they said the change would break end-to-end encryption for iMessage, which Apple has staunchly defended in other contexts.

“Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit,” the letter says.

Other groups that have signed include the American Civil Liberties Union, the Electronic Frontier Foundation, Access Now, Privacy International and the Tor Project.

© Thomson Reuters 2021
