
Apple's Plan to Search for Child Sexual Images Concerns Privacy Activists

By Mario Ritter | Published: August 12, 2021

Apple recently announced plans to use a tool designed to identify known child sexual images on iPhones.

The decision was praised by child protection groups. But some privacy activists and security researchers have raised concerns. They warn that the system could be misused to search for other kinds of information or be used by governments to watch citizens.

How does it work?

Apple says the tool, called "NeuralHash," will scan all images kept on the device that are sent to iCloud, the company's online storage system. iPhone users can choose in their settings whether to send photos to iCloud or have them remain on the device. If the images are not sent to iCloud, Apple says they will not be scanned by the new tool.

The system searches for photos included in a database of known child sexual abuse images collected by law enforcement. Apple's scanning system will change the images into a "hash." This is a numerical piece of data that can identify the images but cannot be used to recreate them. This hash will be uploaded and compared against the law enforcement image database.
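The compare-against-a-database step described above can be sketched in a few lines of Python. Note this is only an illustration: the real NeuralHash is a perceptual hash produced by a neural network, so that resized or re-encoded copies of an image still match, while this sketch uses an ordinary cryptographic hash (SHA-256) and made-up placeholder image bytes.

```python
import hashlib

# A stand-in "database" of hashes of known images (hypothetical values).
# Real systems store perceptual hashes supplied by law enforcement groups.
known_hashes = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def hash_image(image_bytes: bytes) -> str:
    """Turn image bytes into a fixed-size numerical fingerprint.
    The hash can identify the image but cannot be used to recreate it."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_database(image_bytes: bytes) -> bool:
    """Compare the image's hash against the database of known hashes."""
    return hash_image(image_bytes) in known_hashes

print(matches_database(b"known-image-bytes"))  # True: the hash is in the database
print(matches_database(b"new-photo-bytes"))    # False: an unrelated photo does not match
```

Because only the fingerprint is compared, the system never needs to "look at" the photo itself, which is why a hash can identify an image without revealing its contents.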

If the system matches an image with one in the database, it will be examined by a human. If the person confirms the image as a match, the device user's account will be locked and the National Center for Missing and Exploited Children (NCMEC) will be contacted.

The system is designed to only identify images already included in the existing database. Apple says parents taking innocent photos of unclothed children need not worry about such images being identified.

Concerns about possible abuse

Some security researchers have criticized the way NeuralHash "sees" the images and say the system could be used for dangerous purposes.

Matthew Green is a top cryptography researcher at Johns Hopkins University. He told the Associated Press that he fears the system could be used to accuse innocent people. An attacker could send users images that seem harmless but that the system would report as child sexual material. Green said researchers have been able to easily fool similar systems in the past.

Another possible abuse could be a government seeking to watch dissidents or protesters. "What happens when the Chinese government says, 'Here is a list of files that we want you to scan for,'" Green asked. "Does Apple say no? I hope they say no, but their technology won't say no."

In an online explanation of its system, Apple said it "will refuse any such (government) demands."

Apple has been under pressure from governments and law enforcement to permit increased observation of data that it encrypts on its devices. The company said its new tool was designed to operate "with user privacy in mind." It also claimed the system was built to reduce the chance of misidentification to one in one trillion each year.

However, some privacy researchers said the system represents a clear change for a company that has been praised for its leadership on privacy and security.

In a joint statement, India McKinney and Erica Portnoy of the Electronic Frontier Foundation warned that Apple's new tool "opens a backdoor to your private life." The two noted that it may be impossible for outside researchers to confirm whether Apple is operating the system as promised.

Apple's system was also criticized by former U.S. National Security Agency contractor Edward Snowden. Snowden lives in exile because he is wanted in the U.S. on spying charges linked to his release of information on secret government programs for gathering intelligence.

He tweeted that with the new tool, Apple was offering "mass surveillance to the entire world." Snowden added: "Make no mistake, if they can scan for kiddie porn today, they can scan for anything tomorrow."

Separately, Apple announced it was adding new tools to warn children and parents when sexually explicit images are received or sent. This system is designed to identify and blur such images and warn children and parents about the content. Apple says the tool will only work for messages in child accounts registered in the company's Family Sharing system.
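The blurring step in that tool can be illustrated with a simple box blur, where each pixel is replaced by the average of its neighbors so that detail is lost. This is only a minimal sketch of the general technique; Apple has not published the exact filter its messaging tool uses.

```python
def box_blur(pixels, radius=1):
    """Blur a grayscale image (a 2D list of 0-255 values) by replacing
    each pixel with the average of its surrounding neighborhood."""
    h, w = len(pixels), len(pixels[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            neighborhood = [
                pixels[ny][nx]
                for ny in range(max(0, y - radius), min(h, y + radius + 1))
                for nx in range(max(0, x - radius), min(w, x + radius + 1))
            ]
            out[y][x] = sum(neighborhood) // len(neighborhood)
    return out

# A sharp edge (0 next to 255) becomes a gradual, unclear ramp after blurring.
image = [[0, 0, 255, 255]] * 4
blurred = box_blur(image)
```

Averaging neighboring pixels smooths out sharp edges, which is why a blurred image looks unclear until the user chooses to view it.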

Apple said the changes will come out later this year with new releases of its device operating systems.

I'm Bryan Lynn.

The Associated Press, Reuters and Apple reported on this story. Gregory Stachel and Bryan Lynn adapted the reports for VOA Learning English. Mario Ritter, Jr. was the editor.

We want to hear from you. Write to us in the Comments section, and visit our Facebook page.

Words in This Story

scan v. to look at (something) carefully usually in order to find someone or something

match n. a person or thing that is equal to another

cryptography n. the use of special codes to keep information safe in computer networks

encrypt v. to change (information) from one form to another especially to hide its meaning

surveillance n. the act of carefully watching activities of people especially in order to control crime or the spread of disease

porn (pornography) n. movies, pictures, magazines, etc., that show or describe naked people or sex in an open and direct way in order to cause sexual excitement

explicit adj. showing or talking about sex or violence in a very detailed way

blur v. to make (something) unclear or difficult to see or remember
