Apple's child protection features get delayed after privacy outcry
2021-09-06     Business Standard - Technology News

       Apple's child protection features, which the company had announced last month, have now been delayed by the tech giant owing to criticism that the changes could diminish user privacy.

       According to The Verge, the outcry was regarding one of the features that would scan users' photos for child sexual abuse material (CSAM). The changes had earlier been scheduled to roll out later this year.

       In a statement to The Verge, Apple said, "Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material."

       The statement further added, "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

       Apple's original press release about the changes, which were intended to reduce the proliferation of child sexual abuse material (CSAM), had a similar statement at the top of the page.

       That release detailed three major changes in the works. One change to Search and Siri would point users toward resources for preventing CSAM if they searched for related information.

       The other two changes came under more significant scrutiny. The first would alert parents when their kids were receiving or sending sexually explicit photos and would blur those images for kids.

       The second one would have scanned images stored in a user's iCloud Photos for CSAM and report them to Apple moderators, who could then refer the reports to the National Center for Missing and Exploited Children, or NCMEC.

       The company detailed the iCloud Photo scanning system at length to make the case that it didn't weaken user privacy. In short, it would scan photos stored in iCloud Photos on a user's iOS device and check them against a database of known CSAM image hashes from NCMEC and other child safety organizations.
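To illustrate the basic idea of matching images against a database of known hashes, here is a minimal sketch. Note that Apple's actual system used a proprietary perceptual hash (NeuralHash) plus cryptographic threshold techniques, none of which are reproduced here; this sketch substitutes a plain SHA-256 file hash and a hypothetical in-memory hash set purely for illustration.

```python
import hashlib

# Hypothetical database of known image hashes (placeholder value:
# this entry happens to be the SHA-256 digest of the bytes b"test").
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(data: bytes) -> str:
    """Hex digest of the image bytes (SHA-256 stands in for a perceptual hash)."""
    return hashlib.sha256(data).hexdigest()

def matches_known_database(data: bytes) -> bool:
    """True if the image's hash appears in the known-hash database."""
    return file_hash(data) in KNOWN_HASHES

print(matches_known_database(b"test"))   # True (placeholder hash matches)
print(matches_known_database(b"other"))  # False
```

A real perceptual hash, unlike SHA-256, is designed so that visually similar images (resized, re-encoded) still produce matching digests; that robustness is what distinguished Apple's proposal from simple exact-file matching.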

       Still, several privacy and security experts heavily criticized Apple for the new system, arguing that it could have created an on-device surveillance system and that it violated the trust users had put in Apple for protecting on-device privacy.

       As per The Verge, in an August 5 statement, the Electronic Frontier Foundation said that the new system, however well-intended, would "break key promises of the messenger's encryption itself and open the door to broader abuses."

       (Only the headline and picture of this report may have been reworked by the Business Standard staff; the rest of the content is auto-generated from a syndicated feed.)

