U.S. judge rules Facebook must turn over closed accounts that fed Myanmar genocide
2021-09-24     The Washington Post, Washington, D.C.

       A federal judge ordered Facebook on Wednesday to turn over data related to accounts it deleted in 2018 that fueled Myanmar’s genocidal crackdown on Rohingya Muslims, issuing one of the first legal rulings establishing U.S. technology giants’ obligations after they remove content on their platforms.

       U.S. Magistrate Judge Zia M. Faruqui of Washington ruled that once providers permanently take down accounts, federal privacy law does not apply to user information. The sweeping opinion is not binding as precedent and can be appealed, but it could have wide impact on investigators seeking content that has been removed by social media services, such as in the Jan. 6 mob attack on the Capitol, analysts said.

       The nation of Gambia asked a U.S. court in June 2020 to compel Facebook to release records related to accounts set up by Myanmar military leaders, police agencies and others that Facebook shut down two years earlier, seeking evidence of “genocidal intent” in a human rights claim brought by the West African country in international court.

       Facebook objected, citing user privacy laws. The judge called that stance “rich with irony” given the company’s own privacy scandals. The judge also challenged the company “to live up to its words” about the need to remediate its role in Myanmar’s violent purge in 2017 of 750,000 Muslim men, women and children in a campaign of rape, murder and razed villages.

       “Locking away the requested content would be throwing away the opportunity to understand how disinformation begat genocide,” Faruqui wrote. “Failing to do so here would compound the tragedy that has befallen the Rohingya.”

       Experts said the decision increases pressure on social media companies and U.S. policymakers to address the viral spread of disinformation on private sites over matters ranging from Russian interference in the 2016 U.S. presidential election to the coronavirus pandemic to what a U.N. mission in Myanmar’s case called a “carefully crafted hate campaign” targeting a religious minority. The decision suggested that user privacy cannot be invoked as an excuse to withhold data a provider has deleted but preserved, or to avert a reckoning for events that caused societal harm.

       With more than 2 billion users, Facebook has faced a firestorm over the weaponization of its site by bad actors, with a Wall Street Journal investigation reporting last week that employees raised alarms about the company’s limited efforts to check misuse it knew about in developing countries, including by human traffickers in the Middle East and Ethiopian armed groups inciting violence. The company has shifted its strategy from public apologies and promises of transparency to aggressively reshaping its image, including pushing pro-Facebook items in its news feed, the New York Times reported on Tuesday.

       In a statement, Facebook spokeswoman Emily Cain said the company was reviewing the decision.

       “We remain appalled by the atrocities committed against the Rohingya people in Myanmar and support justice for international crimes. We’ve committed to disclose relevant information to authorities, and over the past year we’ve made voluntary, lawful disclosures to the IIMM [the U.N. investigation into Myanmar] and will continue to do so as the case against Myanmar proceeds,” she said in the statement.

       University of California at Berkeley law professor Orin Kerr said Wednesday’s opinion probes key fault lines among U.S. courts, which have issued divergent opinions over how privacy laws should apply to information held by social media and cloud computing service providers. The opinion, Kerr said, highlighted the need for higher courts and Congress to update the 35-year-old Stored Communications Act.

       The law, enacted to establish privacy protections in electronic communications, sets limits on what information communications and cloud computing service providers can disclose about their customers and subscribers.

       Courts now generally hold that the government requires a search warrant to access the content of unopened emails, but they remain split over basic questions such as whether the law’s protections apply to opened emails or backup copies stored by companies.

       Rethinking the privacy laws

       Faruqui’s opinion makes a provocative ruling: the law’s protections stop applying once Internet providers delete files and no longer make them available to users.

       Facebook has frequently cited user privacy as a reason to not share data that would be relevant to the public interest. In 2017, it was initially reluctant to share information with congressional investigators looking into a Russian disinformation campaign on the platform because it said that doing so would compromise user privacy. Eventually the company shared some of that data in response to public and congressional pressure.

       This summer, it cut off New York University researchers from access to data that would have enabled them to study the role Facebook pages played in promoting the Jan. 6 attack on the Capitol.

       The company said allowing the NYU researchers to scrape data by using a widget that was downloaded by volunteers would compromise its 2019 privacy settlement with the Federal Trade Commission. The agency dismissed Facebook’s arguments in a rare response, saying that Facebook’s interpretations of its privacy obligations under the settlement were “inaccurate” and that the agency hoped Facebook was not using privacy as a “pretext to advance other aims.”

       Kerr said his initial reaction was that the judge is wrong that de-platformed but preserved records don’t qualify as protected “backup copies,” but he said the decision adds urgency for the Supreme Court to weigh in on a law that it has never interpreted.

       “If someone’s personal messages are now open for inspection because a tech company deleted their account, then it’s a big hole in the statute: An Internet provider can just delete the account and release all of someone’s private messages. I would have thought those messages were still protected,” Kerr said. “The U.S. Supreme Court really needs to start explaining what this statute means, because lower courts are dividing over how to interpret the law, and you just need some certainty.”

       Facebook in August 2018 began deleting and banning accounts of key individuals and organizations in Myanmar, acknowledging that its platform was used to “foment division and incite offline violence” that a U.N. mission found colossal in scale.

       A company review concluded that in a country where effectively “Facebook is the Internet,” nearly 12 million people followed pages, groups and accounts that engaged in “coordinated inauthentic behavior” and were set up by users including the commander in chief of Myanmar’s armed forces, the military’s television network and “independent” sources surreptitiously controlled by government officials.

       Using fake accounts and news pages, groups spread hate speech casting Rohingya, Muslims and their defenders as illegal immigrants, terrorists and an existential threat to Myanmar and to Buddhism, leading to mob violence, a U.N. investigation concluded.

       Facebook preserved the content it deleted, sponsored an independent internal review, and said in an official statement by a company representative: “We know we need to do more to ensure we are a force for good in Myanmar.” But it opposed Gambia’s request last year to cooperate with the International Court of Justice probe, calling lawyers’ request overbroad, unduly intrusive and a step that would yield “special and unbounded access” to user accounts.

       Gambia, represented by the Miller & Chevalier law firm, asked for documents about how Facebook identified the content for deletion, and public and private communications associated with the deleted content.

       Africa’s smallest country plays an outsize role in international affairs. The majority-Muslim nation’s claim against Myanmar was backed by the 57-member Organisation of Islamic Cooperation. Its justice minister, who helped investigate neighboring Rwanda’s 1994 genocide, has said he was motivated by visiting Rohingya refugee camps.

       In granting the request, Faruqui quoted Facebook’s own review calling content moderation “one of the most pressing challenges of our time” and its advertisements supporting updated Internet regulations.

       “Facebook can act now,” he wrote in a 32-page opinion. “A surgeon that excises a tumor does not merely throw it in the trash. She seeks a pathology report to identify the disease.”

       Facebook argued that under the interpretation of the law accepted by the court, any time a provider deactivates a user’s account for any reason, a user’s communications would become available for disclosure to anyone.

       The company said it preserved data as recommended by the review it commissioned, but argued that civil litigation was not the appropriate channel to pursue the information. Facebook has worked with U.N. investigators on Myanmar, calling that cooperation a way to voluntarily assist international accountability efforts, including Gambia’s case at the ICJ.

       But the court concluded that the law already permits providers to make unilateral determinations about “disclos[ing] records, information, and contents of accounts”; that law enforcement can already access content using search warrants; and that the narrow category of requested content raised minimal concerns.

       Users can permanently delete their own communications, making them generally unrecoverable, or join sites that do not moderate content, the judge noted. Faruqui added that the opinion does not apply to content “in purgatory” — de-platformed by a provider but not yet subject to a final decision about permanent deletion.

       Human rights groups say Congress should require social media companies to share relevant information with official bodies and litigants in international courts to deter state actors from engaging in criminal conduct, including genocide and mass atrocity crimes.

       If a House committee investigating the Jan. 6 attack is interested in the case, it won’t have to go far for legal background. One of Gambia’s Washington-based attorneys, Marcus Childress, joined the committee staff effective Monday.
