'Stop Lying': Muslim Rights Group Sues Facebook Over Claims It Removes Hate Groups
For eight years, the civil rights organization Muslim Advocates has reported scores of examples of bigotry and hate promoted across Facebook. It has published reports and met privately with CEO Mark Zuckerberg and his number two, Sheryl Sandberg, about its concerns.
Frustrated with what it sees as a lack of progress, Muslim Advocates on Thursday filed a consumer protection lawsuit against Facebook, Zuckerberg and Sandberg, among other executives, demanding the social network start taking anti-Muslim activity more seriously.
The suit alleges that statements made by the executives about the removal of hateful and violent content have misled people into believing that Facebook is doing more than it actually is to combat anti-Muslim bigotry on the world's largest social network.
"What we're saying in the lawsuit to Facebook is, 'Do one of two things: Stop lying, or have your actions conform to your statements,'" Muslim Advocates lawyer Mary Bauer said.
The suit cites research from Elon University professor Megan Squire, who found that anti-Muslim bias serves "as a common denominator among hate groups around the world" on Facebook.
Squire, in 2018, alerted the company to more than 200 anti-Muslim groups on its platform. According to the suit, half of them remain active.
A year earlier, Muslim Advocates provided Facebook a list of 26 anti-Muslim hate groups. Nineteen of them remain active today, according to the suit. Those include groups with obvious names like "Anti-Islam Movement," "Purge Worldwide," and "Islam is Pure Evil."
None of the groups has many followers; the largest ones cited have several thousand members, and others appear to be inactive. But Muslim Advocates says Facebook should have shut them down long ago.
Lawyers for Muslim Advocates say Facebook's passivity flies in the face of statements Zuckerberg has made to Congress that if something runs afoul of Facebook's rules, the company will remove it.
"We do not allow hate groups on Facebook overall. So if there is a group that their primary purpose or a large part of what they do is spreading hate, we will ban them from the platform overall," Zuckerberg told Congress in 2018.
Facebook's Community Standards ban hate speech, violent and graphic content and "dangerous individuals and organizations," like an organized hate group.
In its lawsuit, Muslim Advocates says those standards are not enforced in practice, arguing that there is a "systemic problem of anti-Muslim hate" on Facebook.
"This is not, 'Oh a couple of things are falling through the cracks,'" Bauer said. "This is pervasive content that persists despite academics pointing it out, nonprofits pointing it out. Facebook has made a decision to not take this material down."
Facebook says that in the last quarter of 2020, it took action on 6.4 million instances of organized hate and nearly 27 million pieces of hate speech content.
A Facebook spokesman said the company does not allow anti-Muslim hate on the platform.
"We do not allow hate speech on Facebook and regularly work with experts, non-profits, and stakeholders to help make sure Facebook is a safe place for everyone, recognizing anti-Muslim rhetoric can take different forms," the Facebook spokesman said. "We have invested in AI technologies to take down hate speech, and we proactively detect 97 percent of what we remove."
The lawsuit is asking a judge to declare the statements made by Facebook executives about its content moderation policies fraudulent misrepresentations.
It seeks an order preventing Facebook officials from making such remarks. Muslim Advocates also asks the court to make Facebook post "corrective advertising" about its content-policing practices across the platform and pay monetary damages.
"A corporation is not entitled to exaggerate or misrepresent the safety of a product to drive up sales," Muslim Advocates lawyer Bauer said. "They have made false statements to placate civil society groups and to convince Congress that there's no need for regulation because they've dealt with their problem."
Since 2013, officials from Muslim Advocates have met with Facebook leadership, including Zuckerberg, "to educate them about the dangers of allowing anti-Muslim content to flourish on the platform," the suit says. But in the group's view, Facebook never lived up to its promises. Had the company done so, the group alleges in the lawsuit, "it would have significantly reduced the extent to which its platform encouraged and enabled anti-Muslim violence."
Julia DeCook, a professor at Loyola University Chicago who studies online extremism, said she is not surprised Facebook did not take a harder approach against the content, noting that posts that strike a nerve tend to keep people on social media, boosting the companies' bottom lines.
"The platforms have no basic monetary motivation to actually implement these changes because, frankly speaking, hate speech is profitable for them," DeCook said. "So it's more about image management and actually living up to these values that they claim to espouse."
The suit also names Facebook public policy executives Joel Kaplan and Kevin Martin as defendants for allegedly aiding and abetting misleading statements in helping Zuckerberg and Sandberg prepare remarks to Congress.
Anti-Muslim events planned on Facebook
Muslim Advocates tried over the years to draw Facebook's attention to anti-Muslim events, in particular, that were being organized on the platform.
In the lawsuit, the group says it told Facebook that a militia group, the Texas Patriot Network, was using the platform to organize an armed protest at a Muslim convention in Houston in 2019. It took Facebook 24 hours to take the event down. The Texas Patriot Network is still active on the social network.
The suit also references an August 2020 event in Milwaukee, Wis. People gathered in front of a mosque and yelled hateful, threatening slurs against Muslims. It was broadcast live on Facebook. The video was removed days later, after Muslim Advocates alerted Facebook to the content.
The suit points to reporting by BuzzFeed that showed that an event connected to the group Kenosha Guard had been flagged to Facebook more than 400 times, but it was never removed by the company. An Illinois teenager was arrested for fatally shooting two protesters in Kenosha, Wis., after a Facebook group encouraged attendees to "take up arms."
Facebook published a civil rights audit in July 2020 examining hate speech on the social network, including bias and violence directed at Muslims. It pointed to the Christchurch mass shooting in New Zealand, which left 51 people dead. The shooter live-streamed the massacre on Facebook.
"Civil rights advocates have expressed alarm," the outside auditors wrote. "That Muslims feel under siege on Facebook."
Editor's note: Facebook is among NPR's financial supporters.
Copyright 2021 NPR. To see more, visit https://www.npr.org.