Facebook, Twitter And TikTok Say Wishing Trump's Death From COVID-19 Is Not Allowed
Updated 5:50 p.m. ET Monday
As reaction to President Trump's positive coronavirus test floods social media, Facebook, Twitter and TikTok have a message to users: Wishing for the president to die is not allowed.
All three tech companies confirmed that such posts will be removed for violating each platform's content policies.
As moderators scramble to pull down posts expressing hope that Trump succumbs to the virus, wild speculation, conspiracy theories and other falsehoods about the president and first lady's positive COVID-19 tests have been surging on the platforms, with each company making dicey calls about what is permissible and what crosses a line.
On Trump death wishes, though, the companies fell in line.
A Facebook spokesperson said posts wishing for Trump's death — including comments on the president's pages and posts that tag him — will be taken down from the social network.
Twitter said tweets that "wish or hope for death, serious bodily harm or fatal disease" against anyone, including the president, will be pulled off the platform. Twitter says such "abusive" behavior can lead to an account being suspended.
A spokesperson for TikTok told NPR that cheering on Trump's death runs afoul of the short-form video app's rules, saying such content "would be a violation of our community guidelines and removed if we find that."
Leah McElrath, a left-leaning activist and writer who has been harassed and targeted with threats on Twitter, said she would like to see the same protection applied to regular users like herself.
"We've been asking them to do this for years. Jewish people have been asking. Black people have been asking, Women of every ethnicity have been asking, and they haven't," McElrath said.
Last month, someone made a death threat against McElrath on Twitter. She reported it, but it remained up for a week before Twitter removed it. Frustrated with the platform's inaction, she has stopped reporting threatening tweets against her.
"So what I started doing was blocking immediately. As soon as someone said something that wasn't in good faith or was overtly abusive, I'd just block them," she said.
McElrath said she has blocked more than 7,000 users, not just for saying mean things but for wishing harm on her.
"Trump is the president. He already has a lot of protection," she said. "The reality is that women like me, and black women before me, have been telling Twitter this is a huge issue and a problem that makes the platform really challenging to stay on."
She joins legions of others, many of them women, who are frustrated with the company's tough stance on posts about Trump.
Rashida Tlaib, a Democratic congresswoman from Michigan and member of the so-called Squad, a group of progressive Democratic women of color, tweeted that Twitter being on high alert over Trump death threats is "messed up."
"The death threats towards us should have been taking more seriously," Tlaib wrote.
A federal law, Section 230 of the Communications Decency Act, allows tech companies to set their own rules for what is and is not allowed to be posted to their platforms.
UCLA professor Sarah Roberts, who studies online content moderation, said abusive behavior should be policed on the platforms, but she acknowledged that enforcement has long been too lax for those who are not in the public spotlight.
"There's been an outpouring of regular people saying, 'I've been seeing death threats, rape threats, doxxing for years,' and there's been a complete lack of action," Roberts said. "It's much easier to go and find instances against a major public figure like Trump and erase it than it is with the everyday degradation against everyone else."
Roberts said she is sympathetic to how difficult decisions about such content can be, since "the social media companies don't have a playbook to go by," adding that "those deciding what content to remove are often some of the least empowered workers, lowest on the hierarchy, dealing in real time with the world's bad behavior."
Officials at Twitter say they are listening to concerns that content moderation decisions are not evenly applied and that the company is committed to making its content policies more equitable.
"It's clear we still have so much to do to make Twitter safer and more inclusive,"said Twitter's Vijaya Gadde in a tweet. "The criticism pushes me and our entire team to be better."
Copyright 2020 NPR. To see more, visit https://www.npr.org.