The National Memo


Monday, December 09, 2019

Why Facebook Should Ban Trump Permanently

Reprinted with permission from Media Matters

Facebook has done the bare minimum once again regarding Donald Trump's account, postponing real action for two years. But anything short of a full ban rings hollow. Trump spent years using Facebook to push misinformation and spread extreme rhetoric against his critics -- particularly during his presidency -- and it should be easy for the platform to do the responsible thing and permanently ban his account.

On June 4, Facebook announced plans to suspend Trump for two years, leaving open the possibility of his return "if conditions permit." But if Trump is let back on, he will by all indications continue to abuse the platform to spread misinformation and attack others, as he's done for years. A two-year suspension — just one election cycle — is unlikely to change that.

Media Matters previously reported that roughly 24 percent of Trump's posts between January 1, 2020, and January 6, 2021, contained either misinformation, content warranting an additional information label, or harmful rhetoric about others. Based on his previous habits, we estimate that his two-year suspension will keep at least 2,800 posts with misinformation or extreme rhetoric off the platform, including at least 700 posts that would likely have contained election misinformation.

Facebook has let Trump abuse its platform for years. His 2016 campaign was bolstered on Facebook by fake accounts, and the campaign used data illegally obtained from the platform. And things have not improved since. As our data on his 2020 posts shows, the 24 percent of his posts that pushed misinformation or extreme rhetoric earned 331.6 million interactions. Facebook's meager attempts to rein in Trump's lies have not been effective, and in some cases -- such as with the platform's labeling system -- they may have backfired.

Facebook's refusal to permanently ban Trump is unsurprising, as he and his extreme rhetoric are good for business. According to Pathmatics data analyzed by Media Matters, the Trump campaign spent roughly $121.5 million on Facebook ads in 2020, earning over 16 billion impressions. The only companies that spent more on Facebook ads in 2020 were Disney and HBO. And this figure includes only the money the campaign gave directly to Facebook; Trump also directed users to other misinformation spreaders that make Facebook money, putting him at the center of Facebook's lucrative right-wing misinformation ecosystem.

Despite at least one Facebook executive's claims, divisive content garners high engagement, and engagement drives Facebook's profits. The ecosystem propped up by Trump is lucrative, so it is predictable – though still disappointing – that Facebook is hesitant to upset its profit producers.

Trump's Facebook and Instagram pages remain visible, as they have throughout his suspension, and his old content continues to garner new engagement. It's unacceptable that Facebook has left up content featuring the very behaviors (spreading misinformation, inciting violence) that got him kicked off the platform in the first place, and it's emblematic of Facebook's careless content moderation.

Facebook has done everything it could to avoid taking responsibility for Trump's misuse of the platform, and this approach continues in its unwillingness to commit to permanently preventing his return. Now Facebook has all but guaranteed that Trump will make headlines again when his suspension is reevaluated. The cowardly decision is completely antithetical to the platform's stated mission – "to give people the power to build community and bring the world closer together." By failing to ban Trump outright, despite the damage he has done, Facebook has shown its true mission is increasing profit regardless of the costs.

While Facebook Reconsiders Trump Account, He’s Still Promoting Lies

Reprinted with permission from Media Matters

EDITOR'S UPDATE: On Wednesday morning, the Facebook Oversight Board reaffirmed the social media behemoth's suspension of former President Donald J. Trump, but criticized the "indeterminate suspension without clear standards." The board instructed the company to review the decision within six months while establishing a "proportionate response" to Trump's violations.

Former President Donald Trump has been suspended from Facebook for 118 days — potentially keeping hundreds of posts containing misinformation or harmful rhetoric off the platform. Without access to Facebook, Trump has turned to alternate forms of communication to deliver more of his same lies about the election that helped ignite an armed insurrection at the U.S. Capitol on January 6.

But on Wednesday morning, the Facebook Oversight Board will announce its decision on reinstating his account. If the board allows Trump back on the platform, it will likely embolden the former president and give him an even bigger platform to spread these harmful lies.

Trump — who is banned on Twitter as well — has not been silent without his social media accounts, nor has he been remorseful. On Monday morning, Trump published a press release via his Save America PAC that clearly telegraphed the false, divisive, and dangerous rhetoric he would likely amplify and share on Facebook if the board reinstates his account.

The press release reads: "The Fraudulent Presidential Election of 2020 will be, from this day forth, known as THE BIG LIE!" — a claim Trump reiterated last month on Newsmax, where he called the election "rigged" and "stolen." Two weeks prior, he was on Fox News claiming the "Supreme Court and our courts didn't have the courage to overturn elections that should have been overturned." If his post-election media appearances and statements are any indication, Trump will likely use Facebook to spread the same false messaging about the election if he is allowed back on the platform.

Media Matters previously reported that Trump pushed election misinformation in 363 posts, or six percent of his total posts between January 1, 2020, and January 6, 2021. Based on his previous habits, we estimate that the full duration of his suspension (119 days by tomorrow's decision) kept approximately 463 posts with misinformation or extreme rhetoric off the platform, including roughly 116 posts that would likely contain election misinformation.

Throughout his presidency, Trump used social media to spread dangerous, hateful lies, and social media companies did nothing to stop it. This culminated in the events of January 6, when Trump used his Facebook page to encourage the Capitol rioters, who were spurred on by his months-long barrage of false election fraud claims. Now, Trump's press releases and media interviews could not be any clearer: He is doubling down on the lie that the election was stolen.

If the Facebook Oversight Board allows Trump back on the platform, it will be enabling him to continue the exact same behaviors that got him suspended in the first place -- spreading lies and encouraging violence.

Research contributions from Kayla Gogarty

Deceptive Anti-Vax Propaganda Still Proliferating On Facebook

Reprinted with permission from Media Matters

Facebook removed an anti-vaccine group that had already amassed more than 125,000 members, but this move will do little more than inconvenience the group, as members had already set up alternate channels of communication. What's more, this group is only one of over 100 active Facebook groups that contain harmful anti-vaccine misinformation.

On April 22, Facebook removed a large private group dedicated to gathering stories of people allegedly injured by the COVID-19 vaccine. According to the company, the group was removed because it had violated Facebook's harmful misinformation policies. A spokesperson told BBC News, "We allow people to discuss the COVID-19 vaccines but if information could lead to someone being harmed, we remove it."

On its face this seems like a step in the right direction for a platform that has been negligent in its response to dangerous COVID-19 misinformation. However, the group amassed more than 125,000 followers before it was taken down, and they are already successfully reorganizing, setting up a Telegram group and a new social media platform.

The removed group, COVID 19 VACCINE VICTIMS AND FAMILIES, was created on March 29, 2021, and its "About" section stated, "The idea of this group is for victims families to unite and for the victims stories to be heard so we can get justice." The group grew rapidly, particularly in the four days before removal, when it gained an average of over 12,500 members per day. Members would share anecdotes about friends and relatives receiving the COVID-19 vaccine and falling ill, claiming the vaccine was the cause.

Before the Facebook takedown, group members began expressing concerns about possible removal and preparing to organize in other spaces. Invites to a Telegram group, set up on April 5, are still prominently displayed on Facebook, with instructions for users to join the channel in order to circumvent Facebook action. On April 18, the group created a new social media platform, modeled after Facebook but explicitly for "vaccine victims." That same day, a Facebook user posted that the group had been "suspended" but said, "I think you can still join the fb group and read the stories before it's removed." The group has also launched a new Facebook group under a pseudonym. In less than a day, the new group gained roughly 1,100 members, and at the time of publishing, it has roughly 4,000 members.

Despite Facebook's action against the private group COVID 19 VACCINE VICTIMS AND FAMILIES, there is still ample anti-vaccine misinformation on the platform. Media Matters has identified 117 additional anti-vaccine Facebook groups that are still active on the platform. The roughly 275,000 members of these groups are exposed to harmful anti-vaccine content, and as nearly 80 percent of these groups are private, it is more difficult for Facebook to moderate them.

Of these 117 groups, some explicitly call themselves "anti-vaxx" or "anti-vaccine," while others have names similar to that of the group Facebook removed. Some groups are likely trying to avoid moderation by using deceptive spellings, such as "V@xynes" or "V@ccine."

The three biggest groups, with tens of thousands of members each, are plagued with vaccine misinformation, other COVID-19 misinformation, and conspiracy theories.

Vaccine Education Network : Natural Health Anti-Vaxx Community

This private group with roughly 41,800 members promotes misinformation about the COVID-19 vaccine, including false claims that the vaccine will cause serious medical problems:

[Screenshots of Facebook posts]

MTHFR Connections: Tongue Ties, Autism, V@xynes, Leaky Gut

This private group with roughly 27,700 members is dedicated to pushing a baseless claim that the MTHFR gene causes a harmful reaction to the vaccine. Members in the group promote this baseless claim and provide each other with medical misinformation. Egregious examples include:

[Screenshots of Facebook posts]

JUST FOR THE HELLTH OF IT

With roughly 18,000 members, this private group promotes misinformation about the COVID-19 vaccine and conspiracy theories, and it even has screenshots of posts from the anti-vaccine group that Facebook removed:

[Screenshots of Facebook posts]

Facebook relies on the idea that things will inevitably slip through the cracks to excuse its weak moderation efforts. Vice president of integrity Guy Rosen ended a March blog post about misinformation by noting that Facebook's "enforcement will never be perfect" and that "nobody can eliminate misinformation from the internet entirely." However, as this example shows, these groups make little effort to hide where followers can find anti-vaccine misinformation, on the platform and off. Anti-vaccine misinformation on Facebook is not buried, and the way anti-vaccine advocates evade moderation is not a secret. Facebook's poor content moderation is inexcusable, and although the removal of one large group is a good first step, this latest lackluster effort is not a replacement for sufficient moderation.

Research contributions from Carly Evans and Kellie Levine