Far-Right Groups Using Social Media To Plan And Promote US Trucker Convoy

As Canada’s capital Ottawa declares a state of emergency in the second week of disruptive protests by truck drivers opposing the country's vaccine requirements, American truckers — with the assistance of right-wing media and far-right figures — are planning similar protests in Washington, D.C., against COVID-19 vaccine mandates. Organizers of the convoy are promoting it across social media, including Facebook groups, Telegram channels, and TikTok accounts, and some have appeared on Fox News and Newsmax to promote “the People’s Convoy.”

The convoy in the U.S. has been planned entirely online, and participants are expected to start from California and head toward D.C. on March 5. This convoy comes amid some Americans’ involvement -- including participation, funding, and organization -- with the Canadian “Freedom Convoy,” as truckers and their supporters have protested at the Canada-U.S. border in Michigan and Montana, shutting down highways and bridges between the two countries.

As the Canadian convoy garnered attention from right-wing media, including on Fox News, American truckers and right-wing figures started organizing similar convoys, mainly on Facebook and Telegram. Facebook user Brian Von D changed the name of his existing group “Save the Flag 2020” to “CONVOY TO DC 2022,” gathering American truckers on social media, including Jeremy Johnson, Mike Landis, and Brian Brase, who have become the “faces” of the convoy. Facebook removed the group for “repeatedly violating our policies around QAnon,” and Von D claims he is no longer affiliated with the convoy, but by then the group’s other organizers and spokespeople had gained enough of a following to merge with similar groups and relaunch across social media platforms as “The People’s Convoy.”

The figures are now organizing on Facebook and TikTok, both of which have been choice platforms for anti-vaccine mandate organizing (despite their COVID-19 and vaccine misinformation policies). They are also using Telegram, which has become a home for right-wing extremists and misinformers. Far-right figures such as Gab CEO Andrew Torba, Stop the Steal organizer Ali Alexander, and a local chapter of the far-right gang Proud Boys have also promoted efforts to organize the convoy.

Fox News Hosted The Organizers Four Times To Promote Their Convoy

In addition to their favorable coverage of the Canadian convoy, right-wing media, especially Fox News, have amplified efforts to organize the U.S. convoy. Fox has hosted Johnson, Landis, and Brase, with Brase appearing as a guest four times across the Fox News shows Hannity, Fox & Friends, and Tucker Carlson Tonight.


Organizers Are Using Facebook To Gain Support For The Convoy

On January 27, the Facebook group “Save the Flag 2020” changed its name to “CONVOY TO DC 2022” and exploded in membership, garnering over 130,000 members before the group was shut down on February 1. According to Facebook, it removed the page “for repeatedly violating our policies around QAnon.” But Johnson, Brase, and Landis appeared on Fox and laughed off the suggestion that the group’s suspension had something to do with QAnon discussion.

Although Facebook removed the “CONVOY TO DC 2022” group, Media Matters found 20 other groups dedicated to the convoy with over 200,000 combined members. One of these groups, with almost 24,000 members, is run by Von D, even though he’s claimed he is no longer a part of the convoy.

Organizers Are Using Telegram and TikTok To Promote “The People’s Convoy”

Two main organizers, Brase and Kam Johnston Tonn, have merged their efforts to launch the newly rebranded “The People’s Convoy.” Brase, a trucker with a following on TikTok, was supposedly brought on as one of “the faces of the movement” because such figures are “well-known [in] the trucking industry,” while Tonn has created her own convoy Facebook page and an associated Telegram group under the same “The People’s Convoy” brand.

Brase has also used his TikTok account, which has over 25,000 followers, to promote the convoy. This has prompted other users to further disseminate information about it — adding to the many TikToks in support of the Canadian convoy and Ottawa protests already on the platform.

Telegram, a popular messaging platform notorious for its minimal content moderation policies, has quickly become another venue for organizing the convoy groups. Media Matters found at least 15 Telegram groups and channels promoting American convoys to the capital with a combined 250,000 members, with the largest group containing over 80,000 members.

“The People’s Convoy” follows a similar playbook of organizing on Telegram as used by far-right campaigns over the past year. The group has a centralized national channel and has also promoted state-specific group chats in an attempt to encourage local organizing and calls to action. This tactic is reminiscent of QAnon influencer-led campaigns to coordinate election “audits” and canvasses, anti-mandate lawsuits, and school board protests.

Far-Right Figures Are Promoting “The People’s Convoy”

Thanks to easy cross-promotion on Telegram, a variety of far-right users have quickly joined the convoy channels. Media Matters research has found that several white nationalists, QAnon followers, members of the Proud Boys and Three Percenters, and other self-identified militia members have joined many of the popular Telegram convoy group chats.

Sample of Three Percenter, QAnon, Proud Boy, militia, and white supremacist members of Telegram convoy groups

Far-right influencers beyond Facebook and Telegram have likewise promoted the convoy effort. Conspiracy theorists including Stop the Steal leader Ali Alexander, Arizona state Sen. Wendy Rogers, Students for Trump founder Ryan Fournier, and far-right podcaster Stew Peters have all endorsed the convoy. Real America’s Voice commentator Drew Hernandez directed followers to join a specific Telegram group, and far-right influencer Jack Posobiec amplified it by retweeting it to his 1.6 million followers. Gab CEO Andrew Torba, known for his antisemitic remarks and embrace of QAnon users and white nationalists, actively encouraged users to join Gab-specific convoy groups on his own fringe social media platform. Posts on the QAnon forum GreatAwakening.win and the right-wing Twitter alternative GETTR have likewise directed users to specific convoy groups.

Gab newsletter promoting American and Canadian convoy groups and accounts on Gab

Even former President Donald Trump attempted to capitalize on the Facebook group’s initial suspension to advertise his yet-unreleased social media app, Truth Social, in a statement praising “American Truckers.” But the app will not even be available prior to the convoy dates in March due to recent launch delays.

Reprinted with permission from Media Matters


Why Social Media Is To Blame For The Spread Of Covid Lies

In 2021, social media companies failed to address the problem of dangerous COVID-19 lies and anti-vaccine content spreading on their platforms, despite the significant harm it caused users. Along with enabling this content to spread, some platforms profited from the dangerous misinformation, all while making hollow promises that prioritized positive news coverage over true accountability.

Many platforms instituted toothless moderation policies while letting propaganda encouraging distrust of the vaccine, science, and public health institutions run rampant. Media Matters researchers easily found content promoting dangerous fake cures for COVID-19, conspiracy theories about the virus’s origins and the safety of vaccines, and more on major social media networks throughout the year. Some platforms profited from this content, while others helped anti-vaccine influencers gain followings and monetize their misinformation in other ways.

This abundance of low quality or misleading information was not inevitable. The features that have come to define social media platforms — features that facilitate monetization, promote rapid content sharing, and encourage user engagement — accelerated and fostered misinformation about COVID-19 and the vaccines. And this misinformation has resulted in real and irreversible harms, like patients dying from COVID-19 yet still refusing to believe they have the illness. Social media companies could have taken action to mitigate the issues brought on by their platforms, but they did not, despite repeated warnings and demands for change.

Facebook

Throughout 2021, Facebook (now Meta) repeatedly failed to control the spread of egregious vaccine misinformation and other harmful COVID-19 lies, which were prevalent in the platform’s pages, public and private groups, comments, and ads.

Public pages remained a bastion of anti-vaccine misinformation

As shown in multiple previous Media Matters reports, right-leaning pages on Facebook earn more interactions than ideologically nonaligned or left-leaning pages, despite conservatives’ claims of censorship. In fact, right-leaning pages earned roughly 4.7 billion interactions on their posts between January 1 and September 21, while left-leaning and ideologically nonaligned pages earned about 2 billion and 3 billion, respectively.

In the past year, right-wing pages shared vaccine misinformation with little moderation or consequence from Facebook. Even when the pages were flagged or fact-checked, users found ways around Facebook’s Band-Aid solutions to continue pushing dangerous medical misinformation.

Right-wing figures such as Fox host Tucker Carlson and Pennsylvania state Sen. Doug Mastriano have used the platform to push anti-vaccine talking points and/or lie about the origins of the coronavirus.

In fact, vaccine-related posts from political pages this year were dominated by right-wing content. Right-leaning pages earned a total of over 116 million interactions on vaccine-related posts between January 1 and December 15, accounting for 6 out of the top 10 posts. Posts from right-leaning pages that dominated the vaccine discussions on Facebook included:

Third post in the top 10, with about 300,000 interactions:

A meme from the Hodgetwins reading "the protected need to be protected from the unprotected by forcing the unprotected to use the protection that didn't protect the protected"

Fifth post in the top 10, with over 266,000 interactions:

Image of a Turning Point USA meme reading "food trucks should start parking outside of restaurants that require covid-19 vax cards"

Private and public groups sowed some of the most dangerous discourse

Groups on Facebook were also rife with harmful COVID-19 lies -- including dismissing the severity of COVID-19, promoting dangerous alternative treatments, and sharing baseless claims about the vaccine. In August, Media Matters reported on Facebook groups promoting the use of ivermectin as a prophylactic or treatment for COVID-19, even as government officials warned against it. As of the end of September, there were still 39 active ivermectin groups with over 68,000 members.

Media Matters has repeatedly identified anti-mask, anti-vaccine, and other similar groups dedicated to spreading COVID-19 and vaccine misinformation. Yet Facebook has failed to remove these groups, even though they appear to violate the platform’s policies.

In October, we identified 918 active groups that were dedicated to promoting COVID-19 and vaccine misinformation, with over 2 million combined members. These included groups discussing misleading and false stories of vaccine side effects and conspiracy theories about what is in the vaccine. We also recently identified at least 860 “parental rights” groups dedicated to opposing school policies around LGBTQ rights, sex education, so-called “critical race theory,” and other culture war issues — including at least 180 groups that promote explicit COVID-19, mask, or vaccine misinformation.

Comment sections continued to be a toxic part of Facebook, especially as users found ways to use them to evade Facebook’s ineffectual fact-checking and moderation efforts. Group administrators encouraged this behavior, asking members to put more extreme content in the comments and to use code words instead of “vaccine” or “COVID” to thwart moderation.

What’s worse, Facebook reportedly knew COVID-19 and vaccine misinformation was spreading in its comment sections and did little to prevent it.

Facebook continued to enjoy increased profits as misinformation spread on its platform

During all of this, Facebook has enjoyed increased profits -- including from ads promoting fringe platforms and pages that push vaccine misinformation. Media Matters found that Facebook was one of the top companies helping COVID-19 misinformation stay in business, and that it was taking a cut itself. Even after a federal complaint was filed against a fake COVID-19 cure circulating on Facebook (and Instagram), the platform -- against its own policy -- let it run rampant, generating profit.

Throughout 2021, Media Matters has followed how Facebook has enabled the spread of harmful COVID-19 lies, extremism, and more.

Instagram

Though often overshadowed by Facebook, Instagram — which is also owned by Meta — has similarly established itself as a conduit for dangerous lies, hate, and misinformation.

In 2021, there was no better example of Instagram’s shortcomings than its inability to stop the spread of anti-vaccine misinformation -- despite Instagram head Adam Mosseri’s persistent claims that the company takes vaccine-related misinformation “off the platform entirely.”

Insufficient moderation and consistent ban evasion by misinformers

In March, Media Matters found that despite Instagram’s ban on anti-vaccine content, anti-vaccine influencers earned tens of thousands of interactions by falsely claiming that the newly available COVID-19 vaccines were “dangerous,” and in some cases by claiming the shot was killing thousands of people.

A month later, Instagram removed several of the anti-vaccine accounts highlighted in our research, including several members of the so-called “Disinformation Dozen,” influencers the Center for Countering Digital Hate identified as the originators of an estimated 65 percent of vaccine misinformation spread on Facebook and Twitter between February 1 and March 16 of this year. Within days of the accounts’ removal, many of them were back on the platform, using ban evasion tactics.

Today you can still find accounts associated with seven members of the Disinformation Dozen and scores of similarly inclined influencers active on the platform. Practically speaking, not much has changed, despite Instagram’s ban on anti-vaccine content.

Instagram’s recommendation algorithm pushes users down anti-vaccine rabbit holes

In addition to allowing violative content to flourish, the platform’s algorithms also push users down anti-vaccine and health misinformation rabbit holes. In October, a Media Matters study found that Instagram’s suggested-content algorithm was actively promoting anti-vaccine accounts to users who demonstrated an interest in such content.

Similarly, the Center for Countering Digital Hate found that Instagram’s “Explore” page not only funneled users toward anti-vaccine posts, but also led them to other extreme content espousing the QAnon conspiracy theory and antisemitism, for instance.

Others who engaged with the platform have stumbled upon the same phenomenon: If a user demonstrates interest in extreme content, the algorithm feeds them more of it.

Instagram’s monetization features present unique dangers

As the company expands its e-commerce ambitions, bad actors are already abusing the platform's monetization features to finance dangerous propaganda. Instagram Shopping, which debuted in 2020, is filled with anti-vaccine merchandise. Pro-Trump businessman Mike Lindell and right-wing agitator Jack Posobiec teamed up to use the platform’s new link sticker feature — which allows users to link directly to external websites — to finance their crusade to undermine faith in American democracy.

Again and again, Instagram commits to addressing harmful content on its platform, but either fails to do so effectively or waits until it is far too late.

TikTok

In 2021, TikTok was used as an anti-mask organizing space and a launching pad for COVID-19 and vaccine misinformation. While the policies TikTok designed in response to the pandemic were strong on paper because they specifically addressed combating medical misinformation, the company has failed to meaningfully enforce them.

TikTok fails to proactively moderate dangerous medical misinformation

A large part of TikTok’s misinformation crisis comes from its moderation practices, which appear to be largely reactive. Although the company has removed some COVID-19 misinformation when highlighted by researchers or journalists, it has fundamentally failed to meaningfully preempt, detect, and curb health misinformation narratives before they go viral.

There is no excuse for a multibillion-dollar company behind the most downloaded social media app to have such insufficient moderation practices, especially when medical misinformation can seriously harm its users.

TikTok’s recommendation algorithm fed users COVID-19 and vaccine misinformation

Not only did TikTok fail to stop the spread of dangerous misinformation, but the company’s own recommendation algorithm also helped propel COVID-19 and vaccine falsehoods into virality -- hand-delivering harmful medical misinformation to unsuspecting users.

TikTok’s major appeal is its “For You” page (FYP), a personalized feed of videos individually tailored to each user. When COVID-19 misinformation goes viral, it’s often because TikTok’s algorithm feeds users this content on their FYP. Media Matters identified multiple instances of TikTok's own algorithm amplifying COVID-19 and vaccine misinformation. In our study, 18 videos containing COVID-19 misinformation — which at the time of the study had garnered over 57 million views — were fed to a Media Matters research account's FYP.

TikTok’s unregulated conspiracy theory problem creates a gateway to medical misinformation

The spread of conspiracy theories and other misleading content on TikTok has created a pipeline from other false or harmful content to medical misinformation. Vaccine skepticism is tied to belief in conspiracy theories, which have long proliferated on the platform. Media Matters identified repeated circulation of videos from Infowars, a far-right conspiratorial media outlet, including those in which Infowars founder Alex Jones spreads COVID-19 misinformation.

Media Matters also found evidence of a gateway between conspiracy theory accounts and the spread of COVID-19 misinformation, as well as content promoting other far-right ideologies. In one instance, we followed a flat earth conspiracy theory account, and TikTok’s account recommendation algorithm prompted us to follow an account pushing COVID-19 misinformation.

YouTube

In 2020, Media Matters documented YouTube’s repeated failure to enforce its own policies about COVID-19 misinformation. In 2021, the platform continued to allow this type of content to spread, despite its announcement of an expanded medical misinformation policy.

In September, well over a year into the pandemic, YouTube finally updated its policies around vaccine-related misinformation. However, these changes came too late, after videos such as the Planet Lockdown series had collected at least 2.7 million views on the platform. In the months following the policy expansion, YouTube’s enforcement of the new rules also proved far too limited.

YouTube has failed to enforce its guidelines since early in the pandemic

Prior to the policy updates in September, Media Matters documented YouTube’s failure to sufficiently enforce its existing guidelines around COVID-19 misinformation. For example, the platform allowed right-wing commentator Charlie Kirk to baselessly speculate that 1.2 million people could have died from the COVID-19 vaccine. The platform also failed to remove numerous videos promoting deceptive claims about the use of ivermectin to treat COVID-19. (Since the original publication of the linked article, YouTube has removed three of the videos. The rest remain on the platform.) YouTube also hosted a two-hour live event featuring prominent anti-vaccine figures such as Robert F. Kennedy Jr.

Even after its September 2021 policy expansion, YouTube still fell short

After months of letting anti-vaccine and COVID lies flourish, YouTube announced in a blog post that it was expanding its policies because it was seeing “false claims about the coronavirus vaccines spill over into misinformation about vaccines in general.”

The blog stated that the platform would prohibit “content that falsely alleges that approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of disease,” and content that “contains misinformation on the substances contained in vaccines.”

After renewing its commitment to combating medical misinformation, the Alphabet Inc.-owned platform enjoyed a wave of mostly positive press. However, less than a week after this announcement was made, Media Matters uncovered numerous instances where enforcement of these new policies was falling short.

Despite banning individual accounts, YouTube allowed prominent anti-vaccine figures featured among the Disinformation Dozen to continue to spread misinformation on the platform. Recently, several of the videos were finally removed, but only after accumulating more than 4.9 million views.

Media Matters also found that YouTube allowed numerous videos promoting ivermectin to remain on the site following the new policy debut, and also permitted advertisements for the drug, some of which promoted it as an antiviral for human use.

Additionally, we identified a YouTube video from right-wing group Project Veritas claiming to show a “whistleblower” exposing harms caused by the COVID-19 vaccine. The video, which provides no real evidence or context, accumulated millions of views despite violating YouTube’s updated guidelines.

In 2021, YouTube has repeatedly failed to enforce its own policies. In addition to hosting ample misinformation about COVID-19 and vaccines, the platform has profited from recruitment videos for a militia that has been linked to violence and election fraud lies. It has allowed right-wing propaganda network PragerU to fundraise while spreading transphobia, and is still falling short on its promise to crack down on QAnon content.

Methodology

Media Matters used the following method to compile and analyze vaccine-related posts from political pages on Facebook:

Using CrowdTangle, Media Matters compiled a list of 1,773 Facebook pages that frequently posted about U.S. politics from January 1 to August 25, 2020.

For an explanation of how we compiled pages and identified them as right-leaning, left-leaning, or ideologically nonaligned, see the methodology here.

The resulting list consisted of 771 right-leaning pages, 497 ideologically nonaligned pages, and 505 left-leaning pages.

Every day, Media Matters also uses Facebook's CrowdTangle tool and this methodology to identify and share the 10 posts with the most interactions from top political and news-related Facebook pages.

Using CrowdTangle, Media Matters compiled all posts for the pages on this list (with the exception of UNICEF – a page that Facebook boosts) that were posted between January 1 and December 15, 2021, and were related to vaccines. We reviewed data for these posts, including total interactions (reactions, comments, and shares).

We defined posts as related to vaccines if they had any of the following terms in the message or in the included link, article headline, or article description: “vaccine,” “anti-vaccine,” “vaxx,” “vaxxed,” “anti-vaxxed,” “Moderna,” “Pfizer,” “against vaccines,” “pro-vaccines,” “support vaccines,” “vax,” “vaxed,” “anti-vax,” “pro-vaccine,” “pro-vaxx,” or “pro-vax.”
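
To make the filter concrete, below is a minimal Python sketch of the keyword-matching and interaction-counting steps described above. It is an illustration, not Media Matters' actual pipeline: it assumes posts were exported from CrowdTangle to a CSV file, and the column names ("Message," "Link," "Link Text," "Description," "Total Interactions") and the simple substring-matching rule are hypothetical stand-ins for whatever the real export and matching logic used.

# Sketch of the vaccine-related post filter described in the methodology.
# Column names and substring matching are illustrative assumptions, not
# the actual Media Matters code or the actual CrowdTangle export schema.
import csv

VACCINE_TERMS = [
    "vaccine", "anti-vaccine", "vaxx", "vaxxed", "anti-vaxxed",
    "moderna", "pfizer", "against vaccines", "pro-vaccines",
    "support vaccines", "vax", "vaxed", "anti-vax", "pro-vaccine",
    "pro-vaxx", "pro-vax",
]

def is_vaccine_related(post: dict) -> bool:
    # Check the post message plus the linked article's URL, headline,
    # and description for any of the listed terms, case-insensitively.
    text = " ".join(
        (post.get(field) or "")
        for field in ("Message", "Link", "Link Text", "Description")
    ).lower()
    return any(term in text for term in VACCINE_TERMS)

def total_vaccine_interactions(csv_path: str) -> int:
    # Sum total interactions (reactions + comments + shares, as reported
    # by CrowdTangle) across every vaccine-related post in the export.
    total = 0
    with open(csv_path, newline="", encoding="utf-8") as f:
        for post in csv.DictReader(f):
            if is_vaccine_related(post):
                raw = (post.get("Total Interactions") or "0").replace(",", "")
                total += int(raw)
    return total

if __name__ == "__main__":
    # Hypothetical filename for one ideological page list's 2021 posts.
    print(total_vaccine_interactions("right_leaning_pages_2021.csv"))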

Article reprinted with permission from Media Matters

Facebook Groups Worldwide Pushing Livestock Medications For Covid-19


Reprinted with permission from Media Matters

Facebook is allowing groups on its platform to promote the use and sale of ivermectin -- a drug typically prescribed to fight parasites in humans and large animals -- to prevent and treat COVID-19, even though the social media company claims that it removes such content as part of its policy against medical misinformation.

In the last week, the Food and Drug Administration and the Centers for Disease Control and Prevention both warned against unapproved use of ivermectin to treat or prevent COVID-19, after increased reports of patients harmed by self-medicating with ivermectin intended for horses. The drug is used to treat intestinal conditions caused by parasitic worms in both animals and people, but the large doses typically prescribed for veterinary use are dangerous for humans.

Despite these warnings -- as well as Facebook's own policy against promoting the purchase, sale, or other misinformation about ivermectin -- users on the platform are sharing ways to use ivermectin for COVID-19, with some even recommending methods for other users to acquire the drug. In fact, Media Matters has found 47 active Facebook groups with nearly 65,000 combined members centered around ivermectin and its use for COVID-19. The majority of these groups were created in 2021, and they're based around the world, including in the United States, South Africa, Malaysia, Australia, and the United Kingdom.

Facebook has taken little action against these groups, despite other reporting on violative content about ivermectin on the platform. At the time of publication, Facebook had taken down one public group, "The People's Medicine: Ivermectin; Safe Effective Economical (S E E)," which had already garnered roughly 17,000 members, and some posts promoting the use of ivermectin have been flagged with a banner warning users that "unapproved COVID-19 treatments may cause serious harm." Upon clicking on the banner, users are redirected to Facebook's COVID-19 Information Center, but they receive no other immediate information on the drug.

Dozens of other ivermectin-focused groups are still active and promoting violative content on Facebook. Group members frequently ask where to acquire a prescription for ivermectin and for information on dosage and drug combinations, and other members point them to fringe outlets such as America's Frontline Doctors or veterinarian supply stores.

In one private group -- IVERMECTIN MD TEAM -- over 27,000 members have access to this harmful misinformation. Facebook has often struggled to properly enforce its policies against COVID-19 misinformation, particularly within private Facebook groups, which can be more difficult for the platform to moderate.

[Image of a Facebook post]

Other pro-ivermectin Facebook groups are spreading similar misinformation on the platform.

[Images of Facebook posts from pro-ivermectin groups]

In addition to ivermectin-specific groups, other anti-vaccination and pro-Trump private Facebook groups are exchanging information on where to buy the drug and how to dose it, and sharing testimonials.

[Images of Facebook posts]

The unchecked promotion of yet another unproven treatment for COVID-19 more than a year after the disease first emerged -- particularly given the effectiveness of vaccines developed specifically to fight it -- highlights Facebook's continued failure to protect its users from dangerous medical misinformation in the midst of a deadly pandemic.