The National Memo

Smart. Sharp. Funny. Fearless.

Monday, December 09, 2019


Facebook Lets Russia And China Promote Ukraine ‘Bioweapons’ Lie

On Facebook, Russia, its ally China, and right-wing media outlets and personalities are pushing pro-invasion propaganda and disinformation, including the latest false theory that Russia’s invasion of Ukraine was undertaken partly to target U.S.-linked labs working to create bioweapons.

Facebook and other social media companies have taken some actions to restrict Russia’s ability to spread disinformation about the invasion, which includes blocking Russian state media from advertising on the platform. But Facebook is allowing Russian government accounts and state media accounts to remain on the platform and push propaganda with organic posts, while also allowing Russia’s ally China to monetize this propaganda via ads.

Since Russia invaded Ukraine on February 24, RT — one of Russia’s state-controlled media outlets — has posted at least 12 times pushing the conspiracy theory or related claims, according to data compiled from CrowdTangle. These posts have earned at least 30,000 interactions. Notably, four of the posts are videos that were added on March 8 and 9 falsely claiming U.S. Under Secretary of State for Political Affairs Victoria Nuland confirmed that the U.S. was developing bioweapons in Ukraine. These videos earned nearly 200,000 views and only one of the videos has a label noting that there is “missing context.”

[Screenshots: RT Facebook posts and videos, March 8 and 9, 2022]

Beyond the videos, there is only one other post from RT about the labs that is labeled as “missing context.”

[Screenshot: RT Facebook post, March 6, 2022]

The other posts from RT that push the conspiracy theory include:

[Screenshots: RT Facebook posts pushing the biolabs conspiracy theory]

Russia’s propaganda efforts have been bolstered by its ally China and right-wing media in the U.S. also pushing the conspiracy theory. Media Matters has already reported on Meta, Facebook’s parent company, earning revenue on ads promoting the conspiracy theory, including one that was run by Chinese state-controlled media. Since our initial report, at least two additional ads from Chinese state-controlled media have pushed conspiracy theories about U.S.-Ukraine biolabs.

Since the invasion, right-leaning Facebook pages have posted 336 times about the conspiracy theory and have earned almost 500,000 interactions on these posts. In fact, right-leaning pages account for nearly 60% of posts about the conspiracy theory that were posted by politics and news Facebook pages, and they have earned nearly 60% of total interactions.

Many of these posts are from right-wing media outlets and personalities, including six of the 10 posts with the most interactions. The post from right-leaning pages with the most interactions is Fox host Tucker Carlson’s post with video of him pushing the theory on his show. (Carlson has pushed the theory on his show multiple times.) His Facebook video has over 310,000 views.

[Screenshot: Tucker Carlson Tonight Facebook post, March 10, 2022]

Facebook boasts that it has taken action to curb Russian propaganda about the invasion of Ukraine, but the platform is still allowing that propaganda to thrive, including in ads that generate revenue for Facebook and other parties.

Methodology

Using CrowdTangle, Media Matters compiled a list of 1,773 Facebook pages that frequently posted about U.S. politics from January 1 to August 25, 2020.

For an explanation of how we compiled pages and identified them as right-leaning, left-leaning, or ideologically nonaligned, see the methodology here.

The resulting list consisted of 771 right-leaning pages, 497 ideologically nonaligned pages, and 505 left-leaning pages.

Every day, Media Matters also uses Facebook's CrowdTangle tool and this methodology to identify and share the 10 posts with the most interactions from top political and news-related Facebook pages.

Using CrowdTangle, Media Matters compiled all posts for the pages on this list that were posted from February 24 through March 20, 2022 and were related to the Ukraine-US biolabs conspiracy theory. We reviewed data for these posts, including total interactions (reactions, comments, and shares). One post that was a false positive was removed from the final dataset.

We defined posts as related to the Ukraine-U.S. biolabs conspiracy theory if they had any of the following terms in the message or in the included link, article headline, or article description: “Ukraine bio-lab,” “Ukraine bio lab,” “Ukraine biolab,” “Ukrainian bio-lab,” “Ukrainian bio lab,” “Ukrainian biolab,” “bio-lab in Ukraine,” “bio lab in Ukraine,” “biolab in Ukraine,” “Central Reference Laboratory,” “level-3 bio-safety lab,” “Nunn-Lugar Cooperative Threat Reduction program,” “Odessa-based laboratory,” “biowarfare lab,” “Victoria Nuland,” “biological research facilities,” “bio-lab,” “biolab,” “bio lab,” “biolabs,” “bio-labs,” “bio labs,” “biological lab,” “biological labs,” “biological research,” “bio-safety,” “biosafety,” “biowarfare,” “bioweapons,” “bio-warfare,” or “bio-weapons.”
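A keyword filter like the one described above can be sketched in a few lines of code. The sketch below is hypothetical: Media Matters has not published its actual matching logic, so it assumes simple case-insensitive substring matching across a post's text fields, and the field names and helper function are invented for illustration.

```python
# Hypothetical sketch of the keyword filter described above.
# Assumes case-insensitive substring matching; a small sample of the
# full term list is shown.
BIOLAB_TERMS = [
    "ukraine bio-lab", "ukraine bio lab", "ukraine biolab",
    "victoria nuland", "biological research facilities",
    "bio-lab", "biolab", "bio lab", "bioweapons", "biowarfare",
]

def is_biolab_related(post: dict) -> bool:
    """Return True if any search term appears in the post's message,
    link, headline, or description (all assumed field names)."""
    fields = ("message", "link", "headline", "description")
    text = " ".join(post.get(f, "") for f in fields).lower()
    return any(term in text for term in BIOLAB_TERMS)

# Example: filter a batch of posts down to conspiracy-theory-related ones.
posts = [
    {"message": "Victoria Nuland testified today", "link": ""},
    {"message": "Weather update for Kyiv", "link": ""},
]
matches = [p for p in posts if is_biolab_related(p)]
```

A filter this naive also produces false positives, which is consistent with the note above that one false positive had to be removed from the final dataset by hand.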

Reprinted with permission from Media Matters

Senate Commerce Chair Asks FTC To Probe Facebook's Suspected Deceptions

By David Shepardson

WASHINGTON (Reuters) - The chair of the U.S. Senate Commerce Committee on Wednesday asked a regulator to investigate whether Meta Platforms' Facebook misled its advertising customers and the public about the reach of its advertisements.

In a letter to Federal Trade Commission (FTC) Chair Lina Khan seen by Reuters, Senator Maria Cantwell said "evidence suggests that Facebook may have deceived its advertising customers about its brand safety and advertising metrics" and "may have engaged in deceptive practices."

Meta and the FTC did not immediately comment.

Cantwell added that "public information suggests that Facebook’s potential misrepresentations about brand safety and advertising metrics may be unfair, as well as deceptive."

She said "a thorough investigation by the Commission and other enforcement agencies is paramount, not only because Facebook and its executives may have violated federal law, but because members of the public and businesses are entitled to know the facts regarding Facebook’s conduct."

Cantwell cited a 2020 Senate report that found Facebook controlled approximately 74 percent of the social media market.

In October, Senator Richard Blumenthal said both the Securities and Exchange Commission and the FTC should investigate claims made by a Facebook whistleblower that the company knew its apps were harming the mental health of some young users.

The FTC has filed an antitrust lawsuit against Facebook asking a court to order the company to sell two big subsidiaries.

The FTC's case against Facebook represents one of the biggest challenges the government has brought against a tech company in decades, and is being closely watched as Washington aims to tackle Big Tech's extensive market power.

The FTC originally sued Facebook during the Trump administration, and its complaint was rejected by the court. It filed an amended complaint in August, which Facebook has asked the court to toss out.

(Reporting by David Shepardson and Chris Sanders; editing by Diane Craft and Grant McCool)

Carlson And Fox Go Anti-Vax Wacko With Conspiracy Theorist RFK Jr.

Reprinted with permission from Media Matters

Fox host Tucker Carlson hosted anti-vaxxer and conspiracy theorist Robert F. Kennedy Jr. for a fawning interview on his Fox Nation show Tucker Carlson Today. Kennedy is a well-known figure in anti-vaccine circles and one of the most prominent backers of baseless conspiracy theories attempting to link conditions such as autism to vaccines.

During the November 15 interview, Carlson urged viewers to purchase Kennedy's latest book, a screed accusing Dr. Anthony Fauci of intentionally bungling the pandemic, killing alternatives to the vaccine, and launching an assault on the First Amendment in order to silence critics. Kennedy walked Carlson's audience through a grab bag of his most notorious conspiracy theories, at one point asserting his belief that vaccines had to be one of the "key suspects" behind the rise in cases of autism.


Kennedy suggested that more than 17,000 Americans have died from receiving the COVID-19 vaccine, a claim based on a misuse and misunderstanding of U.S. Vaccine Adverse Event Reporting System data that Carlson has repeated on his prime-time show.


Google, Amazon, Facebook Still Profit From Covid-19 Disinformation Sites


Reprinted with permission from Media Matters

Media Matters found that Google, Amazon, and Facebook own some of the most popular trackers present on a recently published list of websites devoted to COVID-19 misinformation. That means trackers from Facebook, Amazon, and Google are aiding these websites -- whether by placing or running ads, retargeting visitors, or providing visitor intelligence and behavior analytics -- and ultimately helping them reach an audience and make money.

In August, the Atlantic Council's Digital Forensics Research Lab (DFR Lab) published a list of websites devoted to pushing COVID-19 misinformation. We used a tool to find the tracking software on these sites, yielding a list of companies including Google, Amazon, and Facebook. While not all trackers Media Matters detected monetize or collect data used to target readers, many of them do.

What Are Trackers?

Trackers are bits of code or script placed on a website that convey information to the site owner or a third party.

These trackers "link information about you from different sites, in order to build a profile, based on your browsing history." This data can then be sold or utilized by different parties to target specific information or products to users, such as with targeted ads or political campaigns. According to Forbes:

The rationale is simple: knowing what you click on and where you go informs ad networks about your needs and desires. When they know what you want, they can place ads in your path for those products or services.

That sounds fairly innocuous, and it can be, but the problem is that at scale — and on the open data market — you now have hundreds of virtual avatars in systems that are not under your control. They're profiles that match you to varying degrees: age, location, ethnicity, interests, and potentially much more personal information.

All trackers ultimately allow an entity to try to influence users, and this happens without users' knowledge or consent. Not all trackers specifically collect browser histories and build profiles, but they do help a website reach more people and/or make money. If companies such as Google, Facebook, and Amazon ceased to serve these websites with their trackers, the websites would reach far fewer people, which could disincentivize misinformation-for-profit operations.
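At its simplest, detecting a tracker on a page means looking for resources loaded from domains associated with known tracking companies. The sketch below is a hypothetical illustration of that idea; real tools such as Ghostery match far richer fingerprints than bare domain names, and the small domain list here is only an illustrative sample.

```python
# Hypothetical sketch: detect trackers by scanning a page's HTML for
# src= URLs served from domains associated with known trackers.
# The domain list is a small illustrative sample, not a real database.
import re

TRACKER_DOMAINS = {
    "doubleclick.net": "Google's DoubleClick",
    "google-analytics.com": "Google Analytics",
    "connect.facebook.net": "Facebook",
    "amazon-adsystem.com": "Amazon",
}

def find_trackers(html: str) -> set:
    """Return the names of trackers whose domains appear in src= URLs."""
    found = set()
    for src in re.findall(r'src="([^"]+)"', html):
        for domain, name in TRACKER_DOMAINS.items():
            if domain in src:
                found.add(name)
    return found

# Example: a page loading a Google Analytics script would be flagged.
page = '<script src="https://www.google-analytics.com/analytics.js"></script>'
```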

Trackers Found On COVID-19 Misinformation Websites

Media Matters analyzed the COVID-19 misinformation websites identified by DFR Lab, using a Ghostery-based tool that detects "predefined fingerprints of known web tracking technologies," and found that the most common trackers used were owned by Google (which also owns DoubleClick), Amazon, Gravatar, Switch Concepts, WordPress Stats, and Facebook. We found 53 distinct trackers on 91 of the 146 websites listed by DFR Lab. Of these, we identified 448 total instances of a tracker appearing on a website, or an average of nearly five per website.

Amazon Associates, which provides services such as advertising and data collection that help target audiences via behavioral analysis, appeared on 11 websites. Gravatar, a membership widget that allows users to engage one another, appeared on 36 websites. Facebook and Twitter widgets, which integrate with the social media platforms, appeared on 19 and five websites, respectively. Google and the companies it owns accounted for a whopping 161 instances, or 36 percent of all trackers found.

The top five companies providing trackers to COVID-19 misinformation websites include:

Tracker company      Trackers detected
Google               161
Gravatar             44
Switch Concepts      27
Facebook             25
WordPress Stats      22

Over 50 percent of the tracker instances appearing on these websites were ad trackers. Among the 254 instances of ad software we found, Google's company DoubleClick accounted for 86, and Switch Concepts accounted for 27. The Facebook Custom Audience advertising tracker appeared on six websites, which is especially troubling because of the precision with which Facebook may be used to target audiences.

The following table depicts the ad trackers and the number of times we detected them:

Ad tracker                 Tracker count
Google's DoubleClick       86
Switch Concepts            27
Criteo                     18
AppNexus                   12
Amazon Associates          11
Rubicon                    10
PubMatic                   10
OpenX                      10
BidSwitch                  10
Advertising.com            10
Facebook Custom Audience   6
Twitter Advertising        2
Other                      42

Critically, a tracker's presence on a misinformation website doesn't tell us how much money each ad tracker generates. There may be multiple trackers on a site, not all of them will generate equal profit, and some may generate very little. If we instead considered what share of each misinformation website's ad revenue each vendor pays out, Google's portion would almost certainly grow.

For example, according to the Global Disinformation Index, an organization that researches how mis- and disinformation are monetized, 77 percent of the profit from ads listed on a group of nearly 500 COVID-19 disinformation websites came from Google or a company owned by Google. This means Google may have paid as much as $19.2 million of the $25 million potentially earned by GDI's list of COVID-19 misinformation sites in 2020. (OpenX and Amazon paid out the second and third largest shares of revenues in the Global Disinformation Index report.) One website in the DFR list simply redirected to an Amazon listing rather than using an Amazon Associates tracker, which is an example of another way mis- and disinformation sites monetize and spread false claims.

Recently, NewsGuard, maker of a media literacy browser extension that provides trust ratings for users browsing the internet, published a special report titled Advertising on Misinformation, which explored how misinformation websites generate a substantial profit selling ad space. The report stated that "$2.6 billion in estimated advertising revenue [are] being sent to publishers of misinformation and disinformation each year by programmatic advertisers, including hundreds of millions in revenue supporting false health claims, anti-vaccine myths, election misinformation, partisan propaganda, and other forms of false news."

Why Exposure To Misinformation Matters

Trackers are one way companies can collect data that may help "target customized audiences, or publics, with strategic messaging across devices, channels, and contexts." A report from the Data & Society Research Institute warned that such data -- the same type collected by trackers -- can help build a digital influence machine, which can "identify and target weak points where groups and individuals are most vulnerable to strategic influence." Companies targeting internet users that also work with mis- and disinformation websites may connect the sites with a more receptive audience. And, while these companies profit, we have sadly learned, people who are exposed to COVID-19 misinformation may die preventable deaths as a result.

Facebook Permits Racist Attacks On Afghan Refugees

Reprinted with permission from Media Matters

As Afghan refugees flee the country following the Taliban takeover, xenophobic narratives are spreading widely on Facebook. Despite the platform's claims to "prohibit the use of harmful stereotypes" and to protect refugees from "the most severe attacks," racist rhetoric that seemingly violates Facebook's policy is rampant in both public and private groups.

These attacks on Afghan refugees come amid the American military's withdrawal from the country and the Taliban's rapid advance, which has resulted in a humanitarian crisis for more than half a million people displaced from their homes since January. With the United States' final withdrawal from the country completed on August 31, numbers show that "approximately 116,700 people have been airlifted out of Afghanistan" in recent weeks, many of whom allied with the United States over the previous two decades of war.

Now, as the U.S. occupation officially ends, users have taken to Facebook to promote xenophobic conspiracy theories and racist stereotypes about Afghan refugees as potential terrorists bent on harming the U.S. In reality, though much information has not been publicly released, government officials say they are conducting a thorough vetting process of refugees coming into the country from Afghanistan.

Some Facebook posts assert that terrorists will attempt to sneak in alongside Afghans seeking asylum. In "Back Boris," a public group with over 41,000 members, one user wrote, "The Taliban will definitely send some of their supporters to the West posing as refugees. They will fight us in our own country." (This post received over 1,000 reactions and more than 500 comments.) This narrative has also spread to right-wing media including Breitbart, where an article titled "Report: Up to 100 Afghans Seeking Resettlement in U.S. 'Flagged' by Terrorism Watch Lists" has received over 13,000 interactions on Facebook, according to the social media analytics tool CrowdTangle.

[Screenshot of Facebook post: "The Taliban will definitely send some of their supporters to the West posing as refugees. They will fight us in our own country."]

Other Facebook users claimed that Biden "surrendered Afghanistan to terrorists" and that only a small portion of people who were evacuated were U.S. citizens, claiming there was "NO VETTING. How many terrorists will Joe Biden bring to America?" Right-wing outlets like The Federalist have shared similar narratives which then spread on Facebook, with one such article accumulating over 1,700 interactions (reactions, comments, and shares) across both public and private posts on the platform.

[Screenshots of racist rhetoric about Afghan refugees]

Users are also leveraging xenophobic conspiracy theories to promote other misinformed right-wing narratives, especially those surrounding COVID-19 vaccinations and U.S.-Mexico border policy. And some have even threatened violence, suggesting that users should arm themselves to "defend" their communities against Afghan refugees.

[Screenshots: racist rhetoric about Afghan refugees; article about Greece building a wall]

Though the platform allows discussion of immigration policies, these consistent attacks smearing a whole population with dangerous stereotypes seemingly violate Facebook's hate speech policy, which prohibits attacks based on national origin.

Disregarding its own policies on anti-immigrant rhetoric is not new for the platform, as a 2019 study in the European Journal of Communication found:

In short, commercial platforms such as Facebook provide spaces for xenophobic, racist and nationalistic discourse online, and they shape antagonistic (Farkas et al., 2018) attitudes towards immigrants. Moreover, through their large size, they affect mainstream discourses on immigration and refugees, and contribute to a normalization of previously marginalized types of utterances, attitudes and opinions. Anti-immigration groups and publics on commercial social networking services (SNSs) also seem to amplify xenophobic and racist attitudes among their participants.

Facebook is facing no accountability for the malicious content about Afghan refugees that is circulating on its platform, once again showing the company's failure to stem the spread of misinformation, even in times of crisis.

Facebook Groups Worldwide Pushing Livestock Medications For Covid-19


Reprinted with permission from Media Matters

Facebook is allowing groups on its platform to promote the use and sale of ivermectin -- a drug typically prescribed to fight parasites in humans and large animals -- to prevent and treat COVID-19, even though the social media company claims that it removes such content as part of its policy against medical misinformation.

In the last week, the Food and Drug Administration and the Centers for Disease Control and Prevention both warned against unapproved use of ivermectin to treat or prevent COVID-19, after increased reports of patients harmed by self-medicating with ivermectin intended for horses. The drug is used to treat intestinal conditions caused by parasitic worms in both animals and people, but the large doses typically prescribed for veterinary use are dangerous for humans.

Despite these warnings -- as well as Facebook's own policy against promoting the purchase, sale, or other misinformation about ivermectin -- users on the platform are sharing ways to use ivermectin for COVID-19, with some even recommending methods for other users to acquire the drug. In fact, Media Matters has found 47 active Facebook groups with nearly 65,000 combined members centered around ivermectin and its use for COVID-19. The majority of these groups were created in 2021, and they're based around the world, including in the United States, South Africa, Malaysia, Australia, and the United Kingdom.

Facebook has taken little action against these groups, despite other reporting on violative content about ivermectin on the platform. At the time of publication, Facebook has taken down one public group, "The People's Medicine: Ivermectin; Safe Effective Economical (S E E)," that had already garnered roughly 17,000 members, and some posts promoting the use of ivermectin have been flagged with a banner warning users that "unapproved COVID-19 treatments may cause serious harm." Upon clicking on the banner, users are redirected to Facebook's COVID-19 Information Center, but they receive no other immediate information on the drug.

Dozens of other ivermectin-focused groups are still active and promoting violative content on Facebook. Group members frequently ask where to acquire a prescription for ivermectin and for information on dosage and drug combinations, and other members point them to fringe outlets such as America's Frontline Doctors or veterinarian supply stores.

In one private group -- IVERMECTIN MD TEAM -- over 27,000 members have access to this harmful misinformation. Facebook has often struggled to properly enforce its policies against COVID-19 misinformation, particularly within private Facebook groups, which can be more difficult for the platform to moderate.

[Screenshot of a Facebook post]

Other pro-ivermectin Facebook groups are spreading similar misinformation on the platform.

[Screenshots of Facebook posts from pro-ivermectin groups]

In addition to ivermectin-specific groups, other anti-vaccination and pro-Trump private Facebook groups are also exchanging information on where to buy the drug and how to dose it, and sharing testimonials.

[Screenshots of Facebook posts]

The unchecked promotion of yet another unproven treatment for COVID-19 more than a year after the disease first emerged -- particularly given the effectiveness of vaccines developed specifically to fight it -- highlights Facebook's continued failure to protect its users from dangerous medical misinformation in the midst of a deadly pandemic.

It’s Better To Rebut Masking Opponents With Science, Not Censorship

Like many Americans, I do not like wearing a face mask that hurts my ears, steams up my glasses and makes my bearded face itch. And while I think businesses should be free to require face coverings as a safeguard against COVID-19, I am skeptical of government-imposed mask mandates, especially in K-12 schools.

At the same time, I recognize that my personal peeves and policy preferences are logically distinct from the empirical question of how effective masks are at preventing virus transmission. From the beginning, however, the great American mask debate has been strongly influenced by partisan and ideological commitments, with one side exaggerating the evidence in favor of this precaution and the other side ignoring or downplaying it.

Last September, Robert Redfield, then the director of the Centers for Disease Control and Prevention, described masks as "the most important, powerful public health tool we have," going so far as to say they provided more protection than vaccines would. In a 2020 New York Times op-ed piece, Michigan Gov. Gretchen Whitmer asserted that "wearing a mask has been proven to reduce the chance of spreading COVID-19 by about 70%" — a claim that even the CDC said was not scientifically justified.

The CDC invited skepticism about the value of general mask wearing by dismissing it until April 2020, when the agency suddenly began recommending the practice as an important weapon against the pandemic. Although that memorable reversal supposedly was justified by evolving science, the main concern that the CDC cited — asymptomatic transmission — was a danger that had been recognized for months.

When the CDC changed its advice, research on the effectiveness of face masks in preventing virus transmission was surprisingly sparse and equivocal. Although laboratory experiments supported the commonsensical assumption that almost any barrier to respiratory droplets, including DIY cloth coverings, was better than nothing, randomized controlled trials generally had not confirmed that intuition.

A January 2021 review of the evidence in the journal Proceedings of the National Academy of Sciences found "no RCT for the impact of masks on community transmission of any respiratory infection in a pandemic." The article, which also looked at observational studies, said "direct evidence of the efficacy of mask use is supportive, but inconclusive."

The authors then considered "a wider body of evidence," including epidemiological analyses, laboratory studies and information about COVID-19's transmission characteristics. "The preponderance of evidence," they concluded, "indicates that mask wearing reduces transmissibility per contact by reducing transmission of infected respiratory particles in both laboratory and clinical contexts."

In a "science brief" last updated on May 7, the CDC said "experimental and epidemiological data support community masking to reduce the spread of SARS-CoV-2." But it acknowledged that "further research is needed to expand the evidence base for the protective effect of cloth masks."

Where does that leave Americans who are unpersuaded by the existing evidence? Banned from major social media platforms, if they are not careful.

YouTube recently suspended Sen. Rand Paul's account because of a video in which the Kentucky Republican said, "most of the masks that you get over the counter don't work." This statement ran afoul of YouTube's ban on "claims that masks do not play a role in preventing the contraction or transmission of COVID-19," which is similar to policies adopted by Facebook and Twitter.

While conceding that "private companies have the right to ban me if they want to," Paul said he was troubled by the fact that the leading social media platforms, partly in response to government pressure, seem to be insisting that users toe the official line on COVID-19. He has a point.

Paul's criticism of cloth masks was stronger than the science warrants, reflecting a broader tendency on the right to dismiss them as mere talismans without seriously addressing the evidence in their favor. But rational discourse entails rebutting arguments by citing contrary evidence instead of treating them as too dangerous for people to consider.

Jacob Sullum is a senior editor at Reason magazine. Follow him on Twitter: @JacobSullum. To find out more about Jacob Sullum and read features by other Creators Syndicate writers and cartoonists, visit the Creators Syndicate webpage at www.creators.com

New Anti-Vax Disinformation Video Got 30 Million Views On Social Media

Reprinted with permission from Media Matters

A viral video pushing misleading claims about coronavirus vaccines and masks has earned at least 30 million views from uploads directly on mainstream social media platforms. In addition to this extensive view count, the video has also seemingly received millions of Facebook engagements despite these platforms' rules against coronavirus misinformation.

Previously, Facebook claimed that it would remove content from its platform that pushes false claims about vaccines. YouTube has said it prohibits content "about COVID-19 that poses a serious risk of egregious harm" or "contradicts local health authorities' or the World Health Organization's (WHO) medical information about COVID-19." TikTok has said it prohibits "misinformation related to COVID-19, vaccines, and anti-vaccine disinformation," and Twitter has said it prohibits "false or misleading information about COVID-19 which may lead to harm."

Despite those rules, the new video promoting lies about the pandemic and vaccines has already spread extensively on these platforms in just a few days.

The viral video features a man named Dan Stock -- who has said he was at the United States Capitol building during the January 6 insurrection -- speaking in front of an Indiana city's school board, where he makes multiple false claims. Calling himself a "functional family medicine physician," Stock falsely suggested that coronavirus vaccines were not effective, saying, "Why is a vaccine that is supposedly so effective having a breakout in the middle of the summer when respiratory viral syndromes don't do that?" He also falsely claimed, "People who have recovered from COVID-19 infection actually get no benefit from vaccination at all," and inaccurately alleged that masks do not work, saying that "coronavirus and all other respiratory viruses ... are spread by aerosol particles, which are small enough to go through every mask." And rather than vaccines, Stock suggested people use the drug ivermectin to treat COVID-19 -- which the FDA has specifically advised against.

A review by Media Matters found that the video has earned tens of millions of views from direct uploads on Instagram, Facebook, YouTube, Twitter, and TikTok combined.

On Instagram, uploads of the video have earned more than 4.6 million combined views. One upload, from right-wing host Sebastian Gorka, has received more than 3.5 million views alone. (In fact, Gorka's uploads of the clip on Instagram and Twitter appear to have contributed nearly 30 percent of the known views of native uploads on mainstream social media platforms.) Another Instagram upload has nearly half a million views alone. And "Disinformation Dozen" member Sherri Tenpenny, who is evading a ban on the platform, got thousands of views for her upload of the video.

[Screenshot: Sebastian Gorka's Instagram upload of the Stock video]

Uploads have also circulated on Facebook, with copies of the video earning at least 100,000 views. A page called Hancock County Indiana Patriots, which claims to have first uploaded the viral clip, got more than 90,000 views for its upload of the video, which was then shared by John Jacob, a Republican member of the Indiana House of Representatives. (Jacob also earned thousands of views for his own upload of the video.)

[Screenshot: John Jacob and Hancock County Indiana Patriots Facebook uploads of the Stock video]

On YouTube, uploads of the video have earned at least 6.5 million views. One version earned well over 3.6 million views before it was taken down for violating YouTube's community guidelines. Multiple uploads of the video -- including the one with millions of views -- also carried ads, meaning YouTube profited from spreading these harmful COVID misinformation claims.

[Screenshot: Dan Stock YouTube video running ads]

On Twitter, uploads of the video have received more than 5.5 million views. Similar to Instagram shares, most of the Twitter views come from an upload by Gorka which was shared on the platform by Rep. Jim Jordan (R-OH) and The Daily Wire's Candace Owens, among others. Gorka's upload was ultimately blocked from being shared on Twitter, but only after days of remaining active.

[Screenshot: Jim Jordan sharing Gorka's Twitter upload of the Stock video]

And on TikTok, one user's upload of the video (divided into two parts) earned roughly 14 million views alone. A member of the major TikTok conservative group Republican Hype House also uploaded the video, getting thousands of views.

[Screenshot: TikTok upload of the Stock video]

That a new coronavirus misinformation video was not just able to go viral but apparently surpass the wide spread of previous COVID conspiracy theory videos suggests that many social media platforms continue to struggle with enforcing their policies against misinformation about vaccines and COVID-19. Similarly, the video's ongoing reach shows that efforts by these platforms to label or take it down are not happening nearly fast enough to contain the spread of such harmful misinformation.

Research contributions from Olivia Little, Camden Carter, Spencer Silva, Nena Beecham, Jeremy Tuthill, Kayla Gogarty & Carly Evans.