Facebook Groups Worldwide Pushing Livestock Medications For Covid-19


Reprinted with permission from Media Matters

Facebook is allowing groups on its platform to promote the use and sale of ivermectin -- a drug typically prescribed to fight parasites in humans and large animals -- to prevent and treat COVID-19, even though the social media company claims that it removes such content as part of its policy against medical misinformation.

In the last week, the Food and Drug Administration and the Centers for Disease Control and Prevention both warned against unapproved use of ivermectin to treat or prevent COVID-19, after increased reports of patients harmed by self-medicating with ivermectin intended for horses. The drug is used to treat intestinal conditions caused by parasitic worms in both animals and people, but the large doses typically prescribed for veterinary use are dangerous for humans.

Despite these warnings -- as well as Facebook's own policy against promoting the purchase, sale, or other misinformation about ivermectin -- users on the platform are sharing ways to use ivermectin for COVID-19, with some even recommending methods for other users to acquire the drug. In fact, Media Matters has found 47 active Facebook groups with nearly 65,000 combined members centered around ivermectin and its use for COVID-19. The majority of these groups were created in 2021, and they're based around the world, including in the United States, South Africa, Malaysia, Australia, and the United Kingdom.

Facebook has taken little action against these groups, despite other reporting on violative content about ivermectin on the platform. At the time of publication, Facebook has taken down one public group, "The People's Medicine: Ivermectin; Safe Effective Economical (S E E)," that had already garnered roughly 17,000 members, and some posts promoting the use of ivermectin have been flagged with a banner warning users that "unapproved COVID-19 treatments may cause serious harm." Upon clicking on the banner, users are redirected to Facebook's COVID-19 Information Center, but they receive no other immediate information on the drug.

Dozens of other ivermectin-focused groups are still active and promoting violative content on Facebook. Group members frequently ask where to acquire a prescription for ivermectin and for information on dosage and drug combinations, and other members point them to fringe outlets such as America's Frontline Doctors or veterinarian supply stores.

In one private group -- IVERMECTIN MD TEAM -- over 27,000 members have access to this harmful misinformation. Facebook has often struggled to properly enforce its policies against COVID-19 misinformation, particularly within private Facebook groups, which can be more difficult for the platform to moderate.

[Image of a Facebook post]

Other pro-ivermectin Facebook groups are spreading similar misinformation on the platform.

[Images of Facebook posts]

In addition to ivermectin-specific groups, other anti-vaccination and pro-Trump private Facebook groups are also exchanging information on where to buy the drug and how to dose it, and sharing testimonials.

[Images of Facebook posts]

The unchecked promotion of yet another unproven treatment for COVID-19 more than a year after the disease first emerged -- particularly given the effectiveness of vaccines developed specifically to fight it -- highlights Facebook's continued failure to protect its users from dangerous medical misinformation in the midst of a deadly pandemic.

Trump PAC Raising Funds On Facebook Despite His Suspension

Reprinted with permission from Media Matters

Facebook's latest policy carve-out for former President Donald Trump, which allows Trump's political action committees to run ads as long as they are not "in his voice," has permitted Trump to fundraise and promote his events on the platform, even though he is suspended for at least two years. In return, Facebook has earned at least $10,000 in revenue on these ads.

On June 21, Politico reported that Trump's Save America Joint Fundraising Committee, a joint venture between his Make America Great Again PAC and his newer Save America leadership PAC, had started sponsoring Facebook ads on the Team Trump campaign page. The Team Trump page, which had not run any ads since the 2020 election, is also now managed by the Save America Joint Fundraising Committee, and it was run by the Trump campaign as recently as May.

Media Matters analyzed data from Facebook's Ad Library and found that Team Trump has run 258 ads since June 16, spending at least $10,200 and earning at least 1.3 million impressions on ads fundraising off Trump's visit to the border, attacking President Joe Biden, supporting Trump and "the MAGA Movement," or promoting his upcoming rally in Ohio. At time of publication, 37 of the ads are active.
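For readers curious how totals like these can be tallied from public data, here is a minimal, illustrative Python sketch against Facebook's Ad Library API (not Media Matters' actual code). The page ID, API version, and parameter values are assumptions, and because the API reports spend and impressions only as ranges, the sketch sums lower bounds, which yields "at least" figures of the kind cited above.

    # Illustrative sketch only: tally minimum spend and impressions for one page's
    # political ads via Facebook's Ad Library API. The page ID and API version are
    # placeholder assumptions; a valid Ad Library access token is required.
    import requests

    AD_ARCHIVE_URL = "https://graph.facebook.com/v12.0/ads_archive"

    def fetch_page_ads(page_id, access_token):
        """Yield ad records for a single Facebook page, following pagination."""
        params = {
            "access_token": access_token,
            "ad_type": "POLITICAL_AND_ISSUE_ADS",
            "ad_reached_countries": '["US"]',
            "search_page_ids": f"[{page_id}]",  # hypothetical Team Trump page ID
            "fields": "id,ad_delivery_start_time,spend,impressions",
            "limit": 100,
        }
        url = AD_ARCHIVE_URL
        while url:
            payload = requests.get(url, params=params).json()
            for ad in payload.get("data", []):
                yield ad
            url = payload.get("paging", {}).get("next")  # next page of results, if any
            params = None  # the "next" URL already encodes the query string

    def minimum_totals(ads):
        """Sum the lower bounds of the spend and impression ranges the API returns."""
        ads = list(ads)  # allow two passes over a generator
        spend = sum(int(ad["spend"]["lower_bound"]) for ad in ads if "spend" in ad)
        views = sum(int(ad["impressions"]["lower_bound"]) for ad in ads if "impressions" in ad)
        return spend, views

Filtering the returned records by ad_delivery_start_time (June 16 or later, in this case) and counting them would produce an ad tally of the same kind.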

Trump's fundraising committee is running these ads even though Facebook has suspended him from the platform for at least two years, citing his "acts of incitement" in order "to be a deterrent to Mr. Trump and others from committing such severe violations in future." (Responding to Politico's initial reporting about the ads, Facebook spokesperson Andy Stone said, "Groups affiliated with the former president are not barred from posting on Facebook so long as they are not posting in his voice.")

In addition to inciting violence, thousands of Trump's Facebook posts also contained misinformation, warranted an additional information label, or contained harmful rhetoric about others. Facebook allowed Trump to abuse the platform for years, with policy exemptions and weak or ineffective attempts to rein in lies from the former president and his campaign. As but one example, the platform's policy of not fact-checking politicians in ads allowed Facebook to profit from thousands of misleading ads that Trump ran spreading smears and misinformation. And in some cases -- such as with the platform's labeling system -- Facebook's policy may have actually backfired, amplifying Trump's misinformation.

Despite Trump's suspension, his Facebook and Instagram pages remain visible and his old content continues to garner new engagement. Now, this latest policy carve-out allowing "affiliated groups" to run pro-Trump ads as long as they are not "in his voice" functionally permits Trump to fundraise on Facebook and promote his events through his network of PACs.

Promoting Trump's Ohio Rally

Since June 16, Team Trump has run at least 119 ads promoting Trump's Ohio rally to be held in July, encouraging people to "get your free tickets now." At time of publication, Facebook has removed 86 of them for violating its advertising policies. (It is unclear which policy they violated.) Trump's fundraising committee spent at least $6,500 and earned more than 490,000 impressions on five different versions of these Ohio rally ads:

[Images: Team Trump Facebook ads promoting Trump's Ohio rally]

Fundraising Off Trump's Border Visit

On June 24, Team Trump started running ads fundraising off Trump's visit to the U.S.-Mexico border. At time of publication, there are at least 10 ads, four of which remain active. Trump's fundraising committee spent less than $300 and earned under 6,000 impressions on three different versions of these ads:

[Images: Team Trump Facebook ads fundraising off Trump's visit to the border]

Anti-Biden Fundraising

Since June 16, Team Trump has run at least 56 fundraising ads attacking Biden and declaring that "America is in DECLINE." At time of publication, 15 of these ads remain active. Trump's fundraising committee spent at least $900 and earned more than 140,000 impressions on three different versions of these ads:

[Images: Team Trump anti-Biden Facebook fundraising ads]

Pro-Trump Fundraising

Since June 16, Team Trump has run at least 73 fundraising ads in support of Trump and "the America First agenda." At time of publication, 18 of these ads remain active. Trump's fundraising committee spent at least $2,800 and earned more than 745,000 impressions with four different versions of these ads:

[Images: Team Trump pro-Trump Facebook fundraising ads]

Why Facebook Should Ban Trump Permanently

Reprinted with permission from Media Matters

Facebook has done the bare minimum once again regarding Donald Trump's account, postponing real action for two years. But anything short of a full ban rings hollow. Trump spent years using Facebook to push misinformation and spread extreme rhetoric against his critics -- particularly during his presidency -- and it should be easy for the platform to do the responsible thing and permanently ban his account.

On June 4, Facebook announced plans to suspend Trump for two years, leaving open the possibility of his return "if conditions permit." But if Trump is let back on, he will by all indications continue to abuse the platform to spread misinformation and attack others, as he's done for years. A two-year suspension — just one election cycle — is unlikely to change that.

Media Matters previously reported that roughly 24 percent of Trump's posts between January 1, 2020, and January 6, 2021, contained either misinformation, content warranting an additional information label, or harmful rhetoric about others. Based on his previous habits, we estimate that his two-year suspension will keep at least 2,800 posts with misinformation or extreme rhetoric off the platform, including at least 700 posts that would likely have contained election misinformation.

Facebook has let Trump abuse its platform for years. His 2016 campaign was bolstered on Facebook by fake accounts, and it used data illegally obtained from the platform. And things have not improved since. As our data on his 2020 posts shows, the 24 percent of his 2020 posts that pushed misinformation or extreme rhetoric earned 331.6 million interactions. Facebook's meager attempts to rein in Trump's lies have not been effective, and in some cases -- such as with the platform's labeling system -- they may have backfired.

Facebook's refusal to permanently ban Trump is unsurprising, as he and his extreme rhetoric are good for business. According to Pathmatics data analyzed by Media Matters, the Trump campaign spent roughly $121.5 million on Facebook ads in 2020, earning over 16 billion impressions on these ads. The only companies that spent more on Facebook ads in 2020 were Disney and HBO. And this figure includes only the money the campaign gave directly to Facebook; Trump also directed users to other misinformation spreaders that make Facebook money, putting him at the center of Facebook's lucrative right-wing misinformation ecosystem.

Despite at least one Facebook executive's claims, divisive content garners high engagement, and engagement drives Facebook's profits. The ecosystem propped up by Trump is lucrative, so it is predictable – though still disappointing – that Facebook is hesitant to upset its profit producers.

Trump's Facebook and Instagram pages remain visible, as they have throughout his suspension, and his old content continues to garner new engagement. It's unacceptable that Facebook has left visible content featuring the behaviors (spreading misinformation, inciting violence) that got him kicked off the platform in the first place, and it's emblematic of Facebook's careless content moderation.

Facebook has done everything it could to avoid taking responsibility for Trump's misuse of the platform, and this approach continues in its unwillingness to commit to permanently preventing his return. Now Facebook has all but guaranteed that Trump will make headlines again when his suspension is reevaluated. The cowardly decision is completely antithetical to the platform's stated mission – "to give people the power to build community and bring the world closer together." By failing to ban Trump outright, despite the damage he has done, Facebook has shown its true mission is increasing profit regardless of the costs.

Deceptive Anti-Vax Propaganda Still Proliferating On Facebook

Reprinted with permission from Media Matters

Facebook removed an anti-vaccine group that had already amassed more than 125,000 members, but this move will be only a minor inconvenience for the group, as members had already set up alternate channels of communication. What's more, this group is only one of over 100 active Facebook groups that contain harmful anti-vaccine misinformation.

On April 22, Facebook removed a large private group dedicated to gathering stories of people allegedly injured by the COVID-19 vaccine. According to the company, the group was removed because it had violated Facebook's harmful misinformation policies. A spokesperson told BBC News, "We allow people to discuss the COVID-19 vaccines but if information could lead to someone being harmed, we remove it."

On its face this seems like a step in the right direction for a platform that has been negligent in its response to dangerous COVID-19 misinformation. However, the group amassed more than 125,000 followers before it was taken down, and they are already successfully reorganizing, setting up a Telegram group and a new social media platform.

The removed group, COVID 19 VACCINE VICTIMS AND FAMILIES, was created on March 29, 2021, and its "About" section stated, "The idea of this group is for victims families to unite and for the victims stories to be heard so we can get justice." The group grew rapidly, particularly in the four days before removal, when it gained an average of over 12,500 members per day. Members would share anecdotes about friends and relatives receiving the COVID-19 vaccine and falling ill, claiming the vaccine was the cause.

Before the Facebook takedown, group members began expressing concerns about possible removal and preparing to organize in other spaces. Invites to a Telegram group, set up on April 5, are still prominently displayed on Facebook, with instructions for users to join the channel in order to circumvent Facebook action. On April 18, the group created a new social media platform, modeled after Facebook but explicitly for "vaccine victims." That same day, a Facebook user posted that the group had been "suspended" but said, "I think you can still join the fb group and read the stories before it's removed." The group has also launched a new Facebook group under a pseudonym. In less than a day, the new group gained roughly 1,100 members, and at the time of publishing, it has roughly 4,000 members.

Despite Facebook's action against the private group COVID 19 VACCINE VICTIMS AND FAMILIES, there is still ample anti-vaccine misinformation on the platform. Media Matters has identified 117 additional anti-vaccine Facebook groups that are still active. The roughly 275,000 members of these groups are exposed to harmful anti-vaccine content, and as nearly 80% of these groups are private, it is more difficult for Facebook to moderate them.

Of these 117 groups, some explicitly call themselves "anti-vaxx" or "anti-vaccine," while others have names similar to the group Facebook removed. Some groups are likely trying to avoid moderation by using deceptive spellings, such as "V@xynes" or "V@ccine."

The three biggest groups, with tens of thousands of members each, are plagued with vaccine misinformation, other COVID-19 misinformation, and conspiracy theories.

Vaccine Education Network : Natural Health Anti-Vaxx Community

This private group with roughly 41,800 members promotes misinformation about the COVID-19 vaccine, including false claims that the vaccine will cause serious medical problems:

[Images of Facebook posts]

MTHFR Connections: Tongue Ties, Autism, V@xynes, Leaky Gut

This private group with roughly 27,700 members is dedicated to pushing a baseless claim that the MTHFR gene causes a harmful reaction to the vaccine. Members of the group promote this claim and provide each other with medical misinformation. Egregious examples include:

[Images of Facebook posts]

JUST FOR THE HELLTH OF IT

With roughly 18,000 members, this private group promotes misinformation about the COVID-19 vaccine and conspiracy theories, and it even has screenshots of posts from the anti-vaccine group that Facebook removed:

[Images of Facebook posts]

Facebook relies on the idea that things will inevitably slip through the cracks to excuse its weak moderation efforts. Vice president of integrity Guy Rosen ended a March blog post about misinformation by noting that Facebook's "enforcement will never be perfect" and that "nobody can eliminate misinformation from the internet entirely." However, as this example shows, Facebook groups often make little effort to be subtle when telling followers where to find anti-vaccine misinformation — on the platform and off. Anti-vaccine misinformation on Facebook is not buried, and the way anti-vaccine advocates evade moderation is not a secret. Facebook's poor content moderation is inexcusable, and although the removal of one large group is a good first step, this latest lackluster effort is no substitute for sufficient moderation.

Research contributions from Carly Evans and Kellie Levine

Twitter Users Promoting Fake 'Vaccination Exemption' Cards Against Platform's Policy

Reprinted with permission from Media Matters

With research contributions by Kellie Levine

So-called "vaccination exemption" cards — which have no legal basis — are being promoted on social media platforms, including Twitter, which just updated its policy against COVID-19 vaccine misinformation. Despite this policy, tweets promoting online stores that sell these cards are still on the platform.

On March 1, Twitter announced that it would be "applying labels to tweets that may contain misleading information about COVID-19 vaccines." Earlier, in December 2020, the platform had implemented a policy against false claims about COVID-19 vaccines, saying users may have to remove tweets with false suggestions that vaccines are used for population control, widely debunked claims of alleged adverse effects from the vaccine, and false claims that the vaccine is unnecessary.

Despite Twitter's policies against COVID-19 vaccine misinformation, there are tweets promoting supposed "vaccination exemption" cards, which allegedly allow holders to invoke the right of "informed consent." This term, which typically applies to clinical trials and medical procedures, is often misleadingly used when referring to the requirement for health care providers to give patients information about the benefits and risks associated with vaccinations. These "vaccination exemption" cards have no legal basis, as there is no federal requirement for "informed consent" related to vaccinations and there are currently no COVID-19 vaccine mandates.

In addition to "vaccination exemption" cards, fake "mask exemption" cards have been promoted on Twitter and other social media platforms since mask mandates were initially implemented in April 2020, even prompting the Department of Justice to issue a warning discrediting them. These cards are still being promoted on social media, often alongside "vaccination exemption" cards, even though some posts about mask exemptions have been labeled by Facebook as containing false information. In fact, Facebook earned at least $57,000 on over 130 ads that promoted the cards. Facebook removed the majority of these ads, but they had already earned millions of impressions.

Links to buy these "vaccination exemption" cards have been posted on multiple social media platforms, including Facebook, Instagram, YouTube, and Telegram. Notably, these cards have been promoted on Twitter, sometimes in conjunction with "mask exemption" cards. The cards are still being promoted, despite Twitter's latest policy against COVID-19 vaccine misinformation.

Notable examples include:

[Images of tweets with alleged "vaccination exemption" cards]

MyPillow Guy Is Kingpin Of Disinformation On Election and Virus

Reprinted with permission from Media Matters

A new video from MyPillow CEO and Trump supporter Mike Lindell that's filled with election falsehoods is spreading on Facebook, Instagram, Twitter, TikTok, and YouTube, even though each platform has a policy prohibiting this kind of misinformation.

Lindell has been a leading voice in promoting dangerous conspiracy theories about the 2020 presidential election (and bankrolling the proliferation of this lie) across right-wing media and social media.

Twitter permanently suspended Lindell for peddling election misinformation. Lindell then attempted to use his corporate MyPillow account to evade Twitter's ban; that account was also permanently suspended.

Lindell's Facebook and Instagram accounts are both active and full of election and COVID-19 misinformation. In fact, Lindell has access to multiple accounts for himself and his company. On Facebook, he has a personal account, a professional page, and a MyPillow corporate page. On Instagram, he has a verified personal account and a MyPillow account.

Even though former President Donald Trump's multiple attempts to overturn the outcome of the 2020 election failed in courts, over 70% of likely Republican voters question the election results. Meanwhile, his supporters continue to push baseless claims of widespread voter fraud. Lindell is one of Trump's most vocal supporters to promote unsubstantiated election fraud claims and conspiracy theories, and he recently released a film that The New York Times called a "disinfomercial." In the video, titled "Absolute Proof," Lindell spent over two hours falsely claiming that Trump won the election, making wild allegations of fraud that have no basis in reality, and railing against "cancel culture."

Following the release of Lindell's video on February 5, YouTube and Vimeo removed copies for violating each platform's election integrity policies, but additional versions of the film are still being uploaded to YouTube. Facebook and Twitter have both labeled posts sharing the film as misinformation and reduced its distribution, with Facebook confirming that the "video has been rated false by one of Facebook's third-party fact-checkers so it's been labeled and its distribution is being reduced." But Media Matters has still found active posts on Facebook, Instagram, and Twitter that have no label, and TikTok has not taken any action against posts with the video, even though the platform claimed on February 3 that it was taking new steps to crack down on misinformation.

Since before the election, social media platforms have claimed that they are trying to stop the spread of election misinformation, but these platforms have failed to adequately implement or consistently enforce related policies. For example, Facebook took minimal action against election misinformation from Trump and his allies on its platforms, allowing users to organize and promote "Stop the Steal" events, such as the January 6 rally that led to the insurrection at the Capitol. Media Matters and others have documented similar failures of other platforms, such as Twitter, TikTok, and YouTube.

The limited actions of social media platforms have allowed Lindell's conspiracy-laden video to spread across Facebook, Instagram, Twitter, TikTok, and YouTube.

Facebook and Instagram

Election misinformation policy: We will attach an informational label to content that seeks to delegitimize the outcome of the election or discuss the legitimacy of voting methods, for example, by claiming that lawful methods of voting will lead to fraud.

"Absolute Proof" is the latest example of Facebook being incapable or unwilling to consistently enforce its policies. Facebook confirmed that the video violates its policy and has labeled Lindell's posts linking to the film on both Facebook and Instagram as containing false information. But Media Matters has found Facebook and Instagram posts that are not labeled, including posts with links to versions of the video hosted on other websites and alternative platforms, such as Gab and Rumble. These posts are also circulating within private Facebook groups, which have been moredifficult for Facebook to moderate and for researchers and journalists trying to hold Facebook accountable to track.

Notable examples of Instagram posts with Lindell's film include:

[Images of Instagram posts promoting "Absolute Proof"]

Twitter

Election misinformation policy: We will label or remove false or misleading information intended to undermine public confidence in an election or other civic process. This includes but is not limited to: disputed claims that could undermine faith in the process itself, such as unverified information about election rigging, ballot tampering, vote tallying, or certification of election results.

Versions of Lindell's film are also spreading on Twitter. The platform labeled an OAN tweet promoting "Absolute Proof" with a disclaimer: "This claim of election fraud is disputed, and this Tweet can't be replied to, Retweeted, or liked due to a risk of violence."

However, this standard of policy enforcement is not consistently applied to all clips of the video. The Twitter hashtag "#AbsoluteProof" displays tweets containing links to the full-length film, as well as unlabeled video clips.

Right Side Broadcasting Network also tweeted a link to the full film multiple times, but Twitter has not labeled or restricted those tweets.

TikTok

Election misinformation policy: Our Community Guidelines prohibit misinformation that could cause harm to our community or the larger public, including content that misleads people about elections or other civic processes, content distributed by disinformation campaigns, and health misinformation.

Even though it violates TikTok's election misinformation policy, "Absolute Proof" is swiftly spreading on the platform. The "Absolute Proof" hashtag on TikTok already has nearly half a million views, and all of the top videos promote Lindell's video.

Some TikTok creators are directing users to external websites to view the film in its entirety while others are uploading it in sections.

"Look what I got. … Apparently they've been taking down this documentary, so I figured I'd snag it," said one user. "I'll post some goodies that I find. And yeah, take that, big tech." This video has over 190,000 views and the account has over 57,000 followers.

YouTube

Election misinformation policy: Don't post content on YouTube if it fits any of the descriptions noted below.
Presidential Election Integrity: Content that advances false claims that widespread fraud, errors, or glitches changed the outcome of any past U.S. presidential election (Note: this applies to elections in the United States only). For the U.S. 2020 presidential election, this applies to content uploaded on or after December 9, 2020.

YouTube removed Lindell's video for violating its policies, but at the time of publication, there are many additional uploads still on YouTube. An advanced Google search for YouTube videos using the phrase "watch absolute proof" uploaded between February 5 and February 8 returned over 270 results.
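The article does not give the exact query; one illustrative way to run such a search, offered only as an example, is to combine the quoted phrase with Google's site: operator and its after:/before: date operators, adjusting the bounds as needed:

    "watch absolute proof" site:youtube.com after:2021-02-05 before:2021-02-09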

There also appears to be a coordinated YouTube spam campaign centered on the Lindell film. All of the top results feature a series of screenshots from the film with overlaid text instructing users to click the "link" below to watch. The text varies slightly with each video, but the format and messaging appear uniform. These videos each have thousands of views.

Facebook Employees Cite Fresh Evidence Of Company's Pro-Conservative Bias

Reprinted with permission from Media Matters

A new BuzzFeed report reveals that Facebook employees have evidence showing the platform gives preferential treatment to right-wing Facebook pages, a finding starkly at odds with conservatives' frequent and unsubstantiated claims that social media platforms are censoring right-wing accounts.


Anti-Vaccine Network Pushes Pandemic Conspiracies And Lies On Facebook

Reprinted with permission from Media Matters

An anti-vaccine Facebook group and network of 17 affiliated state-specific groups have been using the social media platform to spread coronavirus conspiracy theories and misinformation, including a viral video falsely claiming that wearing masks could increase chances of getting coronavirus.

United States for Medical Freedom is a Facebook group with over 28,000 members that uses seemingly benign language to obfuscate its anti-vaccine message, claiming that its goal is to fight for "Medical Freedom & Autonomy." Since the group was created in September, members and the administrators of United States for Medical Freedom have frequently posted about opposition to vaccines, including misinformation about vaccines and calls to action against vaccination policies. One of its administrators claimed she testified on behalf of the group against Massachusetts bills regulating vaccinations necessary to enroll in school.


Right-Wing Outlets Use McCabe Case To Urge Stone Pardon

Reprinted with permission from Media Matters.

Right-wing media immediately seized on the announcement that former deputy FBI Director Andrew McCabe will not face criminal charges to generate outrage and push for President Donald Trump to pardon his longtime adviser Roger Stone.

Prosecutors announced that they are not pursuing criminal charges against McCabe in a letter on February 14. The investigation began in 2018 after a referral from the Justice Department inspector general, Michael Horowitz, who alleged that McCabe misled investigators about a media leak. The new announcement from prosecutors indicates that the case against McCabe has been closed.

Right-wing media have called for months for Trump to pardon his longtime confidant Roger Stone, who was convicted in federal court on seven charges, including lying to Congress and witness tampering. In particular, right-wing media have ramped up calls for Trump to pardon Stone as the Department of Justice and Stone's attorneys submitted their recommendations to the court for his sentencing, which will take place on February 20. Trump has already granted clemency or pardons to controversial right-wing figures during his term, including Joe Arpaio and Dinesh D'Souza, and the president said on February 12 that he won't rule out a pardon for Stone.

After the announcement that McCabe will not face criminal charges, right-wing media immediately used it as an opportunity to inaccurately compare him to Stone and to argue that Trump needs to pardon Stone. In many cases, they also called for Trump to pardon former national security adviser Michael Flynn, who pleaded guilty to lying to the FBI in December 2017.

Claiming that DOJ’s treatment of these cases is hypocritical is a bad-faith argument being used to downplay Stone’s crimes and create faux outrage — common tactics of right-wing media. Here are some of the most egregious examples:

  • Human Events publisher Will Chamberlain
  • Human Events Managing Editor Ian Miles Cheong
  • Turning Point USA founder Charlie Kirk
  • Turning Point Action’s Ryan Fournier
  • Conservative radio host Buck Sexton
  • Newsmax TV host John Cardillo
  • Fox News contributor Mike Huckabee
  • One America News Network host Liz Wheeler
  • Pro-Trump conspiracy theorist and OANN host Jack Posobiec

Photo Credit: Marc Nozell