
On YouTube, Conspiracists, Quacks, And Crooks Promote COVID-19 Misinformation

Reprinted with permission from Media Matters

In a March 6 blog post, Google and Alphabet CEO Sundar Pichai laid out the company's plans for responding to the COVID-19 pandemic. Among Pichai's announcements was a commitment to removing YouTube videos containing misinformation about preventing the novel coronavirus. This week, YouTube expanded its efforts even further by announcing that it would begin showing fact checks within search results in the U.S., something it began doing last year in India and Brazil.

On the surface, it looks as though the company is taking big, positive steps to prevent the spread of misinformation during a public health emergency. Unfortunately for YouTube, its efforts haven't kept up with its ambition, and the result contributes to a challenging swirl of conflicting information that the World Health Organization has dubbed an "infodemic."


Conspiracy theory YouTube channels like The Next News Network continue to thrive despite obvious violations of YouTube's efforts to ban COVID-19 misinformation. On April 16, Next News Network posted a video suggesting the novel coronavirus was a "false flag" attack meant to force people into getting "mandatory vaccines" and microchips. It was only after the video received nearly 7 million views that YouTube pulled it. Though it was removed, other channels reuploaded the video, illustrating just how fruitless moderation efforts can be. (The platform eventually pulled dozens of the reuploads, but only after Media Matters reported on them.)

In early April, the Tech Transparency Project published a report finding that not only were some YouTube videos claiming to have the recipe for a coronavirus "cure," but ads were running alongside those clips, meaning that creators were earning revenue from them. YouTube had previously announced that it would approach monetization around the virus with caution to avoid financially supporting misinformation. It wasn't until The Guardian reported on the project's findings that YouTube removed any of the videos, and even then, several remained posted.

One popular coronavirus conspiracy theory involves Microsoft co-founder Bill Gates. According to the theory, which is based on a misrepresentation of Gates' past remarks, Gates wants to control and depopulate the world using a microchip. This conspiracy theory has been floating around major conspiracy circles for months -- including in a video that was posted to the High Frequency Radio YouTube page on March 17. As of May 1, High Frequency Radio's video remains live on YouTube.

On April 15, BuzzFeed reported on a conspiracy theory-riddled YouTube video titled "Dr. SHIVA Ayyadurai, MIT PhD Crushes Dr. Fauci Exposes Birx, Clintons, Bill Gates, And The W.H.O." Among the video's comments is the recommendation that people treat COVID-19 with vitamin C, something YouTube CEO Susan Wojcicki specifically mentioned as an example of content the site was removing during an April 19 appearance on CNN's Reliable Sources. At the time of BuzzFeed's story, the video, which was posted on April 10, had been viewed more than 6 million times. As of May 1, the video remains live on YouTube and has more than 7.2 million views.

Recently ousted Fox News personalities Diamond & Silk have repeatedly spread outrageous COVID-19 conspiracy theories on YouTube. During a March 29 livestream, they claimed that deaths attributed to COVID-19 were being inflated to make President Donald Trump look bad. During an April 10 livestream, the duo claimed that Gates was using the virus to push for population control and that 5G cell towers were responsible for the virus's spread. On an April 20 livestream, they suggested that the World Health Organization could control the virus with an on/off switch. All three of those videos remain live on YouTube as of May 1.

YouTube has had a robust set of creator guidelines for years, but its record of following through on the enforcement of those policies has been less than stellar. For instance, there's a history of LGBTQ YouTubers having their videos wrongfully hidden, demonetized, or restricted. At the same time, the site had previously determined that slurring Carlos Maza as a "lispy queer" wasn't a violation of its anti-harassment policies (Maza previously worked at Media Matters). Enforcement has been hit-and-miss, even in cases where creators are very clearly in violation of community guidelines.

One thing YouTube has done more recently is shift to more automated content moderation, which comes with both positives and negatives. The company announced the increased reliance on this moderation technique as part of its plan to reduce the number of people who need to travel into a physical office during the pandemic. In a blog post, Google notes that though the company has made large investments into an automated moderation infrastructure for YouTube, "they are not always as accurate or granular in their analysis of content as human reviewers." In other words, some content that violates YouTube's policies will remain live, and some content that doesn't violate the policies will be mistakenly removed.
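To make that tradeoff concrete, here is a minimal, hypothetical sketch of threshold-based automated moderation. Every title, score, and threshold below is invented for illustration; nothing here reflects YouTube's actual classifiers or policies:

```python
# Hypothetical illustration of the accuracy tradeoff in automated moderation:
# a classifier scores each video, and a removal threshold trades false
# negatives (violating content left live) against false positives
# (legitimate content mistakenly removed).

videos = [
    # (title, classifier_score, actually_violates_policy)
    ("miracle cure recipe",     0.92, True),
    ("5G causes virus claim",   0.71, True),
    ("news report on cures",    0.65, False),  # discusses misinformation without promoting it
    ("home workout tips",       0.10, False),
    ("microchip vaccine claim", 0.55, True),   # avoids obvious keywords, so it scores low
]

def moderate(threshold):
    """Remove every video scoring at or above the threshold; report the errors."""
    removed = [v for v in videos if v[1] >= threshold]
    left_live_violating = [v for v in videos if v[2] and v[1] < threshold]
    wrongly_removed = [v for v in removed if not v[2]]
    return removed, left_live_violating, wrongly_removed

for threshold in (0.8, 0.6):
    removed, fn, fp = moderate(threshold)
    print(f"threshold={threshold}: removed {len(removed)}, "
          f"violating but left live {len(fn)}, wrongly removed {len(fp)}")
```

Lowering the threshold catches more violating videos but sweeps up more legitimate ones, which is exactly the accuracy tension Google's blog post acknowledges.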

It's not perfect, but it is a refreshingly honest explanation of how flawed online content moderation can be. Unfortunately, honesty doesn't change the fact that one of the world's biggest and most consequential media platforms contributes to public confusion about such a serious topic. YouTube contributes to what the World Health Organization calls an "infodemic," which the organization describes as "an over-abundance of information -- some accurate and some not -- that makes it hard for people to find trustworthy sources and reliable guidance when they need it."

While it's welcome news that YouTube is taking steps to drive users in the direction of authoritative sources, it doesn't make up for the fact that years of failing to live up to its lofty moderation enforcement goals have left the world confused and struggling to parse accurate and inaccurate information.

Writing for The Guardian, Julia Carrie Wong explains what happens when you search the term "coronavirus" on Google. Rather than ads, product recommendations, or untrustworthy websites that just happen to be optimized for that particular search -- which often come up via Google searches -- you'll receive links to mainstream news outlets, government announcements, and information from trusted nongovernmental organizations. "Algorithms and user-generated content are out; gatekeepers and fact checking are in," she writes. "Silicon Valley has responded to the 'infodemic' with aggressive intervention and an embrace of official sources and traditional media outlets."

Unfortunately, she notes, these steps come far too late to stem the spread of misinformation, even in this specific emergency. An entire culture of conspiratorial thinking has built up under the systems put in place by tech companies, and past reluctance to enforce their own guidelines or grapple with the social consequences of not having adequate guidelines (no site wants to outright say that it welcomes hate speech, threats, or misinformation) has driven the public to a dangerous place. If someone has spent years watching someone like Alex Jones spread conspiracy theories about all manner of things, why would they suddenly brush him off as untrustworthy and instead take advice from a mainstream news outlet?

"[Social media companies have] built this whole ecosystem that is all about engagement, allows viral spread, and hasn't ever put any currency on accuracy," University of Washington professor Carl Bergstrom told the Guardian. "Now all of a sudden we have a serious global crisis, and they want to put some Band-Aids on it. It's better than not acting, but praising them for doing it is like praising Philip Morris for putting filters on cigarettes."

Just as one could argue that the White House failed to adequately prepare for a public health emergency, the same can be said of YouTube. There is a human cost to the commodification of misinformation, and it'll take a lot more than scattershot video takedowns or fact check boxes to make this right.

The platform needs to reinvent itself as a place that incentivizes facts and legitimacy over engagement and other social media metrics. The COVID-19 pandemic was a crisis paired with an infodemic; the next one doesn't have to be.

YouTube Still Profiting From Extremist Huckster Alex Jones

Alex Jones, whose Infowars outlet is largely banned from YouTube, is re-emerging on the platform through appearances he and his staff members are making on hugely popular YouTube shows. Many of these programs are monetized through commercials, and YouTube profits off of them because it shares ad revenue with its broadcasters.

On February 27, Jones appeared for nearly five hours on Joe Rogan’s show The Joe Rogan Experience. Rogan and Jones have known each other since the early 2000s and have appeared on each other’s shows. But earlier this year, they were involved in a dispute over comments Jones made about the Sandy Hook Elementary School mass shooting. (Jones has definitively declared the shooting a hoax multiple times but has attempted to spin those comments to rehab his image in recent years.) It is hard to say whether the dispute was genuine or just a ploy to attract attention, but they made up before the February 27 show and the episode has been viewed over 5 million times so far.

Notably, at the top of the lengthy broadcast, Rogan attacked critics who say he shouldn’t give Jones a platform before assisting Jones in spinning his past comments on Sandy Hook. The broadcast features multiple commercial breaks, meaning that Rogan’s channel — which itself has over 4.6 million subscribers — and YouTube are sharing advertising profits for the video.

Forbes senior contributor Dani Di Placido described Jones’ appearance on Rogan’s show as “little more than another unhinged speech from Jones, who has enough energy to feverishly rant about aliens, artificial intelligence and Hillary Clinton for almost five hours solid.” He also noted that Rogan views Jones as “the guy who always provides a wild conversation, as long as you can tolerate listening to his rapid-fire fantasies.”

Jones’ appearance on Rogan’s show appears to not be a one-off occurrence but rather a new tactic to skirt the varying bans imposed on him and his outlet by major social media platforms, including Facebook, YouTube, and Twitter.

Infowars personality Kaitlin Bennett — aka the “Kent State Gun Girl” — also made a lengthy appearance on the massively popular Impaulsive Podcast on February 25. The channel has nearly 1.3 million subscribers on YouTube, and Logan Paul, the primary personality, has nearly 19 million subscribers on his personal channel.

During the podcast, Paul and his two co-hosts ostensibly sought to debate Bennett, who is known for engaging in ridiculous far-right stunts, on issues related to gun regulation and other topics. Instead, the overall effect of the interview was to normalize her brand of commentary. Paul introduced Bennett by calling her “a very controversial guest, arguably more than myself.” He and his co-hosts then bolstered Bennett’s points at times during their discussion. Show co-host Mike Majlak encouraged Bennett to disassociate herself from Jones, saying Bennett’s “very strong points” are diminished by the association. Toward the end of the video, Paul told Bennett she makes “a lot of valid points” but should consider ways she could be more effective with her message.

Throughout the appearance, Paul and his co-hosts appeared woefully unprepared to debate Bennett on specific claims. At one point, they gave her a veneer of legitimacy after she cited the widely known fact that the Transportation Security Administration (TSA) does not have a very high success rate in confiscating prohibited items during airport security checks; one of the co-hosts fact-checked her and ruled that she was correct in her claim.

Like Jones’ appearance on Rogan’s podcast, Bennett’s appearance on Impaulsive Podcast was monetized.

Even though YouTube has banned Jones’ primary account and many of his related channels, Infowars was able to piggyback on Bennett’s appearance on the platform. YouTube still allows Infowars contributor Millie Weaver to maintain a channel, and she posted a recap of Bennett’s appearance titled “Logan Paul Gets Red Pilled,” referring to a quote from The Matrix that is now mostly used to describe someone being convinced to adopt far-right beliefs. The recap video features Infowars’ watermark and ends with Weaver giving a pitch for Infowars’ website and online supplement store.

Beyond the monetization issue, the forays by Infowars figures back into YouTube show an attempt by Jones to emulate the strategy of other fringe right-wing operations to reach an untapped younger audience. Following her appearance, Bennett appeared on a segment on The Alex Jones Show, and Jones noted that Paul reaches “tens of millions of people,” saying, “It’s kind of the college kids that the tweenies and 13-year-olds look up to.” Jones said he was “glad” Bennett went on the show because it is “important” for Infowars to reach young people. Bennett said that by appearing on the program, she gave Paul’s audience a “perspective on gun rights and the Second Amendment that they probably didn’t think they were ever going to watch. So that’s out there now.”

Jones said that he wants to appear on Paul’s show too: “I would love to invite Logan Paul on the show. I would also love to go on that broadcast because I’d like to be able to speak to my oldest daughter’s audience and tell my daughter, ‘Now, be good, and don’t vape like the other girls.’”

#EndorseThis: Hasan Minhaj, Roasting Trump, Is Tops On YouTube

Trending #1 at the moment on YouTube is this full-length video of Hasan Minhaj’s monologue at the White House Correspondents’ Association dinner on Saturday night. No doubt that will reassure the Daily Show correspondent, who stood before the assembled Washington press corps and vowed to hire Kellyanne Conway to say he killed even if he bombed.

Minhaj, a Muslim immigrant from India, is a talented comic and did his best to entertain a sometimes humorless crowd, which groaned at jokes that you may well consider hilarious. “No one wanted to do this,” he kidded, “so of course it lands in the hands of an immigrant. Don Rickles died so you wouldn’t ask him to do this gig.”

In keeping with the event’s traditions, he roasted the nation’s top media organizations — and of course he roasted the absent president and that man’s minions at high temperature, inflicting the harshest burn on a certain strategist (“I do not see Steve Bannon here…not see Bannon…not-see Bannon…Nazi Bannon”). And while a lot of Minhaj’s jokes work — and he’s a charming guy — it’s a long speech. If you don’t have much time, you may want to view the Washington Post highlights reel.

The complete version offers many rewards, however, including the Spicer jokes, the Ivanka jokes, the Sessions jokes, the Putin jokes, and the many Trump jokes.

Orlando Nightclub Victims’ Families Sue Twitter, Google, And Facebook

(Reuters) – The families of three men killed at Orlando’s Pulse gay nightclub have sued Twitter Inc, Alphabet Inc’s Google and Facebook Inc in federal court, accusing the companies of providing “material support” to the self-radicalized gunman.

The gunman, 29-year-old Omar Mateen, who killed 49 people and wounded 53 in the deadliest mass shooting in modern U.S. history, pledged allegiance to the Islamic State militant group before police fatally shot him after the June attack, officials said.

The lawsuit was filed on Monday in Detroit federal court by the families of Tevin Crosby, Javier Jorge-Reyes and Juan Ramon Guerrero, who were killed during the massacre.

Similar lawsuits in the past have faced an uphill fight because of strong protections in U.S. federal law for the technology industry.

The three families claim Twitter, Google’s YouTube and Facebook “provided the terrorist group ISIS with accounts they use to spread extremist propaganda, raise funds and attract new recruits.”

The suit alleges the “material support has been instrumental to the rise of ISIS and has enabled it to carry out or cause to be carried out, numerous terrorist attacks.”

Facebook said on Tuesday there is no place on its service for groups that engage in or support terrorism, and that it takes swift action to remove that content when it is reported.

“We are committed to providing a service where people feel safe when using Facebook,” it said in a statement. “We sympathize with the victims and their families.”

Twitter declined to comment. In August, the company said it had suspended 360,000 accounts since mid-2015 for violating policies related to promotion of terrorism.

Representatives of Google could not immediately be reached.

The three companies plus Microsoft Corp said this month they would coordinate more to remove extremist content, sharing digital “fingerprints” with each other.
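The mechanics of that fingerprint sharing can be sketched in a few lines. The companies' actual shared database uses perceptual hashes (in the vein of Microsoft's PhotoDNA) so that resized or re-encoded copies still match; the plain SHA-256 hash used below is a simplified stand-in, for illustration only:

```python
# Simplified sketch of hash-based "fingerprint" sharing between platforms.
# Placeholder byte strings stand in for real uploaded files.

import hashlib

def fingerprint(data: bytes) -> str:
    """Hash an uploaded file's raw bytes into a shareable fingerprint."""
    return hashlib.sha256(data).hexdigest()

# Fingerprints of content one platform has already reviewed and removed,
# shared with the other companies instead of the files themselves.
shared_blocklist = {fingerprint(b"<bytes of a known extremist video>")}

def should_block(upload: bytes) -> bool:
    """Check a new upload against the shared fingerprint list."""
    return fingerprint(upload) in shared_blocklist

print(should_block(b"<bytes of a known extremist video>"))  # True: known file
print(should_block(b"<an unrelated upload>"))               # False: no match
```

The design choice matters: sharing hashes rather than the files themselves lets each company block known material on upload without redistributing the content or reviewing it again from scratch.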

Technology companies are protected from many lawsuits under Section 230 of the federal Communications Decency Act, which says website operators are not liable for content posted by others.

Monday’s lawsuit claims that the companies create unique content by combining ISIS postings with advertisements to target the viewer. It also says they share revenue with ISIS for its content and profit from ISIS postings through advertising revenue.

The families in the case in Michigan, where one of the victims is from, are seeking damages and a ruling that the sites violated the U.S. Anti-Terrorism Act.

(Reporting by Brendan O’Brien in Milwaukee and David Ingram in New York; Editing by Scott Malone and Andrew Hay)

IMAGE: A person rubs an “#Orlando United” sticker on the sign pole outside Pulse nightclub following the mass shooting in Orlando, Florida, U.S., June 21, 2016. REUTERS/Carlo Allegri