Reprinted with permission from AlterNet.
A parade of Republicans warned Facebook to stop removing right-wing content on Wednesday, accusing it of censorship and holding the cudgel of federal regulation over its head, in the second day of congressional testimony by CEO Mark Zuckerberg.
“There’s an issue of content discrimination,” said Rep. Cathy McMorris Rodgers, R-Washington. “It is not a problem unique to Facebook. There’s a number of high-profile examples of edge providers engaging in blocking and censoring religious and conservative political content… What is Facebook doing to ensure that its users are treated fairly and objectively by content reviewers?”
“It’s been said many times here that you refer to Facebook as a platform for all ideas. I know you’ve heard from many yesterday and today about concerns regarding Facebook’s censorship of content, particularly content that may promote Christian beliefs or conservative political beliefs,” said Rep. Richard Hudson, R-North Carolina. “I hope this has become very apparent to you that this is a very serious concern… My question to you is what is the standard that Facebook uses to determine what is offensive or controversial, and how has that standard been applied across Facebook’s platform?”
The GOP attack on Facebook for favoring liberal content over conservative content revives an old culture war. In the mid-1990s, then-House Speaker Newt Gingrich, a Georgia Republican, accused public radio and television of having a liberal bias, prompting NPR to include more conservative comments and viewpoints.
What’s different today, with social media in general and Facebook in particular, is that high tech’s most commonly used platforms are private. They aren’t partly taxpayer-funded like NPR. Nor do they even see themselves as media companies, which could trigger First Amendment political speech obligations. Instead they are privately owned forums with user and content rules, including standards to remove content deemed harmful.
That grey area prompted the main line of criticism from partisan Republicans during Zuckerberg’s second day of testimony, where 54 members of the House Energy and Commerce Committee had four minutes to address or question the CEO. Across the aisle, the most strident criticism from liberals concerned Facebook’s failure to protect users’ personal information, and the way that the firm’s surveillance-based business model did not give users real control over their information, despite Zuckerberg saying otherwise.
But Facebook’s handling of controversial content will have the largest ramifications for America’s political landscape in 2018’s elections and afterward. That’s because the social media platform has become an indispensable tool for officeholders, candidates and issue advocates. Facebook has not only announced new rules for buying political advertisements, requiring disclosures beyond current federal law; it has also said it will hire upwards of 20,000 people to screen posted content for hate speech and bullying, as well as to identify and remove violent threats and terrorist propaganda. Those new hires will be in addition to using computer algorithms to flag problem content.
Thus the question from North Carolina’s Rep. Hudson was key: what standard is Facebook using to determine what’s acceptable and what isn’t?
“Congressman, this is an important question,” Zuckerberg began. “There are a couple of standards. The strongest one is things that will cause physical harm, or threats of physical harm. But there is a broader standard of hate speech and speech that might make people broadly uncomfortable or unsafe in the community.”
“That’s probably the most difficult to define,” Hudson said, cutting him off. “So I guess my question is, how do you apply it? What standards do you apply to determine what’s hate speech versus what’s speech you may just disagree with?”
“Congressman, that’s a very important question that I think is one we struggle with continuously,” Zuckerberg continued. “And the question of what is hate speech versus what is legitimate political speech is something that we get criticized both from the left and right on what the definitions are that we have. It’s nuanced. And we try to lay this out in our community standards, which are public documents—that we can make sure you and your office gets to look through the definitions on this. But this is an area where I think society’s sensibilities are also shifting quickly. And it’s also very different in…”
“We’re running out of time. I hate to cut you off, but let me just say that, based on the statistics that [House GOP Whip Steve] Scalise shared [stating Facebook disproportionately prioritizes liberal content] and anecdotes we can provide you, it seems like there still is a challenge when it comes to conservatives.”
This drumbeat continued throughout the House hearing, including the next-to-last questioner, Rep. Geoff Duncan, R-Georgia, who pushed Zuckerberg to say why Facebook couldn’t voluntarily follow the same First Amendment standards as the media and allow even extremist viewpoints to be aired.
“Why not have a community standard for free speech and free exercise of religion that is simply a mirror of the First Amendment with algorithms that have a viewpoint that is neutral? Why not do that?” he asked.
“Well, congressman, I think we can all agree that certain content, like terrorist propaganda, should have no place on our network,” Zuckerberg replied. “The First Amendment, my understanding of it, is that kind of speech is allowed in the world. I just don’t think it is the kind of thing we want to spread on the Internet. So once you get into that, you’re already deciding—you take this value that you care about safety, and that we don’t want people to be able to spread information that can cause harm. And I think our general responsibility is to allow the broadest spectrum of free expression that we can…”
“I appreciate that answer,” Duncan replied. “You’re right about propaganda and other issues there. And I believe the Constitution generally applies to government and says that Congress shall make no law respecting religion, and we won’t abridge the freedom of speech or the press. But the standard has been applied to private businesses, whether those are newspapers or other media platforms. And I would argue that social media has now become a media platform to be considered in a lot of ways like other press media. So I think the First Amendment does apply and will apply.”
That exchange didn’t end there. On Tuesday, Zuckerberg told the Senate that he does not consider Facebook to be a media company, but rather a technology company. He told the House that Facebook can only be responsible for biases in the content it creates, but has a general responsibility to screen for harmful content, whether it’s political propaganda or something personally harmful. In fact, a string of Republicans from rural areas upbraided Zuckerberg for allowing pages that appeared to be trafficking in illegal opioids.
Duncan continued. “Let me ask you this, what will you do to restore the First Amendment rights of Facebook users and ensure that all users are treated equally, regardless of whether they are conservative, moderate, liberal or whatnot?”
“Well, congressman, we make a number of mistakes in content review today that, I don’t think, focus on one political persuasion,” he replied, repeating his earlier remarks. “And it’s unfortunate when that happens that people think we are focused on them. And it happens in different political groups…”
“Conservatives are the ones who raise the awareness that their content has been pulled,” Duncan interrupted. “I don’t see the same awareness being raised by liberal organizations, or liberal candidates, or liberal policy statements. And I think you’ve been made aware of this over the last two days. You probably need to go back and make sure that those things are treated equal.”
Zuckerberg grimaced after the Georgia congressman concluded his remarks. He is fully aware that right-wing websites have followed the example set by President Trump, who has made false statements and spread political propaganda to a degree unprecedented for a sitting president. An important element of the transparency rules Facebook is adopting for political advertisers is to deter the most baseless attacks and extremist discourse.
But a parade of Republicans has signaled to Zuckerberg, and indeed to all of Silicon Valley, that the GOP will attack them as partisan messengers with a liberal bias if need be, and even seek to regulate their political speech-related actions. This signal is one strong takeaway from Zuckerberg’s two days of testimony, one that strongly echoes the right’s attacks on public radio and TV dating to the mid-1990s under then-Speaker Newt Gingrich.
Steven Rosenfeld covers national political issues for AlterNet. He is the author of several books on elections, most recently Democracy Betrayed: How Superdelegates, Redistricting, Party Insiders, and the Electoral College Rigged the 2016 Election (March 2018, Hot Books).