Facebook Study Pins ‘Echo Chamber’ Of Opinions On Users’ Choices

By Sarah Frier, Bloomberg News (TNS)

SAN FRANCISCO — Facing heat for conducting a psychological experiment on unwitting users, Facebook Inc. last summer began another study to see whether the social network insulates members from diverse opinions. The conclusion: Nope.

Facebook researchers sought to determine whether the company’s customized news feeds seal off users from a variety of perspectives. The study concluded that a user’s personal choices about what to click on have a greater effect than Facebook’s own formula for the news feed, the stream of postings that appears on a user’s account, according to findings published Thursday in Science magazine.

Now that almost half the Internet-connected population is on the social network, Facebook is seeking to understand its cultural effects, both to inform product development and to contribute to academic research. The Science report adds to the limited public knowledge about how the network’s 1.4 billion users behave socially.

Researchers began the study in July, as Facebook faced scrutiny from regulators for conducting an experiment without user consent to see how making news feeds more positive or negative would affect people’s moods. The Menlo Park, California-based company tightened its standards for research in October.

The study published Thursday, for which researchers anonymously reviewed 10.1 million U.S. accounts, acknowledges an “echo chamber” effect in which users predominantly see views they agree with. But the researchers found that the network’s news feed algorithm isn’t the main cause.

“Our work shows that social media exposes individuals to at least some ideologically cross-cutting viewpoints,” wrote the researchers, who came from Facebook and the University of Michigan. “The power to expose oneself to perspectives from the other side in social media lies first and foremost with individuals.”

In a completely random news feed, more than 40 percent of the hard-news content would come from opposing points of view, the study found. Once friendships and the popularity of postings are accounted for, that percentage declines. Liberals tend to see a narrower range of opinions: They get 24 percent of their hard news from conservatives, while conservatives get 35 percent from liberals.

Facebook’s news feed algorithm, which gives priority to postings from close friends and other content deemed relevant to the individual, reduces exposure to opposing opinions by eight percent for liberals and five percent for conservatives.

What a user clicks on has a greater effect cumulatively. A liberal user’s own clicks cut exposure to opposing views by slightly less than the algorithm does, six percent, while a conservative’s choices reduce that exposure by 17 percent, the study found.
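As a rough illustration of how these reported reductions compound, the sketch below applies the article’s percentages in sequence, starting from the share of hard news that friends supply from the other side. The staging, the variable names, and the assumption that each reduction is relative and multiplicative are illustrative guesses, not the study’s published method.

```python
# Illustrative arithmetic only. The percentages are the figures reported
# in this article; the multiplicative staging is an assumption made for
# illustration, not the study's actual methodology.

def apply_reduction(share_pct, reduction_pct):
    """Apply a relative percentage reduction to a cross-cutting share."""
    return share_pct * (1 - reduction_pct / 100)

# Share of hard news from the opposing side, after accounting for
# friendships and post popularity (reported above).
baseline = {"liberals": 24.0, "conservatives": 35.0}

# Relative reductions reported above: the news feed algorithm first,
# then the user's own clicks.
algorithm_cut = {"liberals": 8, "conservatives": 5}
click_cut = {"liberals": 6, "conservatives": 17}

for group, share in baseline.items():
    after_ranking = apply_reduction(share, algorithm_cut[group])
    after_clicks = apply_reduction(after_ranking, click_cut[group])
    print(f"{group}: {share:.1f}% from friends -> "
          f"{after_ranking:.1f}% after ranking -> "
          f"{after_clicks:.1f}% after clicks")
```

Under those assumptions, a liberal’s cross-cutting share would fall from 24 percent to roughly 22 percent after ranking and about 21 percent after clicks, while a conservative’s would fall from 35 percent to about 33 percent and then to roughly 28 percent.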

“On average, more than 20 percent of an individual’s Facebook friends who report an ideological affiliation are from the opposing party, leaving substantial room for exposure to opposing viewpoints,” the study found.

Photo: Mixy Lorenzo via Flickr