Reprinted with permission from AlterNet.
Maybe you saw it, too. On Thursday, I read a post from Facebook executives touting their determination to rein in the propaganda monster their platform had enabled in 2016’s election.
“We are committed to protecting legitimate political discussion within our community,” Facebook said, citing its “action plan against foreign interference,” which includes “hiring 10,000 people including ad reviewers, engineers and security experts and combining their skills with advances in AI [artificial intelligence] and machine learning to identify and remove content violations and fake accounts.”
Facebook’s moves come as congressional hearings continue on how Russians, with (as we now know) a more sophisticated understanding of social media than Hillary Clinton’s 2016 campaign, planted thousands of propagandistic memes on dominant online platforms including Facebook. These moves come as the Federal Election Commission is mulling new disclosure requirements for the election platforms. (Unless anticipated upcoming FEC vacancies stall this action.)
But what’s most important, amid the recent news surrounding Facebook’s role as one of 2016’s top political propaganda platforms, is what is missing from Facebook’s statements. It’s also missing from the few media accounts about possible federal regulation of Facebook. What’s missing is the role of another communication platform owned by Facebook: Instagram. This is what Jonathan Albright, a Columbia University Journalism School professor and researcher, recently told the Senate Intelligence Committee and detailed in a long article on Medium.com.
The “most important development about the ongoing Facebook investigation isn’t the tenfold increase in the company’s updated estimate of the organic reach of ‘ads’ on its platform,” Albright wrote. “While the estimate increasing the reach of IRA content from 10 million people to 126 million people is surely a leap, after last week’s [Senate] testimony, the real question we should be asking is: how did we suddenly arrive at 150 million? The answer is Instagram.”
“My following analysis shows that Facebook’s sibling property, Instagram—a service larger than Twitter and Snapchat combined—should be seen as a major influence, targeting and engagement hub for the spread of political propaganda,” Albright continued. “How do I know? Because I amassed a huge trove of content, analytics numbers, and did the analysis to prove it. To show the reach and use of Instagram for political propaganda the last two years, I’ve collected the following evidence: First, the full profile analytics, post ‘like’ history and comment statistics, and the complete content—including text, dates and original URLs—for a sample of Instagram posts from 28 of the 170 removed accounts. These posts accounted for 2.5 million recorded interactions (not estimated ‘shares’) and 145 million projected total interactions based on Socialblade, Klear, and Keyhole influencer data.”
Albright is one of the foremost investigative researchers in this field, working with millions of real recorded social media posts and interactions, including tracing how content spreads. His latest conclusion is that Instagram, which Facebook owns, has been overlooked in the congressional scrutiny of its parent company, Google, and Twitter in understanding how social media fanned and exploited propagandistic divisions among the electorate in 2016. More important, what Albright highlights remains online today, including what he labels the thousands of “zombie” memes and other posts from 2016.
Instagram More Influential Than Twitter
“I argue here that Instagram is more pervasive than Twitter for political meme-spreading as well as viral outrage video-based behavioral re-targeting,” he said. “Part of the reason for this is because it uses the same range of Facebook’s universe of sophisticated ad targeting infrastructure—including Lookalike and Custom Audiences. The Instagram platform can even link video views to direct response and campaign objectives.”
Albright goes further, explaining that Silicon Valley’s top coders built a communications network that not only spreads viral content but leaves it lingering online, mining user interactions and compiling psychological profiles to fuel this provocative dynamic.
Let’s take these step by step.
“Instagram is also a major re-distributor of IRA [Internet Research Agency, a Russian company] memes: Two unofficial third-party ‘re-sharing’ apps on Instagram have circulated and pushed IRA content far beyond the realm of Instagram and Facebook, and embedded it all over the internet. This includes cross-posting of memes and posts from removed accounts from Instagram back into Facebook, Instagram, and also into Twitter. These apps also helped the memes get over to Pinterest.”
This recycling has led to a stunning observation, one that partly accounts for the deepening poison of arch-partisan politics: much of 2016’s political propaganda is still circulating, despite Facebook and other sites taking steps to close these accounts.
“All of the [IRA] accounts I’ve studied here have been removed, so the fact that much of their content is still lingering is a critical concern,” Albright said. “As far as I can tell, this creates a ‘zombie account’ situation. Since this content, mentions and links actually didn’t disappear when the original profiles were taken down, the true reach of the IRA content has yet to be uncovered. It’s likely that much of it has been missed in the audience reach and impact estimates.”
2016’s Propaganda Was Different
Albright goes to great lengths to describe how the political propaganda spread on social media in 2016 differed from standard political campaign TV and radio advertising. First, it wasn’t broadcasting at all. It wasn’t a dozen TV or radio ads in the fall before a general election aimed at swaths of swing voters. It was narrowcasting, created and fine-tuned to the prejudices of much more selectively targeted individuals.
“The Russian ‘ads’ were mostly promoted posts and micro-targeted calls to action,” he wrote. “They encouraged Americans to follow fake Pages, and engage with and vocalize about political and social issues through comments, ‘Reactions,’ and ‘Likes.’ Once these things happened, the campaigns used Facebook’s analytics tools and ad tech infrastructure to track and target them further. On Facebook, Instagram, and across the internet through Facebook’s ‘Audience Network’ and other partner marketing channels like Acxiom.”
Notably, social media’s visual elements preyed on the psychological aspects of viewers’ impressions and quick takes.
“From what I’ve seen—and I’ve seen thousands of threads and posts—I feel this makes the debates on Instagram more focused on the issue, video, or meme sitting at the top of the thread. For sowing division and finding wedge issues, Instagram is an ideal visual meme broadcast factory,” wrote Albright. “Just as Facebook’s recent statement describes—these ‘ads’ were part of a wider effort intended not just to ‘sway an election,’ but rather to ‘create a sustained relationship whereby users subsequently posted about topics and issues pushed out by the accounts.’ These are something like a psychological Trojan horse.”
The technical backends of Facebook and Instagram weren’t just Trojan horses, sneaking messaging in past unsuspecting viewers. They goaded viewers toward more embittered or emotional thoughts and actions, then cataloged those reactions for further use by ad-buying propagandists.
“Users were encouraged to follow profiles, ‘turn on their notifications’ for the account’s newly shared posts, respond to outrage videos, and visit the hundreds of websites and online stores the Instagram profiles linked to in their bios,” Albright wrote.
Silicon Valley Created This Monster
“As I’ve consistently argued, American tech companies have set up the infrastructure needed to ‘hack an election,’” Albright wrote. “Russian groups simply purchased the ability to target specific groups of Americans before, during, and after the election through Facebook’s self-service psychographic advertising services.”
“This provides the impetus whereby Americans can be tracked and re-targeted through other affiliate technologies, data profiling services, and served ‘cookies’ and device fingerprinting outside the scope of Facebook’s reach,” he continued. “The point is, the effort wasn’t just to ‘influence an election.’ It wasn’t just to get people to ‘follow’ the Pages and ‘turn on notifications’ and sign up for fake events.”
Albright’s analysis goes on to describe how social media is transforming political campaigns and elections, and how the companies behind social media platforms are prying into the cracks and crevices of people’s lives and compiling digital dossiers to provoke behaviors and outcomes.
“These campaigns were done with the intent to direct people to third-party websites, install mobile apps, engage with outrageous ‘viral’ content, and collect email, address, and payment information during ‘shopping cart’ checkouts,” Albright said. “Even at its most basic, the Facebook ad infrastructure can be used to unknowingly recruit friends, family members, and co-workers for sponsored messages and political data-driven micro-targeting.”
“People were also prompted to take action on deep-seated controversial issues,” he continued. “In some cases, ‘ads’ succeeded in recruiting Americans to physically attend the fake Facebook ‘Events.’ But this, of course, probably meant better re-targeting. The persistence of the Facebook ad tech and self-service audience segmentation tools greatly enhance the ability of coordinated influence campaigns to shape longer-term behaviors and attitudes at the population level.”
Albright’s message during his Senate Intelligence Committee testimony is that we need to consider the deep “behavioral reach and data privacy implications of these attempts to create a ‘sustained relationship with the Pages.’” While that warning may be too much to ask of a GOP-dominated Congress, since no political party wants to question the tools that elected it, his warning is sobering.
It also shows how meager Facebook’s latest public relations messaging is in response to the political turmoil its network engendered.
“We’re improving our systems to keep activity on Facebook authentic,” Facebook’s Action Plan said. “This includes hiring 10,000 people including ad reviewers, engineers and security experts and combining their skills with advances in AI and machine learning to identify and remove content violations and fake accounts… We’re updating our policy to block ads from Pages that repeatedly share stories marked as false by third-party fact-checking organizations.”
These steps come long after the Trojan horses left Facebook’s barns, and they rely on much the same tools Facebook used to build a platform it cannot control in order to self-police. As Albright strongly suggests, we are in a politicized world, and we have no good way to stop the digital barbarians at the gate.