Instagram Played A Much Bigger Role In Russia’s Propaganda Campaign
The scale of Russia’s three-year-long misinformation war on America keeps growing.
Testifying alongside Google and Twitter, Facebook was the focus at hearings on Capitol Hill last week, where impatient lawmakers demanded more details about the propaganda war waged on their platforms. Far more Americans had been exposed to Russia-backed propaganda on Facebook and its Instagram app than on any other platform: an estimated 146 million people, or about half of the U.S. population, had seen a Russia-funded post sometime between January 2015 and this past August, the company told Congress.
But the number is likely much higher. A new analysis of since-deleted posts from some of the known Russia-linked Instagram accounts suggests that the Kremlin’s misinformation campaign was far more widespread on Instagram than parent company Facebook, lawmakers, or others have known or acknowledged.
“For sowing division and finding wedge issues, Instagram is an ideal visual meme broadcast factory,” said Jonathan Albright, research director of the Tow Center for Digital Journalism at Columbia University, who examined the data surrounding a sampling of the roughly 120,000 posts created by the St. Petersburg-based Internet Research Agency.
So far, the bulk of the attention on Russia’s misinformation effort has been on Facebook, but the Russian campaign on Instagram may have been just as active, said Albright. With 800 million monthly users, Instagram is larger than Twitter and Snapchat combined. Facebook’s data and ad targeting tools, and an emphasis on highly visually engaging content, make it an especially potent machine for political meme-spreading. “In my opinion, the platform is far more impactful than Twitter for content-based ‘meme’ engagement—especially for certain minority segments of the American population,” he said.
The promoted posts and micro-targeted calls to action—which in some cases encouraged people to protest against each other in real life—were part of a wider barrage of organic social media content intended not just to sway an election, as Facebook acknowledged last week, but to “sow division and discord,” and to create a sustained, ongoing relationship whereby American users subsequently posted about topics and issues pushed out by the accounts.
“Many of these ads and posts are inflammatory,” Facebook said in its testimony. “Some are downright offensive.” Last month, the Congressional Black Caucus put it more bluntly: Facebook was a “Trojan horse through which America’s vulnerabilities are exploited,” through pages that promoted “incendiary anti-immigrant rallies” and planted ads “designed to inflame and exploit racial, political and economic rifts in the U.S.”
Late last month, Facebook again revised up its estimates of the number of people who had seen the Russia-linked posts. The company said that in addition to the 126 million U.S. users who had seen the posts on Facebook, a total of 20 million more people had seen them on Instagram.
Albright’s data suggests that Facebook’s latest estimate is low. He examined a small fraction of the posts—just 200—and found that this batch alone had racked up 2.5 million likes, video views, and comments, and a projected 145 million total interactions. His projection also does not include untold numbers of cross-posts to other services like Facebook, Twitter, and Pinterest, or reposts that even now continue to spread on Instagram itself.
Adding More Detail To “Less Reliable” Data
One critical issue is how many Instagram users saw posts in the year and a half prior to October 2016, a month before the election. Under questioning at last week’s hearings, Facebook’s general counsel said that most of the 20 million Instagram users who saw a Russia-funded post—people who had not also seen one on Facebook—saw it after October 2016. Prior to that date, he said, only “an incremental 4 million” had likely seen a Russia-linked post. But, he cautioned, during this timeframe “our data is less reliable.”
Data about that timeframe before October 2016—that is, the nearly two years leading up to election day—would be critical for understanding the Kremlin’s nearly three-year-long misinformation campaign. In response to an emailed question about the “less reliable” Instagram data, a Facebook spokesperson said that Instagram “impression logging” only goes back to October 2016, leaving the company without comparable Instagram data on the number of people who might have seen these posts before then. He declined to share other details, and did not respond to a further request for clarification.
The numbers gathered by Albright offer some more clarity. To do his analysis, he collected data on the top Instagram posts from 28 of the accounts—“like” history and comment statistics, and the complete content, including text, dates, and original URLs—using the analytics tools SocialBlade, Klear, and Keyhole.
Albright then verified two of the accounts’ data with Facebook’s own CrowdTangle tool, which helps advertisers to measure the impact of their Facebook campaigns. The company has since removed all metrics around the Russia-linked accounts from CrowdTangle. “It’s all gone,” Albright told me. “I got lucky and pulled that data weeks ago.”
During the period prior to October 2016, when Facebook said 4 million people had seen Russia-linked memes on Instagram, posts from Albright’s sampling of accounts, published between January 2016 and September 2016, garnered more than 600,000 interactions. Given that he was looking at just 28 of about 170 known accounts, he said, that would suggest that the images “reached far more than 4 million people in the United States prior to October 2016.”
Similarly, Albright reasoned that the rest of the Russia-linked Instagram posts likely reached far more than the 16 million U.S. users Facebook has estimated. Between October 2016 and August 2017, for instance, a single Instagram account, the LGBT-focused @rainbow_nation_us, garnered more than 9 million likes and comments. Another account, @blacks_go_viral, had 4 million video views over a similar time period.
The 28 accounts Albright examined had 2.4 million followers in total, with a median follower count of 75,000, and, he estimated, a typical lifespan of about one and a half years. Given the nature of the content—focused on popular keywords like #blacklivesmatter or #sayhername—most followers “were likely to be fairly engaged,” Albright said. Without more precise data, however, it’s hard to know with certainty how many Americans saw these images in their Instagram feeds and elsewhere.
Zombie Accounts Keep Russian Memes Alive
The Instagram posts seemed aimed at pushing political and cultural pendulums in every possible direction. Sometimes sent out as targeted, sponsored posts nearly simultaneously across both Facebook and Instagram—a clever way to pick up new followers for their associated accounts—the memes at times included incendiary language about topics like immigration or policing, or happily encouraged pride around racial or sexual identity. There were images and hashtags referencing Black Lives Matter and targeting Ferguson and Baltimore; sponsored posts that promoted feminism and queer pride, that impersonated an American Muslim organization, or that backed Jill Stein, Bernie Sanders, and, most of all, Donald Trump.
But while some posts urged support for candidate Trump or derided Hillary Clinton, many had little to do with the election. In fact, many of the memes were sent long after it was over.
One Instagram post, sent this summer on the @secured_borders Instagram account, took aim at Republican Senator John McCain (R-AZ), a frequent critic of President Trump. In a photo, the investor George Soros is seen speaking to McCain. “Hey Johnny,” the text blares in red and yellow, “I’m paying you a fortune so listen to me closely! I don’t care how much cancer you have, get back to DC & backstab Trump any way you can! Globalist elites need you!” (McCain is co-sponsoring a bill aimed at bringing more transparency to online campaign ads.)
Secured Borders, which posted over 6,000 photos and videos and racked up over 48,800 followers by the time it was closed in August 2017, urged stricter immigration policy using inflammatory, sometimes racist and xenophobic language and imagery. This account and three others examined by Fast Company often echoed corresponding Facebook pages in style and tone, and they posted content within 24 hours of similar posts appearing on the Facebook pages, suggesting that the images may have been automatically republished. (Facebook provides advertisers a single portal through which they can schedule sponsored posts across Facebook and Instagram simultaneously.)
Another account, @Blackstagram_, which focused on African-American culture and activism, racked up 579,500 likes and comments across just eight posts this past summer. By the time it was shut down in August, the account had published at least 3,575 posts, and had over 199,000 followers.
While Facebook and Instagram have deleted the accounts and their posts, many of the images remain on other platforms and even continue to spread on Facebook and Instagram themselves, through possibly legitimate accounts that had reposted the content before it was removed.
Posts from the Instagram account Merican Fury, which had racked up 81,000 followers before it was removed in recent months, continue to spread across the platform. Just this week, one Instagram post that shows a woman in a hijab and ascribes hatefulness to Islam was reposted by an American conservative political meme account with 90,000 followers. It received over 5,000 likes in less than 48 hours.
One post by @secured_borders, published in July and still available elsewhere on Instagram, rails in typical fashion against sanctuary cities and “illegal aliens” with talk of swift justice for even the politicians who support the idea.
“If we want to REALLY end these ‘sanctuary cities’, we should start hauling handcuffed city officials who are accessories to the crime!” the text accompanying the post screams. The image itself only urges people to “SUE” the mayors supporting sanctuary cities. “Perp walk mayor blowhard into a FEDERAL slammer on prime time news. THAT will get this ‘sanctuary’ crap over with in quick time! [sic] … No city in the U.S. should be a sanctuary for illegal alien scum, period!”
Albright said he had found more than 5,000 “obvious” Internet Research Agency (IRA) posts currently residing outside of Instagram and Facebook, and nearly 2,000 memes still live on Instagram. “The true reach of the IRA content has yet to be uncovered,” he said. “It’s likely that much of it has been missed in the audience reach and impact estimates.”
Facebook has revised its data around the Russia-linked influence campaigns a number of times. When it first acknowledged the Russia-linked accounts publicly in September, the company said it would keep the posts and metadata private in accordance with its policies. The following month, it began turning over Facebook sponsored posts and data to Congressional investigators, but the company didn’t publicly discuss the use of Instagram by Russia-linked accounts until after an inquiry by Fast Company in October.
A spokesperson for Senator Mark Warner (D-VA), who has led an investigation into Russian social media activity, did not respond to requests for comment about Russian activity on Instagram specifically.
Among the more surprising Russia-funded Instagram posts to survive deletion by the platform was an illustration that was re-shared on the Instagram account for Ramona Magazine, a U.S. magazine for teenage girls. With a positive message about various female body types, the image was reposted from a now-deleted Russia-linked account, Intersectional Feminism (@Feminism_Tag), on July 19, and has racked up over 5,700 likes and 65 comments.
Another post by a Russia-linked account that found life after death was a video clip of Kanye West. Originally published by the account @black4black, the video—from a 2013 interview on Jimmy Kimmel Live—lives on through the Instagram account of a nonprofit organization that “supports & mentors young moms.”
“I refuse to follow those rules that society has set up and the way they control people with low self-esteem, with improper information, with branding, with marketing,” Kanye says in the clip.
Whether or not it was intended, the afterlife of the now-defunct accounts illustrates the lasting impact of the Kremlin’s misinformation war. A visually and politically compelling meme, whether true or false, perhaps targeted to just the right group of people, can go very far on sharing-centric platforms like Instagram. (Depending upon who’s looking at it, the angrier the idea the better, and sometimes, the weirder the better too.)
Once it’s been shared widely and begun to circulate, the post slips into the already turbulent alphabet soup of American social media, followed by a steady stream of other memes from the same account. If it’s effective, the fake stuff not only stirs the pot, it also blends in, making it even harder to tell what’s real and what’s not.
Investigators looking for data on foreign misinformation campaigns on domestic social media platforms are still trying to determine what’s real. Even as social media companies face Congressional investigations and the prospect of federal regulation, the full extent of the Russian campaign on Instagram and Facebook remains a mystery locked up within Facebook’s servers. This makes it difficult to fully understand how Russian digital propaganda worked in the 2016 election.
But based on the data Albright has collected, the public disclosures and discussions remain significantly incomplete. Instagram, he concluded, “is a major distributor and re-distributor of IRA propaganda that’s at the very least on par with Twitter.”