Monday, 16 April 2018

How Russian Facebook Ads Divided and Targeted US Voters Before the 2016 Election

WHEN YOUNG MIE Kim began studying political ads on Facebook in August of 2016—while Hillary Clinton was still leading the polls—few people had ever heard of the Russian propaganda group known as the Internet Research Agency. Not even Facebook itself understood how the group was manipulating the platform's users to influence the election. For Kim, a professor of journalism at the University of Wisconsin-Madison, the goal was to document how the usual dark money groups target voters with divisive election ads online, the kind that would be more strictly regulated if they appeared on TV. Little did she know she was walking into a crime scene.
Over the last year and a half, mounting revelations about Russian trolls' influence campaign on Facebook have dramatically altered the scope and focus of Kim's work. In the course of her six-week study in 2016, Kim collected mounds of evidence about how the IRA and other suspicious groups sought to divide and target the US electorate in the days leading up to the election. Now, Kim is detailing those findings in a peer-reviewed paper published in the journal Political Communication. The researchers couldn't find any trace, in federal records or online, of half of the 228 groups they tracked that purchased Facebook ads about controversial political issues in that six-week stretch. Of those so-called "suspicious" advertisers, one in six turned out to be associated with the Internet Research Agency, according to the list of accounts Facebook eventually provided to Congress. What's more, the research shows these suspicious advertisers predominantly targeted voters in swing states like Wisconsin and Pennsylvania.
"I was shocked," says Kim, now a scholar in residence at the Campaign Legal Center, of the findings. "I sort of expected these dark money groups and other unknown actors would be on digital platforms, but the extent to which these unknown actors were running campaigns was a lot worse than I thought."

Suspicious Groups

To conduct her research, Kim solicited volunteers to install a custom-built ad-tracking app on their computers. Kim describes the software as similar to an ad-blocker, except it would send the ad to the research team's servers rather than block it. Kim whittled down the pool of volunteers to mirror the demographic, ideological, and geographic makeup of the US voting population at large. She ended up with 9,519 individuals, who together saw a total of 5 million paid ads on Facebook between September 28 and November 8, 2016.
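Conceptually, that design is an ad blocker whose blocking step has been replaced with an upload. The schematic Python sketch below illustrates the idea; the `on_ad_detected` hook, the endpoint URL, and the payload shape are entirely hypothetical, since the paper's actual implementation details aren't described here.

```python
import json
import urllib.request

# Hypothetical endpoint; the article does not name the real collection server.
COLLECTION_ENDPOINT = "https://example.org/collect"

def on_ad_detected(ad_html, metadata):
    """Hypothetical hook fired when the detector matches an ad element.

    An ad blocker would strip the element at this point; the research
    tool instead forwards a copy to the team's servers and leaves the
    page untouched.
    """
    payload = json.dumps({"ad": ad_html, "meta": metadata}).encode("utf-8")
    req = urllib.request.Request(
        COLLECTION_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # upload the ad; nothing on the page is blocked
```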
From that massive pool, Kim took a random sample of 50,000 ads, and conducted searches for any that touched on one of eight politically sensitive topics: abortion, LGBT issues, guns, immigration, nationalism, race, terrorism, and candidate scandals (for example, Donald Trump's Access Hollywood tape or Hillary Clinton's private email server). After throwing out ads placed by the candidates or super PACs, the researchers were left with 228 individual groups. Kim then returned to the larger pool of 5 million issue-based ads to find all of the ones associated with those groups.
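For illustration, a minimal sketch of that sampling-and-filtering step might look like the following. The ad record fields and the keyword lists are assumptions for the example, not Kim's actual coding scheme, and real topic classification would be considerably more careful than substring matching.

```python
import random

# Illustrative keyword lists; the study's real coding scheme is not public here.
TOPIC_KEYWORDS = {
    "abortion": ["abortion", "pro-life", "pro-choice"],
    "guns": ["gun", "2nd amendment", "firearm"],
    "immigration": ["immigration", "illegals", "border"],
    # ...plus LGBT issues, nationalism, race, terrorism, and candidate scandals
}

def sample_issue_ads(ads, sample_size=50_000, seed=0):
    """Draw a random sample of ads, then keep those touching any tracked topic.

    Each ad is assumed to be a dict with "text" and "advertiser" keys.
    """
    rng = random.Random(seed)
    sample = rng.sample(ads, min(sample_size, len(ads)))
    keywords = [kw for kws in TOPIC_KEYWORDS.values() for kw in kws]
    return [ad for ad in sample
            if any(kw in ad["text"].lower() for kw in keywords)]

def sponsoring_groups(issue_ads):
    """Collapse the matched ads to the set of distinct advertisers."""
    return {ad["advertiser"] for ad in issue_ads}
```

From a set of groups like the one `sponsoring_groups` returns, one can go back to the full ad pool and pull every ad those groups placed, which is the step the study describes next.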
In total, groups that had never filed a report with the Federal Election Commission placed four times as many ads as groups that had. The FEC has so far failed to enforce rules about political ad disclosures online, and only recently voted to expand those disclosure requirements. That has allowed digital political ads—including the ones affiliated with the Internet Research Agency—to proliferate with no regulatory oversight.
Kim's research showed that, in fact, these unregulated ads made up the majority of issue-based ads on Facebook during the course of her study. Facebook did not provide a comment before publication.
Among the groups that were not associated with any FEC records, Kim went on to differentiate between run-of-the-mill dark money groups (think: nonprofits and astroturf groups) and what she called "suspicious" groups. The latter had Facebook Pages or other landing pages that had been taken down or hadn't been active since election day, and no IRS record or online footprint to speak of. "Some groups, we were never able to track who they were," Kim says.
Of the 228 groups running divisive political ads, Kim classified 122 as suspicious. Then, in November of 2017, the House Intelligence Committee threw Kim a clue, releasing some of the Internet Research Agency ads Facebook had turned over. Kim ran the House's list against her own and found that one out of every six suspicious advertisers she had tracked was linked to the IRA.
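Mechanically, that cross-check amounts to intersecting two lists of advertiser names after normalizing them. A minimal sketch under that assumption follows; the function names are hypothetical, and the study's real matching procedure may have been more involved.

```python
def normalize(name):
    """Crude normalization so trivially different spellings still match."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

def ira_overlap(suspicious_advertisers, house_ira_accounts):
    """Return the tracked suspicious advertisers that appear on the House list."""
    ira_names = {normalize(n) for n in house_ira_accounts}
    return [a for a in suspicious_advertisers if normalize(a) in ira_names]

# With 122 suspicious groups tracked, a one-in-six match rate corresponds
# to roughly the 19 IRA-linked groups the article mentions below.
```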
Over the last few months, Kim says she's spent lots of weekends poring over these ads. "It was pretty depressing," she says. One ad shared by multiple suspicious groups read: "Veterans before illegals. 300,000 Veterans died waiting to be seen by the VA. Cost of healthcare for illegals 1.1 billion per year."

Swing States

The second part of Kim's research focused on who exactly these unregulated ads—including both standard dark money ads and Russian ads—targeted. She found that voters in Pennsylvania, Virginia, and Wisconsin, all states with tight races, were the most targeted. Specifically, voters in Wisconsin were targeted with gun ads about 72 percent more often than the national average. She also found that white voters received 87 percent of all immigration ads.
It makes sense that swing states would be more heavily targeted overall leading up to an election. And Kim didn't analyze the Russian trolls' targets independently from the other unregulated ads, given the small sample size of 19 groups.
The aspect of her research that bothered Kim the most is that some of these groups could have been stopped—or at least discouraged—by stricter campaign finance laws. For instance, 25 percent of all the ads contained a message that mentioned Trump or Clinton by name. If those ads had appeared on television during that same period, they'd be considered "electioneering communications," meaning they'd have to include a disclaimer about who paid for the ad and disclose the source of their funding to the FEC. Online, anything goes.
"I think the biggest issue here are the loopholes," Kim says. "There is no adequate law that addresses social media platforms."
Kim called Facebook's recently announced plans to begin requiring disclosures and disclaimers on all political ads, including issue-based ads, a "step in the right direction." She does, however, see some flaws in Facebook's plans. The company has said it will begin requiring both political advertisers and the people running large Facebook Pages to authenticate their identities by providing a mailing address and a government-issued form of identification. But Kim notes that many of the Pages in her research were not large at all. Instead, they appeared to be small Pages, linked to other small Pages, all of which ran identical ads.
In one case, four separate "suspicious" pro-Trump pages all ran the same ad that read, “Support 2nd Amendment? Click LIKE to tell Hillary to Keep Her Hands Off Your Guns.” The next phase of Kim's research will focus on analyzing those networks.
Ultimately, though, Kim's work suggests a sort of inevitability about the Internet Research Agency's actions, given the United States' lax campaign finance laws. It also shows that while the Agency's ads were divisive and at times despicable, there were other dark money groups on Facebook spreading similar messages, and far more of them. And they were doing it in a way that, for now at least, is totally legal. It raises a crucial question about political divisiveness in America: Who's the bigger threat? Russian trolls or ourselves?
