Cambridge Analytica and the Perils of Psychographics

Allegations made by Christopher Wylie, the former Cambridge Analytica employee, have reignited the debate about whether psychographic targeting can actually influence people’s behavior.

In September, 2016, Alexander Nix, the C.E.O. of Cambridge Analytica, the data and messaging company that was working at the time with Donald Trump’s supposedly flagging Presidential campaign, explained his firm’s work like this: “If you know the personality of the people you’re targeting, you can nuance your messaging to resonate more effectively with those key audience groups.” The fancy term for this is psychographic targeting. A few weeks later, Trump won the Presidency, against all odds and predictions, sending political operatives and journalists scrambling for explanations. “There was a huge demand internally for people to see how we did it,” Brittany Kaiser, Cambridge Analytica’s former business-development director, told the Guardian last Friday. “Everyone wanted to know: past clients, future clients.”

Whether Cambridge Analytica’s targeting work actually swayed the outcome of the election has been a subject of debate since then—because the firm’s record is spotty, psychographic targeting in political campaigns is a relatively new concept, and it has not yet been definitively shown that C.A. successfully used these methods on behalf of Trump’s campaign.

Christopher Wylie, the former C.A. employee who recently came forward to detail how the company improperly acquired personal data from fifty million Facebook users, has said that the company used that data to create a “psychological warfare mindfuck tool.” But Aleksandr Kogan, the Cambridge University researcher who provided the company with the Facebook data, has described it as “not that accurate at the individual level.” Kogan’s conclusion tracks with research that has been done by the U.K.-based Online Privacy Foundation, whose research director, Chris Sumner, recently told me that psychographics are much more accurate for groups than for individual people.

Earlier this week, Chris Vickery, the director of cyber-risk research at the cyber-security firm UpGuard, announced on Twitter that he had found the code used by Cambridge Analytica for its election work. It was located in a publicly accessible online repository maintained by an employee of a small software-development firm in British Columbia called AggregateIQ (A.I.Q.) that was under contract with the S.C.L. Group, the parent company of Cambridge Analytica. This code provided a look at the internal mechanics of what the company had been doing.

The software uncovered by Vickery appears to have been first created in 2014, when Cambridge Analytica was working on a number of midterm election campaigns, and used at least until the period in the 2016 cycle when the company was working for Senator Ted Cruz during his Presidential primary campaign. According to Vickery’s initial report, the A.I.Q. repository includes “a set of sophisticated applications, data management programs, advertising trackers, and information databases that collectively could be used to target and influence individuals through a variety of methods, including automated phone calls, emails, political websites, volunteer canvassing, and Facebook ads.” Its core instrument, which its authors called the “Database of Truth,” was designed to gather and integrate voter-registration data, consumer data, polling data, and data “from third-parties”—it is possible that the fifty million profiles of unwitting Facebook users that Cambridge Analytica acquired would fall under this final, vague category. Vickery is currently combing thousands of pages of code to see how, or if, it might have used people’s psychological touch points.
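
None of this reveals, on its own, how the “Database of Truth” was actually built, but the basic pattern Vickery describes—pulling voter-registration, consumer, and polling records together around a shared identifier—is a familiar one. The sketch below is purely illustrative: the files, field names, and records are invented for the example, and it is not drawn from the A.I.Q. code.

```python
# Hypothetical illustration of integrating voter data sources into one profile table.
# All field names and records are invented; this is not the AggregateIQ code.
import pandas as pd

voter_file = pd.DataFrame({
    "voter_id": [101, 102, 103],
    "name": ["A. Smith", "B. Jones", "C. Lee"],
    "party": ["R", "D", None],
})
consumer_data = pd.DataFrame({
    "voter_id": [101, 103],
    "magazine_subscriptions": [2, 0],
    "owns_firearm": [True, False],
})
polling_data = pd.DataFrame({
    "voter_id": [102, 103],
    "top_issue": ["economy", "immigration"],
})

# Left-join the auxiliary sources onto the voter file, so that every registered
# voter ends up with one row combining whatever is known about them.
profiles = (
    voter_file
    .merge(consumer_data, on="voter_id", how="left")
    .merge(polling_data, on="voter_id", how="left")
)
print(profiles)
```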

While Vickery looks for proof of C.A.’s psychographic work, Kimberly Foxx, the state’s attorney of Cook County, Illinois, has filed a civil suit against both Cambridge Analytica and Facebook for deceptive business practices, claiming, among other things, that “armed with swaths of misappropriated data, Cambridge Analytica created ‘psychographic profiles’ on every American adult, which it claims helped it have significant influence on the outcome of the 2016 presidential election.” The suit takes for granted that C.A. engaged in psychographic targeting, and argues that psychographic methods bypass “individuals’ cognitive defenses by appealing directly to their emotions, using increasingly segmented and sub-grouped personality type designation and precisely targeted messaging based on those designations.”

The question remains whether this kind of targeting can actually influence people’s behavior. Last summer, at the DefCon hacking conference in Las Vegas, Sumner, who has been studying variants of this question for the last seven years through his work at the Online Privacy Foundation, presented the group’s latest research.

Sumner and his research partner, Matthew Shearing, used survey questions—along with a process, similar to one developed at Cambridge University, that enables social scientists to find subjects based on Facebook’s understanding of their psychological makeup—to evaluate 2,412 people’s propensity for a single underlying psychological tendency, in this case authoritarianism. They then created advertisements that either advocated or opposed state-sanctioned mass surveillance. An ad that read, “Terrorists—Don’t let them hide online. Say yes to mass surveillance,” with a picture of a mangled, bombed-out building, appealed to people with higher authoritarian tendencies. An ad that said, “Do you really have nothing to fear if you have nothing to hide? Say no to mass surveillance,” alongside a photo of Anne Frank, appealed to those on the low end of the authoritarian spectrum.

Then Sumner and Shearing flipped the script. Respondents with an authoritarian bent reacted positively to an ad with the words “They fought for your freedom—Don’t give it away. Say no to surveillance” superimposed over a photo of a D-Day landing. And those who would otherwise be described as anti-authoritarian were swayed to support surveillance by an ad that listed a host of bad things, including human trafficking, cyber crime, terrorism, and money laundering, with the words “Crime doesn’t stop where the Internet starts. Say yes to surveillance.” By rewording the ads to appeal to the respondents’ underlying psychological disposition, the researchers were able to influence and change their opinions. According to Sumner, “Using psychographic targeting, we reached Facebook audiences with significantly different views on surveillance and demonstrated how targeting . . . affected return on marketing investment.” Psychological messaging, they said, worked.
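
In rough outline, the targeting logic the researchers describe can be sketched in a few lines: score each respondent on a single trait, then serve the framing that matches the score. The scale, threshold, and paraphrased ad copy below are invented stand-ins, not the Online Privacy Foundation’s actual materials.

```python
# Rough, hypothetical sketch of trait-based ad selection; the scoring scale,
# threshold, and ad text are invented stand-ins, not the researchers' materials.
from statistics import mean

def authoritarianism_score(answers: list[int]) -> float:
    """Collapse 1-5 survey answers into a 0-1 trait score (higher = more authoritarian)."""
    return (mean(answers) - 1) / 4

def choose_ad(answers: list[int], advocate_surveillance: bool, threshold: float = 0.5) -> str:
    """Serve the framing that matches the respondent's disposition.

    High scorers get threat or patriotism framing; low scorers get
    privacy or harm-reduction framing, whichever side the ad argues.
    """
    score = authoritarianism_score(answers)
    if advocate_surveillance:
        if score >= threshold:
            return "Terrorists: don't let them hide online. Say yes to mass surveillance."
        return "Crime doesn't stop where the Internet starts. Say yes to surveillance."
    if score >= threshold:
        return "They fought for your freedom. Don't give it away. Say no to surveillance."
    return "Do you really have nothing to fear if you have nothing to hide? Say no."

print(choose_ad([5, 4, 5], advocate_surveillance=True))   # threat framing
print(choose_ad([1, 2, 2], advocate_surveillance=False))  # privacy framing
```

The point of the experiment was not the code but the pairing: the same position, argued two ways, lands differently depending on where a person falls along a single psychological dimension.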

Facebook has come to this conclusion, too. As Cook County’s lawsuit points out, Facebook has undertaken a number of research projects—without the consent of its users—aimed at understanding how the platform might be used to influence user behavior. Most famously, there was its mood-control experiment, published in Proceedings of the National Academy of Sciences, in 2014, in which the company manipulated its news feed so that seven hundred thousand of its users saw primarily positive or primarily negative content. The goal was to find out if “emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.” The company found that it could. Even more germane, in 2010, Facebook successfully showed that it could influence voter turnout. These were not purely academic inquiries. The Cook County lawsuit also points out that Facebook is more valuable to both its business and political clients if it can demonstrate that it can be used “to manipulate its users into making decisions that they want them to make.”

The lawsuit seeks damages on behalf of the residents of Illinois, and if the court accepts its demand for a jury trial, discovery promises the possibility of exposing truths belied by promotional material and obscured in software. It may seem quixotic to sue a slippery outfit like Cambridge Analytica or a behemoth like Facebook. But it is no more quixotic than a handful of computer programmers endeavoring to upend liberal democracy with nothing more than strings of zeroes and ones.

This article has been corrected to clarify that Sumner and Shearing did not use a tool created by Facebook itself in their research.