In December 2014, John Rust wrote to the head of the legal department at the University of Cambridge, where he is a professor, warning them that a storm was brewing.
According to an email reviewed by WIRED, Rust informed the university that one of the school’s psychology professors, Aleksandr Kogan, was using an app he created to collect data on millions of Facebook users without their knowledge. Not only did the app collect data on people who opted into it, it also collected data on those users’ Facebook friends. He wrote that if just 100,000 people opted into the app, and if they had an average of 150 friends each, Kogan would have access to 15 million people’s data, which he could then use for the purposes of political persuasion. Journalists had already begun poking around, and Rust wanted the school to intervene, arguing Kogan’s work put the university at risk of “considerable media attention, almost entirely adverse.”
“Their intention is to extend this to the entire US population and use it within an election campaign,” Rust wrote of Kogan and his client, a little-known political consulting firm that went on to be called Cambridge Analytica. He predicted, “I simply can’t see this one going away.”
Six months later, Donald Trump announced his candidacy for President of the United States and launched a campaign that depended, in part, on Cambridge Analytica’s work. His shocking election victory in 2016 thrust the firm into the spotlight, earning the company contracts with major commercial clients around the world. But more than a year after it helped get Trump into the White House, news broke that Cambridge Analytica had hired Kogan to harvest the data of tens of millions of American Facebook users without their consent, stoking international outrage among people who felt their privacy had been violated.
As director of the university’s Psychometrics Centre, which researches and develops psychological tests, Rust knew better than most how Facebook data can be manipulated. It was researchers in his own lab who first discovered that Facebook likes could be used to deduce all sorts of sensitive information about people’s personalities and political persuasions. But he says the goal of that research—and the goal of his 40 years in the field—was to warn the world about what can be done with this data and the dangers of allowing it to be so freely traded.
Years later, Rust takes no joy in being proven right. “We could see even four years ago the potential damage it would do, and there was nothing we seemed to be able to do to stop it,” he says today.
Facebook now acknowledges that Kogan collected the Facebook data of up to 87 million users, most of them Americans, and sold it to Cambridge Analytica. But as CEO Mark Zuckerberg and his team attempt to clean up the mess, Rust is hardly being hailed as some digital Paul Revere. Instead, his entire department and indeed his entire legacy have been swept up with both Kogan and Cambridge Analytica, accused by Zuckerberg himself of committing the very violations that Rust tried to warn against. “Our number one goal is to protect people’s data first and foremost,” says Ime Archibong, Facebook’s director of product partnerships. “We have an opportunity to do better.”
Since this spring, when news of the scandal broke, Facebook has cut off several apps used in the Psychometrics Centre’s work, and in his testimony before Congress earlier this year, Zuckerberg suggested that “something bad” might be going on within the department that required further investigation from Facebook. In written responses submitted to Congress last week, Facebook mentions the Psychometrics Centre 16 times, always in conjunction with Kogan, who briefly collaborated with the researchers there.
Now Rust and others await the results of Facebook’s investigation, which is itself on hold until UK regulators finish their own probe. And yet the Centre’s reputation already seems inextricably bound to the fallout of the Cambridge Analytica scandal. Rust fears the condemnations from Facebook have not only tainted the department’s legacy but also brought a key area of research to a halt at a time when, he insists, it’s needed most.
Rust believes the science of psychometrics was born to be abused. At its most basic, it is the science of measuring people’s mental and psychological traits, strengths, and weaknesses. It forms the basis of the SAT and IQ tests, but it’s also been used for all manner of dark and disturbing ends, including eugenics.
“It has a long history of being a science where people say, ‘Gee, that’s amazing. It will change the world.’ And it does, but it doesn’t always change the world in the way people want it,” Rust says. He is sitting near the almost empty row of computers that makes up the tiny Psychometrics Centre, which is modestly demarcated by a slender sign resting on a cabinet, with a finger puppet of Sigmund Freud and his couch propped up against it.
Early on in his career studying psychology, Rust saw how IQ tests and other aptitude tests were being used to justify discrimination against people of different races, locking them out of academic and professional opportunities. One of his PhD professors, Hans Eysenck, was a prominent proponent of the theory that people of different races were genetically predisposed to have different IQs.
“There I am stuck in a field, which was shifting increasingly to the right, and I felt there was an obligation to show their approach was wrong,” says Rust, who describes himself as an anarchist in his younger years. “Most people would have just given up the field. I didn’t. We had to address all of these issues.”
Rust launched the Psychometrics Centre at the City University of London in 1989, where he initially focused on developing an intelligence test for children. In 2005, he moved the Centre over to the University of Cambridge. But it wasn’t until 2012, and the arrival of an academic named David Stillwell, that the Centre’s work shifted to social media. While most personality tests are administered by schools and businesses that never show participants their results, Stillwell had developed an app that let people take personality tests on their own and get their results. They could also opt to share the results with the researchers.
The app, called myPersonality, also plugged into Facebook and asked participants to opt in a second time if they wanted to share data from their Facebook profiles. It only collected data on the people who opted in, not their friends, and included a disclaimer saying the information could be “stored and used for business purposes, and also disclosed to third parties, for example (but not limited to) research institutions, in an anonymous manner.” MyPersonality went viral, amassing data on 6 million participants between 2007 and 2012, about 30 to 40 percent of whom opted to share their Facebook data with the researchers, as well.
In March of 2013, Stillwell, a PhD student named Michal Kosinski, and a third researcher coauthored a now-famous paper showing that Facebook likes, even for seemingly benign topics like curly fries or thunderstorms, could be used to predict highly sensitive details about people, including their sexual orientation, ethnicity, and religious and political views. At the time, Facebook Page likes were still public, meaning anyone could collect information on everyone who liked a given Page on their own. The paper warned about how these predictions “could pose a threat to an individual’s well-being, freedom, or even life,” and concluded with a plea for companies like Facebook to give users “transparency and control over their information.”
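For a rough sense of what that kind of prediction looks like in practice, the sketch below builds a toy version of the idea in Python: a sparse user-by-Page matrix of likes, compressed with a truncated SVD and fed to a logistic regression. The data is entirely synthetic and the modeling choices (scipy, scikit-learn, the component count) are assumptions about the general recipe, not the researchers’ actual code.

```python
# A minimal, illustrative sketch of the likes-to-traits idea. Everything here is
# synthetic and assumed; it is not the Kosinski/Stillwell pipeline.
import numpy as np
from scipy.sparse import csr_matrix
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_users, n_pages = 2000, 1000

# Invent a hidden binary trait, then let it nudge which Pages a user likes:
# users with the trait like the first 100 Pages five times as often.
trait = rng.integers(0, 2, size=n_users)
like_prob = np.full((n_users, n_pages), 0.01)
like_prob[trait == 1, :100] = 0.05
likes = csr_matrix((rng.random((n_users, n_pages)) < like_prob).astype(np.float64))

# Compress the sparse user-by-Page like matrix, then fit a simple classifier.
components = TruncatedSVD(n_components=50, random_state=0).fit_transform(likes)
X_train, X_test, y_train, y_test = train_test_split(
    components, trait, test_size=0.25, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Held-out AUC: {auc:.2f}")  # well above 0.5: the likes leak the trait
```

Even on this toy data, the held-out score lands well above chance, which is the gist of the paper’s warning: a simple model can recover a trait no one ever disclosed, using nothing but likes.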
“It was scary. It still is,” Rust says of the revelation. “It showed communicating through cyberspace was completely different than writing a letter or having a telephone conversation. A digital footprint is like your avatar.”
He says he hoped the research would bring about a crucial conversation about what it really means to let algorithms run amok on massive data sets, a conversation that until then was happening largely behind closed doors in Silicon Valley. The paper, and the ones that followed, earned the two researchers, and the Psychometrics Centre, international attention. In 2013, the Centre began licensing the anonymized data set for other academics to use, leading to dozens of additional research papers. Those collaborators had to agree to terms that prohibited sharing the data, de-anonymizing the data, or using it for commercial purposes.
At the time, Facebook’s terms prohibited selling data or transferring it to data brokers. But it did allow app developers to share data for academic research under certain terms. Users needed to consent to their data being shared, for example. The developer also needed to ensure other researchers agreed to the terms before accessing it—you couldn’t just put data sets up on a website. Facebook’s terms are continually changing, and according to the company, developers are bound by the most current ones. That means the onus is on developers to ensure their apps are aligned with Facebook’s terms every time they change.
Rust says the researchers in his department believed they were complying with all of Facebook’s rules, and back then, at least, Facebook seemed to agree. In 2011, the company paid Stillwell’s way to a workshop on using Facebook data for research, and in 2015 a Facebook researcher invited Kosinski to present his findings at a conference in Long Beach, California. If there was anything wrong with the work they were doing, neither Facebook nor the researchers seemed aware of it.
Around 2012, Rust invited Kogan, a new professor working in the university’s psychology department, to meetings at the Psychometrics Centre. Kogan had established the Cambridge Prosociality and Well-Being Lab, which, according to its website, studied “the psychology of human kindness and well-being.”
“I thought this was a nice, hospitable thing to do to a new university lecturer,” Rust says of the invitation. He now regrets that decision.
Kogan became intimately familiar with the Psychometrics Centre’s data and its models. He was even an examiner on Kosinski’s dissertation. Then, in 2014, a year after Stillwell and Kosinski’s landmark paper was published, Kogan and his partner Joe Chancellor launched a firm called Global Science Research. Its client, SCL Elections, which would later become Cambridge Analytica, wanted Kogan to work with the Psychometrics Centre to amass Facebook data on the American electorate and use it to understand people’s personality types for the purpose of political advertising. But the relationship between Kogan, Stillwell, and Kosinski soon soured over contract negotiations that would have left the Psychometrics Centre with a much smaller cut of the budget than originally discussed. Stillwell and Kosinski ultimately declined to work with Kogan, and afterward, the university made Kogan sign a legal document saying he wouldn’t use any of the university’s resources—including its data—for his business.
“We were just watching in a state of, what’s going to happen next?” Rust says.
What happened next is the stuff of breaking news push alerts. Over the summer of 2014, Kogan and Chancellor recruited people to take personality quizzes through their own app called This Is Your Digital Life, thereby gaining access to their Facebook data, as well as the data on tens of millions of their friends. By the end of that summer, they had amassed 50 million records, 30 million of which they sold to Cambridge Analytica, despite Facebook’s prohibition on selling data. Kogan maintains he didn’t know he was violating Facebook’s policies, which he argues the company rarely enforced anyway.
As Rust heard reports about this work from PhD students working with Kogan, he says he grew increasingly concerned. Meanwhile, a reporter from The Guardian, who went on to break the story about Kogan’s methods in 2015, had begun poking around, asking Kogan, Stillwell, and Kosinski questions. According to emails reviewed by WIRED, the researchers worried their work would be lumped in with Kogan’s. It was in this environment at the end of 2014 that Rust decided to sound the alarm.
Last Thursday, Aleksandr Kogan walked into a Starbucks just south of Central Park, looking almost Zuckerbergian in his light blue t-shirt and jeans. He and his wife have been living in New York since November, a few months before, as he puts it, “one hell of a nuclear bomb” dropped into their lives. In March, The New York Times and The Guardian broke the story that made Kogan front-page news and led to him being banned from Facebook. The company has repeatedly cast Kogan as a singularly bad apple, while the armchair sleuths of the internet have used his Russian heritage and research ties to St. Petersburg University to accuse him of being a Russian spy. Now, as he waits for his contract at Cambridge to run out, he knows his career in academia is over.
“This has not worked out well for me, personally,” Kogan said loudly, unafraid of who might be listening. This is one of many reasons that he’d make a lousy spy, he added with a laugh.
Kogan has already testified in front of the UK Parliament, and on Tuesday, he’ll appear at a Senate hearing, too. When he does, he’ll have a different version of events to share than Rust. For starters, Kogan has claimed repeatedly that Stillwell and Kosinski’s methods for predicting people’s personalities and other traits weren’t actually all that effective. That argument is hard to square with the fact that Kogan sold these very methods to Cambridge Analytica. And yet, he’s not alone in making this claim. Other academics and political operatives familiar with Cambridge Analytica’s work have accused the company of selling snake oil.
Kogan also says Rust is writing a revisionist history of events, casting himself as a whistle-blower when, Kogan says, the Psychometrics Centre wanted in on the project up until contract negotiations fell through. “When they couldn’t get back on the project, they were like, ‘This is an ethics violation,’” Kogan says, pointing a finger sarcastically in the air. “Never has greed served someone so well.”
He concedes, though, that everyone would have been better off had they heeded Rust’s warning back then, and admits that, as he gobbled up this data, he was blind to the risk of public backlash. He’s sorry about the chaos he’s created. “If people are upset, then fuck yeah, we did something wrong,” he says.
But he insists he’s not the only one. The core problem, he argues, is not that “something bad” is happening at the Psychometrics Centre but, rather, that Facebook gave user data away to developers with minimal oversight for years. The company celebrated the work of Stillwell and Kosinski. It hired Chancellor, Kogan’s partner, for its research team and gave Kogan specially curated data sets for his own research. Now Facebook insists it was unaware that any of these academics may have been violating its policies.
“We had no understanding of the violations that were potentially happening,” says Facebook’s Archibong. “This is the reason we’re stepping up and investigating now.”
The University of Cambridge says it is also conducting its own investigation. “We are undertaking a wide-ranging review of all the available information around this case,” a spokesperson said. “Should anything emerge from this review, or from our request to Facebook, the university will take any action necessary in accordance with our policies and procedures.”
But for Kogan, all of this scapegoating of academics is a distraction. If Cambridge Analytica had collected the data itself, instead of buying it from Kogan, no one would have violated Facebook’s policies. And yet, tens of millions of people would still have had their data used for political purposes without their knowledge. That’s a much deeper problem that Facebook—and regulators—have to grapple with, Kogan says. On this point, at least, he and Rust see eye to eye.
Since this spring, Facebook has suspended just about every app the Centre ever touched. Archibong says the company will reinstate the apps if it finds no evidence of wrongdoing, but that may take a while. Facebook is waiting out an investigation by the UK Information Commissioner’s Office before it proceeds with its own audit. In the meantime, the company won’t comment on what policies the Psychometrics Centre’s apps may have violated, leaving the researchers in limbo.
“It’s just a PR exercise for them to say they’re doing something about it,” says Vesselin Popov, director of business development for the Psychometrics Centre.
In addition to myPersonality, Facebook suspended an app called YouAreWhatYouLike, developed in partnership with a company called CubeYou, which has also been banned from Facebook. (Facebook says CubeYou was suspended because of a “suspected violation independent of its ties to the Psychometrics Centre.”) That app showed people their personality predictions from Facebook likes, as well as predictions about their friends. According to CubeYou, the company never sold that data, but did get consent from users to store and share it anonymously, in accordance with Facebook’s terms at the time. Facebook also suspended a tool developed by the Centre called Apply Magic Sauce. It included both a consumer-facing app that let users take personality quizzes and an API that businesses could use to apply the Centre’s personality-profiling models to their own data sets. The Centre says it never sold that data, either, though it did make money by selling the API to businesses.
Facebook’s decision has radically reduced the Centre’s ability to conduct social media research at a time when, Popov argues, it’s crucial. “One of Facebook’s responses to this is we’ll set up an academic committee that we’ll fund and staff with senior academics we consider worthy,” Popov says. “That, for me, is a total farce. It’s the people causing the problem pretending they’re the ones fixing it.”
Of course, Facebook’s leaders might say the same thing about the researchers at the Psychometrics Centre. In May, The New Scientist reported that login credentials to millions of anonymized records collected by Stillwell and Kosinski had been uploaded to GitHub by an academic at another university.1 That’s despite the strict terms Stillwell and Kosinski had in place. The data was exposed for four years. Stillwell declined to comment for this story, but in a statement on the app’s website, he wrote, “In nine years of academic collaborations, this is the only such instance where something like this has occurred.” The breach shows there’s no guarantee that even well-meaning developers can keep Facebook data secure once it’s been shared.
If Rust accepts any blame, it’s that he didn’t foresee earlier that the research his department was conducting into the misuse of Facebook data might, in fact, inspire people to misuse Facebook data. Then again, even if he had, he’s not entirely sure that would have stopped him. “I suppose at Cambridge if you know the research you’re doing is groundbreaking, it can always be used for good or bad,” he says.
Rust says he is cooperating with the Information Commissioner’s Office’s investigation. The Information Commissioner, Elizabeth Denham, wouldn’t comment for this story beyond saying she is “considering the allegations” leveled against the Centre by Facebook. Rust, however, says he’s submitted emails and other documentation to Denham’s office and has tried to impress upon them the urgent need for regulatory oversight in the field of artificial intelligence.
“AI is actually a bit like a psychopath,” he says. It’s adept at manipulating emotions, but underdeveloped morally. “In a way, machines are a bit like that. They’re going through a stage of moral development, and we need to look at how moral development happens in AI.”
Of course, when Rust says “we,” he’s not talking about himself. He plans to retire next year, leaving the work of solving this problem to a department he hopes can survive the current turmoil. At age 74, he’s already seven years past retirement age, but still, leaving with things as they are isn’t easy. From behind his horn-rimmed glasses, his eyes look melancholy, and maybe even a little glassy, as he reflects on the legacy he’s leaving behind.
“You come into academia trying to solve the world’s problems and work out how the brain works,” he says, hands clasped over crossed legs. “Ten years into it you say, ‘Well I’ll just get my next grant for my next paper, because I want to be a lecturer or senior lecturer.’ It’s only when you come out the other end that you say, ‘Where’s my life gone?’”
He came into this field to start a conversation about why using data to sort and organize people could end up tearing them apart. As frustrating as it’s been to be cast as a villain by some of the most powerful people in the world, he’s thankful this long-awaited discussion around data privacy has finally begun.
“We’re at a point where it could go in so many different directions. It could be a big brother, Brave New World combination where a group of individuals can completely control and predict the behavior of every single individual. Or we have to develop some regulatory system that allows these newly created beings to evolve along with us,” he says. “If anything we’ve done has influenced that, it will have made it worthwhile.”
1. Update, 10:23 AM ET, 06/19/2018: This story has been updated to clarify that a subset of Stillwell and Kosinski’s data was exposed, not the entire database.