One member of the panel said the Facebook users involved had not given sufficient consent to allow the research to be conducted. Photograph: Dado Ruvic/Reuters

Cambridge University rejected Facebook study over 'deceptive' privacy standards


Exclusive: panel told researcher Aleksandr Kogan that Facebook’s approach fell ‘far below ethical expectations’

A Cambridge University ethics panel rejected research by the academic at the centre of the Facebook data harvesting scandal over the social network’s “deceptive” approach to its users’ privacy, newly released documents reveal.

A 2015 proposal by Aleksandr Kogan, a member of the university’s psychology department, involved the personal data of 250,000 Facebook users and their 54 million friends, which he had already gleaned via a personality quiz app in a commercial project funded by SCL, the parent company of Cambridge Analytica.

Separately, Kogan proposed an academic investigation into how Facebook likes are linked to “personality traits, socioeconomic status and physical environments”, according to an ethics application about the project released to the Guardian in response to a freedom of information request.

The documents shed new light on suggestions from the Facebook CEO, Mark Zuckerberg, that the university’s controls on research did not meet Facebook’s own standards. In testimony to the US Congress earlier this month, Zuckerberg said he was concerned over Cambridge’s approach, telling a hearing: “What we do need to understand is whether there is something bad going on at Cambridge University overall, that will require a stronger action from us.”

But in the newly published material, the university’s psychology research ethics committee says it found the Kogan proposal so “worrisome” that it took the “very rare” decision to reject the project.

The panel said Facebook’s approach to consent “falls far below the ethical expectations of the university”.


Correspondence around the decision was released hours before Kogan appeared before a Commons inquiry into fake news and misinformation. In written and oral evidence to the committee, Kogan insisted that all his academic work was reviewed and approved by the university. But he did not mention to the MPs the ethics committee’s rejection of his proposed research using the Facebook data in May 2015.

Explaining the decision, one member of the panel said the Facebook users involved had not given sufficient consent to allow the research to be conducted, nor been given a chance to withdraw from the project. The academic, whose name was redacted from the document, said: “Facebook’s privacy policy is not sufficient to address my concerns.”

Appealing against the panel’s rejection, a letter believed to be written by Kogan pointed out that “users’ social network data is already downloaded and used without their direct consent by thousands of companies who develop apps for Facebook”.

It added: “In fact, access to data by third parties for various purposes is fundamental to every app on Facebook; so users have already had their data downloaded and used by companies for private interest.”

Another panel member felt that information shared with Facebook friends should not be regarded as public data. In a response to Kogan’s appeal, the academic said: “Once you have persuaded someone to run your Facebook app, you can really only collect that subject’s data. What his or her friends have disclosed is by default disclosed to ‘friends’ only, that is, with an expectation of confidence.”

The ethics panel member added: “Facebook is rather deceptive on this and creates the appearance of a cosy and confidential peer group environment, as a means of gulling users into disclosing private information that they then sell to advertisers, but this doesn’t make it right to an ethical researcher to follow their lead.”

The academic also likened Facebook to a contagion. The letter sent in July 2015 said: “My view of Facebook is that it’s a bit like an infectious disease; you end up catching what your friends have. If there are bad apps, or malware, or even dodgy marketing offers, they get passed along through friendship networks. An ethical approach to using social networking data has to take account of this.”

Kogan accepted that he made mistakes in how the Facebook data was collected. He told Tuesday’s committee hearing: “Fundamentally I made a mistake, by not being critical about this. I should have got better advice on what is and isn’t appropriate.”

But asked if he accepted that he broke Facebook’s terms and conditions, Kogan said: “I do not. I would agree that my actions were inconsistent with the language of these documents, but that’s slightly different.”

Kogan collected Facebook data before the network changed its terms of service in 2014 to stop developers harvesting data via apps.

Facebook has banned Kogan from the network and insisted that he violated its platform policy by transferring data his app collected to Cambridge Analytica. The company has previously said it is “strongly committed to protecting people’s information”.

In an initial statement on Kogan’s research, Mark Zuckerberg said Facebook had already taken key steps to secure users’ data and said it would go further to prevent abuse. He added: “This was a breach of trust between Kogan, Cambridge Analytica and Facebook. But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it.”

Kogan said the ethics committee raised a “number of issues” when it rejected his proposal. In an email to the Guardian he added: “We were in the process of resubmitting a new application to the university having addressed these issues – a process that was ultimately stopped once Facebook asked us to delete all of the data.”
