‘Most Britons – 70% – now carry around with them smartphones which record and report their location, their friends and their interests all the time.’ Photograph: Alamy Stock Photo

The Guardian view on internet privacy: it’s the psychology, stupid


The ease with which giant databases can be queried and cross-referenced makes privacy vanish on the internet

Privacy is necessary for human society to function. The problem is not that information exists but that it reaches the wrong people. Information gathered on the internet could bring great benefits to society, and to individuals, when huge datasets are refined to yield knowledge otherwise unavailable. But once the information is gathered, a precautionary principle has to apply. It is too much of a stretch to agree with John Perry Barlow, the internet rights pioneer who died this week, when he quipped that “relying on the government to protect your privacy is like asking a peeping tom to install your window blinds”; but it does not help when it appears that everything the public sector does with the huge datasets it holds will be overseen by the minister for fun.

Governments need to keep our trust; but technology erodes privacy in two ways. The first is simply smartphones. Most Britons – 70% – now carry around with them devices which record and report their location, their friends and their interests all the time. The second is the ease with which two (or more) datasets can be combined to bring out secrets that are apparent in neither set on its own, and to identify individuals from data that appears to be entirely anonymised. By the beginning of this century researchers had established that nearly 90% of the US population could be uniquely identified simply by combining their gender, their date of birth and their postal code. All kinds of things can be reliably inferred from freely available data: four likes on Facebook are usually enough to reveal a person’s sexual orientation.
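The mechanism behind such linkage attacks is simple enough to sketch. The following is a minimal, hypothetical Python illustration (every name, record and dataset is invented): an “anonymised” health dataset and a public register each look harmless on their own, but joining them on the shared quasi-identifiers of gender, date of birth and postcode re-identifies the individuals.

```python
# Hypothetical linkage attack: neither dataset pairs a person's name with
# their condition, but joining them on quasi-identifiers does.

# "Anonymised" health records: no names, only quasi-identifiers + condition.
health_records = [
    {"gender": "F", "dob": "1984-03-07", "postcode": "SW1A 1AA", "condition": "asthma"},
    {"gender": "M", "dob": "1990-11-22", "postcode": "M1 2AB", "condition": "diabetes"},
]

# Public register (e.g. an electoral roll): names plus the same quasi-identifiers.
public_register = [
    {"name": "Alice Example", "gender": "F", "dob": "1984-03-07", "postcode": "SW1A 1AA"},
    {"name": "Bob Example", "gender": "M", "dob": "1990-11-22", "postcode": "M1 2AB"},
]

def quasi_id(record):
    """The combination that is unique for most of the population."""
    return (record["gender"], record["dob"], record["postcode"])

# Index the register by quasi-identifier, then re-identify each health record.
by_quasi_id = {quasi_id(r): r["name"] for r in public_register}
for record in health_records:
    name = by_quasi_id.get(quasi_id(record))
    if name:
        print(f"{name} -> {record['condition']}")
```

Neither file contains the damaging fact on its own; the join creates it.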

Underlying such problems is human psychology. No one forces anybody to reveal their preferences on Facebook: the like button is genuinely popular. The latest spectacular breach of privacy came when the exercise app Strava published a global map of the 3 trillion data points its users had uploaded, which turned out to reveal the location of hitherto secret US military bases around the world. But the chance to boast about where you have been and how fast you were moving is exactly what makes Strava popular. Psychology, as much as technology, made this a massive security breach. The users gave enthusiastic consent, but it was fantastically ill-informed. Then again, how could anyone give informed consent when not even the firms that collect the data can know how it will be used?
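The Strava leak followed the same logic of aggregation. A rough, hypothetical sketch (with invented coordinates) of how a heatmap betrays a location: each upload is innocuous, but binning every point onto a grid makes a regularly run perimeter loop stand out in an otherwise empty region.

```python
# Hypothetical sketch of why an aggregate heatmap leaks locations: individual
# uploads seem harmless, but counting all points per grid cell makes any
# repeatedly used route stand out, even where no facility is publicly mapped.
from collections import Counter

# Fictional GPS points (latitude, longitude) uploaded by many users.
uploads = [
    (34.5021, 69.1702), (34.5023, 69.1699), (34.5020, 69.1701),  # same perimeter loop
    (51.5074, -0.1278),                                          # a one-off city run
]

def grid_cell(lat, lon, cell_size=0.001):
    """Snap a point to a coarse grid cell, as a heatmap would."""
    return (round(lat / cell_size), round(lon / cell_size))

heat = Counter(grid_cell(lat, lon) for lat, lon in uploads)
for cell, count in heat.most_common():
    if count >= 3:  # hotspots: repeated activity concentrated in one small area
        print("hotspot at grid cell", cell, "with", count, "points")
```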

The protection of private data from malevolent hackers is a technical arms race that cannot be abandoned. But the protection of privacy from inadvertent disclosure is primarily a social or psychological problem. The solution cannot just be informed consent from the data providers, because in most situations no one has the information needed to give it. What’s needed instead is a change of attitude among those who harvest and process the data. They need constantly to ask themselves – or to be asked by society – how this information could be used for harm, and how to prevent that from happening.
