Illustration: Lehel Kovacs

These new rules were meant to protect our privacy. They don’t work

Stephanie Hare
The data protection laws introduced last year are failing us – and our children

Who owns your data? This is one of the toughest questions facing governments, companies and regulators today, and no one has answered it to anyone’s satisfaction. That is not what we were promised last year, when the European Union’s General Data Protection Regulation, commonly known as the GDPR, came into effect.

The GDPR was billed as the gold standard of data protection, offering the strongest data rights in the world. It has forced companies everywhere to modify their operating models, often at great cost. It inspired the state of California to pass a similar law, and where California leads, the rest of the US often follows; there have been calls for a federal version of the GDPR.

Yet for those of us living under the GDPR, what has really changed?

Before it came into effect last year, we faced an onslaught of emails from organisations asking if we were happy to continue a relationship most of us never knew we were in, or if we wanted them to delete our data and unsubscribe us from their data gathering.

While it was an opportunity for a digital spring clean, informing people that their data is being collected is not the same as preventing it from being collected in the first place. That continues and is even increasing. The only difference is that now we are forced to participate in our own privacy violation in a grotesque game of “consent”.

Most websites nudge us into clicking “I consent” by making it harder for us not to. Those that do offer an “I do not consent” option force us to navigate a complicated menu of privacy settings, all of which offer only the veneer of privacy.

They know that no one has the time or the inclination to do this for every website and they are betting that most of us will choose convenience over data protection. And so we click “I consent” to cookies and other web trackers that follow us around, creating an ever-growing digital self that is monitored, used, bought and sold.

Under the GDPR, we gained the right to find out what data is held on us and to request its deletion. Again, this puts the onus on us, not the companies or the government, to do the work. Again, most of us don’t. Yet the GDPR could have solved this easily by making privacy the default and requiring us to opt in if we want to have our data collected. But this would hurt the ability of governments and companies to know about us and predict and manipulate our behaviour, as Shoshana Zuboff demonstrated powerfully in her book, The Age of Surveillance Capitalism.

It grows harder to shrug this off when our own parliamentary joint committee on human rights (JCHR) warned last week that data is already being used to discriminate in housing and job ads online. It notes that it is “difficult, if not nearly impossible, for people – even tech experts – to find out who their data has been shared with, to stop it being shared or to delete inaccurate information about themselves”. And the JCHR says that it is “completely inappropriate to use consent when processing children’s data”, noting that children aged 13 and older are, under the current legal framework, considered old enough to consent to their data being used.

The GDPR was supposed to prevent all of this. It is failing us. And it is failing our children.

Nor is the GDPR stopping the construction of a surveillance society – in fact, it may even legalise it. The collection of biometric data, such as that captured by facial recognition technology, is prohibited under the GDPR unless citizens give their explicit consent. Yet there are exceptions when it is deemed to be in the public interest, such as fighting crime.

This is how an exception becomes the rule. After all, who doesn’t want to fight crime? And since the security services and police can use it, many companies and property owners use it too.

Amid signs of a growing backlash, the GDPR offers little help and even less consistency. In August, Sweden’s data regulator fined a high school for using facial recognition to register student attendance, but did not rule it illegal. France’s regulator ruled last month that it is illegal to use facial recognition in secondary schools, but it has not challenged the government’s plan to use facial recognition for a compulsory national digital identity programme. A UK court upheld the use of facial recognition by South Wales police this autumn, but the main data regulator, the Information Commissioner’s Office (ICO), warned last month that this should not be taken as a blanket permission for the police to use the technology.

Meanwhile, a bill has been introduced in the House of Lords calling for a moratorium on the automated use of facial recognition, something the science and technology committee in the House of Commons called for in July. Even the European commission admits that the GDPR is failing to protect us from a surveillance society, which is why it too is planning regulation on facial recognition technology as part of its new strategy for artificial intelligence.

This change of course cannot come fast enough. But it must go much further. The next generation of wireless telecommunications infrastructure, known as 5G, is beginning to turn the promise of the internet of things into a reality. It will turn our wearable devices, homes, cars, workplaces, schools and cities into a never-ending stream of connected data. Advances in computing power and AI will allow those who hold our data to do much more with it – and so with us.

Yet even as the question of who owns our data becomes more urgent, ownership may not be the best way to think about what is really a question of how to protect our civil liberties in the age of AI.

In Permanent Record, Edward Snowden explains that it was his close study of the US constitution, specifically the Bill of Rights, which persuaded him that Americans’ civil liberties were being violated by the US government’s mass surveillance activities, which were carried out with and without the active participation of US technology companies. And even though non-US citizens are not protected by the Bill of Rights, Snowden believed that the US government was violating their human rights. This is what drove him to blow the whistle in 2013.

Last week, Snowden said that the GDPR is “a good first effort… but it’s not a solution”. He thinks that legislation should address the collection of our data, not its protection after it is collected. To do that, we will need to overhaul our approach. The GDPR protects data. To protect people, we need a bill of rights, one that protects our civil liberties in the age of AI.

Stephanie Hare is an independent researcher and broadcaster
