Ever feel like you are being watched? A surveillance expert studies images from Edinburgh city council’s CCTV cameras. Photograph: David Moir/Reuters

When local councils use data tools to classify us, what price freedom?

Kenan Malik
From predictions of crime to identifying endangered children, we are in thrall to tech

Are you a “metro high-flyer” or part of an “alpha family”? A “midlife renter” or a “cafes and catchments” sort? An “estate veteran” or a “bus-route renter”? You may not know, but if you live in Nottingham or Kent, your local council certainly does. And if you’re from Durham, so does the bobby on the beat.

These labels are part of 66 classifications of Britons devised for Mosaic, a system created by the credit score company Experian. Mosaic, according to Experian, is constructed out of “850m pieces of information” and allows you to “peer inside… all the types of household” in any town or village, “with their life-stages, marital status, household compositions and financial positions”.

Mosaic is designed as a marketing tool for private companies, to help “identify the consumers most responsive to different direct marketing channels”. It’s become a tool for local councils, too. According to a study from Cardiff University’s Data Justice Lab published last month, Kent and Nottinghamshire county councils and Blaby district council use Mosaic. At least 53 local authorities are purchasing data systems from private companies to help classify citizens and predict future outcomes, on everything from which child might be in danger of abuse to who might be committing benefit fraud.

Police forces, too, are increasingly turning to data mining and algorithms “to predict where and when crimes will happen – and even who will commit them”, according to a report from the civil rights group Liberty. Durham police, for instance, use a programme called the Harm Assessment Risk Tool to estimate the likelihood that a particular individual will commit a crime over the next two years. The algorithm bases its predictions on 34 pieces of data, supplemented with information about the “demographic characteristics” of the local population from Experian’s Mosaic database.

Predictive policing techniques, Liberty observes, reproduce and magnify “the same patterns of discrimination that policing has historically reflected”. Yet the fact that decision-making is now filtered through algorithms lends “unwarranted legitimacy to biased policing strategies”.

One reason for public authorities turning to such techniques is the impact of public sector funding cuts. Data analytics, as the Cardiff researchers note, is increasingly sold as a means of saving money and targeting resources more effectively. The new data techniques speak also to a desire to impose bureaucratic rationality on our messy lives. Just as at the beginning of the last century Henry Ford rationalised work through the creation of a new form of production line, now data evangelists seek to iron out the vagaries of our lives by reducing them to data chunks.

Data evangelism surfs a number of recent social developments. More atomised societies smooth the way to viewing individuals as collections of data. The contemporary obsession with identity makes it easier to view every citizen as existing in a Mosaic-like category.

And our desire for “frictionless” lives has led us to stumble, almost without realising, into a new kind of surveillance society. We all worry about privacy and are outraged when Facebook or the NHS suffers data leaks. And yet we constantly trade away our data without thinking about it. From “smart doorbells” that link to “databases of suspicious persons” to genealogy companies that we trust with our DNA, opening their databases for police inspection, we are creating a world in which surveillance seems as inescapable as gossip about Love Island or another Donald Trump Twitter rant. A surveillance state created not just by government fiat, as in China, but also by our own absence of mind and thought.

In his 1975 book Discipline and Punish, the French philosopher Michel Foucault reworked the idea of the panopticon, first suggested by the 18th-century English philosopher Jeremy Bentham. It was, for Bentham, an institution in which every inhabitant could be watched by a single observer in a central tower. Every individual had to assume that they were being watched at any one time and so had to be on their best behaviour constantly. Rather than violent discipline, Bentham argued, continual observation could act as a mechanism of social control, not just in prisons but hospitals, schools and factories too.

For Foucault, all social institutions took the form of a panopticon. Modern society, he wrote, requires for its functioning “permanent, exhaustive, omnipresent surveillance, capable of making all visible, as long as it could itself remain invisible”, a system resting on “thousands of eyes posted everywhere”. He might have been talking of the contemporary surveillance state.

Kenan Malik is an Observer columnist
