The Guardian view on data protection: a vital check on power | Editorial

The UK government plans to harmonise our data protection laws with the EU’s. This is both necessary and sensible

Data is knowledge and knowledge is power. That is why data protection matters in a democracy. The most recent government paper, a statement of intent, is not the detailed legislation that will be needed to harmonise British law with the EU’s General Data Protection Regulation (GDPR), which comes into force next spring, but it gives a clear view of what the government is trying to achieve. The overwhelming aim is to remain in step with the EU. So much of a modern economy depends on the frictionless movement of vast quantities of data across national borders that it is vital to harmonise with EU policy even if we can no longer help to shape it.

There are three different interests in data and privacy which have to be brought into balance: the individual, the companies which hold and process our data, and the state, which mediates between the two. There is a fundamental asymmetry between the individual and the other two: the personal data of any particular customer is, in isolation, worth very little to anyone else, but aggregation and refinement give that data a huge new value. It should be the aim of policy to ensure that no one is disadvantaged by having their data processed in this way.

Anonymity is not the shield it might appear. Someone who knows everything about you except your name is in possession of information far more valuable, and potentially more dangerous, than someone who knows your name and nothing else. Names can be trivial to discover, given other facts: a postcode, a date of birth and a gender are often enough to single someone out. One of the central premises of the information economy is that the collection and analysis of gigantic quantities of anonymised data produces general patterns which enable accurate prediction about anonymised individuals. The correlations that emerge from vast quantities of data hold good even when tiny samples are examined. This is the insight at the heart of “machine learning”, one of the most promising fields of artificial intelligence.

The GDPR, and hence the statement of intent, takes aim at this in two ways. The first is the right of any individual to know what data is held on them, and in some circumstances to demand its deletion. This is obviously a help against teenage indiscretions, but it is not, nor should it be, a panacea. There is a genuine public interest in knowing things about public figures that they would rather conceal. The partial exemption of journalism from data protection rules is a welcome part of this statement of intent. In any case, the right to be forgotten is not absolute. It is really a right to have facts removed from search engines, not from the web itself. It does nothing to diminish the power of inference, which can move from known facts that are harmless in themselves to unknown and damaging truths that we would rather keep concealed.

The second is the right of appeal to a human being against decisions which have been taken by an algorithm. This is something very different from requiring that the workings of the algorithm in question be explained. That would almost certainly be impossible: some forms of artificial intelligence now reach conclusions by a process that even their programmers cannot debug. But computers don’t operate themselves. They are programmed and maintained by human actors who must be held responsible for their actions, and that is what the stipulation about algorithmic decisions amounts to.

But there is in the end a limit to what state action can accomplish. We must all learn to negotiate a world where machines watch almost everything we do, or read, or write, and soon what we say in our homes, and harvest data out of all of it. Anonymity is not the answer; in fact it can conflict with rights over personal data, since these rights can only be properly exercised by someone known to be entitled to them. But caution, discretion, and strong encryption are available to everyone. Make use of them.

Contributor

Editorial

The Guardian
