The last two weeks have seen facial recognition systems hit the headlines again for being used without the public’s knowledge or consent. Last Monday, the Financial Times reported that the privately owned King’s Cross development, which is home to shops, offices, schools and other public spaces, is using facial recognition software to “ensure public safety”. This prompted both the Information Commissioner, Elizabeth Denham, and the Mayor of London, Sadiq Khan, to make inquiries into the legal basis of the software.

This isn’t the first time facial recognition has been in the public eye. Since 2017, covert deployments of facial recognition surveillance systems have popped up in public spaces across the UK, from secret police trials in Sheffield’s Meadowhall shopping centre to the scanning of visitors' faces during an exhibition at Liverpool’s World Museum. UK police forces (specifically the London Met and South Wales Police) have come under increasing criticism for trialling highly ineffective facial recognition software, for retaining facial data on more than 20 million innocent people on a watch list and, more recently, for trialling software that scans faces in real time.

Why should I care about facial recognition?

In the midst of a 15% drop in police officers on our streets, facial recognition software might seem like an added tool in the box for safe and secure communities, but it also has chilling implications for our freedoms. Imagine you are walking through a shopping centre on a Saturday morning. A Live Facial Recognition (LFR) camera installed in the area streams an image of you smoking outside Primark to the LFR system. Once it has detected your face, the system measures the distances between your nose, eyes, mouth and jaw to create a biometric map – a digital copy of your unique face. This ‘face-print’ is then cross-referenced against an existing database of digital faces of people who are wanted by the police.
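To make that matching step concrete, here is a minimal, purely illustrative sketch of how a ‘face-print’ might be checked against a watch list. Real LFR systems derive these biometric maps from proprietary deep-learning models; the embedding size, the distance threshold and the `WATCH_LIST` database below are all hypothetical stand-ins, not any vendor’s actual implementation.

```python
from typing import Optional

import numpy as np

# Hypothetical values: real systems tune these to their own models.
EMBEDDING_SIZE = 128    # length of the numeric 'face-print' vector
MATCH_THRESHOLD = 0.6   # distances below this count as a "match"

rng = np.random.default_rng(seed=42)

# Stand-in watch list: name -> stored face-print. In a real deployment
# these vectors would come from a face-recognition model, not randomness.
WATCH_LIST = {
    "person_a": rng.normal(size=EMBEDDING_SIZE),
    "person_b": rng.normal(size=EMBEDDING_SIZE),
}

def find_match(face_print: np.ndarray) -> Optional[str]:
    """Return the closest watch-list entry, or None if no stored
    face-print falls under the match threshold."""
    best_name, best_distance = None, float("inf")
    for name, stored in WATCH_LIST.items():
        # Euclidean distance between embeddings; smaller = more similar.
        distance = np.linalg.norm(face_print - stored)
        if distance < best_distance:
            best_name, best_distance = name, distance
    return best_name if best_distance < MATCH_THRESHOLD else None

# A face captured on camera: here, a noisy copy of person_a's face-print.
captured = WATCH_LIST["person_a"] + rng.normal(scale=0.03, size=EMBEDDING_SIZE)
print(find_match(captured))  # prints: person_a
```

Even this toy version surfaces the questions the rest of this article raises: who decides what goes into the watch list, how long captured face-prints are retained, and who audits the threshold that determines whether a passer-by is flagged.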

Using facial recognition technology for surveillance in public spaces presents several dangers to our human rights. The indiscriminate scanning and storage of our personal biometric data while we all go about our daily lives is a disproportionate policing measure that violates our right to privacy (Article 8). Knowing that we are being watched in public spaces may lead us to change our behaviour: we might not attend protests or express our feelings in the same way, which infringes on our freedom of expression (Article 10). Not to mention, the technology can be highly inaccurate, struggling in particular to identify women and people of colour. And what happens to your ‘face-print’ after it’s been used? There is no Home Office policy covering facial recognition and no law giving police the power to use it, leaving questions such as data protection unanswered. That is especially problematic when researchers have found that one company, used by the UK Met Police to store our personal data, relies on publicly accessible databases that can be altered by anyone.

How do our human rights protect us?

In the face of this danger to our freedoms, the laws protecting our human rights are more important than ever. The Human Rights Act ensures that our right to a private and family life (Article 8) is respected by public authorities, including the police. This means that any interference with your privacy must comply with UK law and have a valid purpose, meeting a social need in a proportionate way.

The European Court of Human Rights has already ruled in Perry v the UK 2003 that non-consensual video surveillance without specific cause or explanation violates our Article 8 right to respect for our private life. It also ruled in Liberty and others v the UK 2008 that the British Ministry of Defence’s interception of confidential emails and telephone correspondence was unlawful and violated Article 8 rights. Firstly, because domestic law wasn’t sufficiently clear about the scope and manner of intercepting and examining correspondence. Secondly, because no details were available to the public regarding the procedure for selecting, sharing, storing and destroying intercepted communications.

The European Court of Human Rights also ruled in Taylor-Sabori v the UK 2002 that the police’s interception of a person’s pager messages was unlawful because there was no statutory system regulating the interception of pager messages. This all sounds very similar to facial recognition – that’s why Ed Bridges, a university officer, has brought legal proceedings against South Wales Police. In May 2019, the case for banning facial recognition was made at a three-day hearing at the High Court in Cardiff. It was argued that the technology violated Article 8 rights, data protection laws (the Data Protection Act 2018) and the Equality Act 2010 (the police didn’t consider that the technology could lead to the disproportionate stop and search of women and people of colour). The outcome of the case, expected in late autumn, will affect the legal proceedings brought by Big Brother Watch against London’s Met Police on the same issue.

Facial recognition isn’t the only surveillance technique in the hot seat. Our human rights continue to protect our right to privacy and freedom of expression despite the UK government’s continued mass surveillance through new technologies. In 2013, Edward Snowden revealed that the UK’s GCHQ was secretly intercepting and processing the private communications of millions of ordinary people on a daily basis, and sharing them with foreign intelligence services, without a clear legal foundation or proper safeguards. Since then, human rights organisations like Amnesty International, Liberty and Privacy International have taken the government to court. In September 2018, the lower chamber of the European Court of Human Rights ruled in Big Brother Watch and others v the UK that the UK’s communication interception regime was unlawful: it lacked adequate safeguards and indiscriminately intercepted communications without a specific target, violating our rights to privacy and free expression. In search of a definitive judgement, the case was heard by the Grand Chamber, Europe’s highest human rights court, in July 2019. This particular case is one of many challenges to the UK government’s mass surveillance techniques, the decisive outcomes of which are yet to be finalised.

A human-rights-compliant approach to surveillance

What’s clear in this latest episode of the continuing mass surveillance saga is that our human rights, enshrined in UK law, are timeless. Any future legal framework for facial recognition, or any government strategy for the ethics and governance of biometric data, requires a human rights approach. The backlash from the public, human rights organisations, police forces across the UK, and Parliamentary committees and working groups indicates that the Government should place a moratorium on the technology and await a human-rights-compliant strategy and legal framework which adequately safeguards our rights and freedoms.