Report comes after four years of pressure and threats of legal action
UK police officers may in future be equipped with automatic facial recognition (AFR) technology on their smartphones, the Home Office said this week, publishing a biometrics report after four years of parliamentary pressure.
The release comes after a Parliamentary committee last month warned of an “urgent need for action” on government facial recognition oversight, demanding that the strategy promised four years ago be published “without delay”.
“We will consider enabling access to facial image collections at custody suites and on police mobile devices to help identify or verify identities for wider law enforcement purposes,” the report, published Thursday, stated.
On the HOB
The 27-page report comes as planned IT upgrades could give police access to images of 90 percent of the population: legacy police IT systems are being replaced by a centralised Home Office Biometrics Programme (HOB) that, by removing IT silos, will enable much wider use of facial recognition.
Reacting to the report, Norman Lamb MP, Chair of the Science and Technology Committee, said: “A 27-page document simply does not do justice to the critical issues involved. Specific issues relating to the way facial images are being collected and retained by the police have not been properly addressed. The ‘Strategy’ seems to boil down to setting up an advisory ‘board’ to suggest policy recommendations.”
He added: “The Government’s decision to launch a 12-month consultation on strengthening governance structures for biometrics smacks of continuing to kick the can down the road. It seems that we may have to wait a fifth year before a proper strategy is produced, which is simply not good enough.”
The report comes after Information Commissioner Elizabeth Denham threatened the Home Office with legal action over the growing use of facial recognition.
She wrote: “How facial recognition technology is used in public spaces can be particularly intrusive… police forces must have clear evidence to demonstrate that the use of [facial recognition technology] in public spaces is effective in resolving the problem that it aims to address, and that no less intrusive technology or methods are available to address that problem.”
She added: “Should my concerns not be addressed I will consider what legal action is needed to ensure the right protections are in place for the public.”
In this week’s report, the government said: “We recognise that the governance and oversight of these applications and the use of facial images as a biometric by law enforcement could be strengthened further”.
The Home Office said it will “establish a new oversight and advisory board to coordinate consideration of law enforcement’s use of facial images and facial recognition systems” and “undertake Data Protection Impact Assessments (DPIAs) prior to the use of a new biometric technology or a new application of an existing biometric technology, inviting scrutiny from an independent ethics panel, regulators and…”
Roll out the AFR
“In future, HOB will provide a common facial matching service enabling the Home Office to realise efficiencies and ensure a more consistent approach to the testing, access controls and privacy protections associated with it,” the Home Office said in the report, pointing to much wider use of AFR.
“This will allow improvements in the technology and matching algorithms to enhance processes at Ports of Entry, Visa Application Centres and within passport applications. Looking further ahead, we will consider the use of AFR for verifying identity and identifying known criminals of interest. We will run proof of concept trials to develop this work, including at the UK border.”
“Don’t Be a Luddite”
James Wickes, CEO at Cloudview told Computer Business Review: “The debate about the use of new technologies such as Automatic Facial Recognition (AFR) has been dominated by the unsubstantiated claims of pressure groups such as Liberty and Big Brother Watch.”
“Nowhere in the debate do we hear the argument that new AI and machine learning technologies have a vital role to play in drastically improving the efficiency, usefulness and, most importantly, the increased privacy of visual data.”
He added: “The recently published Home Office Biometrics Strategy, whilst well intended, does nothing to counterbalance the debate as it lacks detail on future laws or the standards that should be applied to new technologies such as AFR. Taking a luddite stance against new AI and machine learning technologies is not and never will be the answer. The UK needs to be in the driving seat of these new technologies so that we understand them and can drive international standards for their design and use.”
“Expect Court Cases”
Stefano Ruis, partner at leading criminal law firm Hickman and Rose, told Computer Business Review: “If the police continue to use facial recognition in the absence of any legislation, they can expect numerous legal challenges. Indeed, Big Brother Watch has already launched a judicial review challenging the legality of the scheme.”
“Beyond that, the police may face challenges by individuals who have had their image captured while going about their daily lives. The strongest ground for such challenges would be that the interference with an individual’s right to privacy under Article 8 of the ECHR is disproportionate, given the indiscriminate way in which the technology is used without consent. The fact that it does not appear to work properly only adds to the concern that its usefulness in identifying and catching suspects is limited.”