Planned centralisation of biometrics could give police facial-recognition access to images of 90% of the population, as the Information Commissioner threatens legal action against the Home Office
A Parliamentary committee has warned of “urgent need for action” on government facial recognition oversight, demanding a biometrics strategy promised four years ago be published “without delay”.
It comes as planned IT upgrades could see police gain access to the images of 90 per cent of the population, with legacy police IT systems being replaced by the centralised Home Office Biometrics (HOB) Programme.
“The IT replacement programme would allow police officers to access driving licence images, and consideration was being given to whether passport images could also be accessed,” the House of Commons Science and Technology Committee said this morning.
The report comes after the UK’s Surveillance Camera Commissioner Tony Porter told Parliament earlier this year that £2.2 billion was being spent annually across the UK on surveillance cameras. “The ability of the State and indeed the commercial sector to physically and intrusively track the citizen in public spaces is well and truly upon us,” he said.
Tech giants are keen to get a piece of the pie as the technology improves: Amazon is actively courting law-enforcement agencies to use an AWS-based facial-recognition service that can identify people in real time, the American Civil Liberties Union reported on Tuesday, citing documents obtained from two US departments.
A 2012 High Court ruling warned that holding images of innocent people is disproportionate and likely unlawful under Article 8 of the European Convention on Human Rights.
Police in the UK have already accessed facial images of more than 20 million people. These are being matched against wanted lists to automatically scan crowds at events like Notting Hill Carnival and the UEFA cup final, using a combination of the country’s CCTV network – the world’s most extensive – and mobile police cameras.
This month Information Commissioner Elizabeth Denham threatened the Home Office with legal action over the growing use of facial recognition.
She wrote: “How facial recognition technology is used in public spaces can be particularly intrusive… police forces must have clear evidence to demonstrate that the use of [facial recognition technology] in public spaces is effective in resolving the problem that it aims to address, and that no less intrusive technology or methods are available to address that problem.”
She added: “Should my concerns not be addressed I will consider what legal action is needed to ensure the right protections are in place for the public.”
On these grounds the committee blasted the failure of Home Office and police IT systems to support automatic deletion of images for those who request it: “This reflects current weaknesses in IT systems and a concern about the potential cost of a comprehensive manual deletion process.”
In November 2016 the committee was told that the strategy was “in the final stages of completion and will be published shortly”. The strategy was first promised four years ago.
Demanding the strategy be published in June and calling for the planned IT upgrades to be “delivered without delay”, the committee said: “The Government’s approach is unacceptable because unconvicted individuals may not know that they can apply for their images to be deleted, and because those whose image has been taken should not have less protection than those whose DNA or fingerprints have been taken.”
“There are important ethical issues involved in the collection, use and retention of facial images in particular because they can easily be taken and stored without the subject’s knowledge and because various image databases already include 90% of the adult population between them,” the report emphasised.
Police and the Home Office see facial recognition as a vital tool in their armoury amid terrorist attacks and a heightened threat environment, currently assessed as “severe” by MI5. Critics, however, have warned that the biometric technology is not yet accurate enough to use, is racially biased, and is unable even to differentiate men from women.