UK secretly allows facial recognition scans of passport, immigration databases

Updated Privacy groups report a surge in UK police facial recognition scans of databases quietly stocked with passport photos and made searchable without parliamentary oversight.

Big Brother Watch says the UK government has allowed images from the country’s passport and immigration databases to be made available to facial recognition systems, without informing the public or parliament.

The group claims the passport database contains around 58 million headshots of Brits, with a further 92 million images made available from other sources, including the immigration database and visa applications.

By way of comparison, the Police National Database contains circa 20 million photos of those who have been arrested by, or are at least of interest to, the police.

In a joint statement, Big Brother Watch director Silkie Carlo and Privacy International senior technologist Nuno Guerreiro de Sousa described the databases and the lack of transparency as “Orwellian.” The two organizations have also written to both the Home Office and the Metropolitan Police, calling for a ban on the practice.

The comments come after Big Brother Watch submitted Freedom of Information requests, which revealed a significant uptick in police scans of the databases in question as forces’ use of facial recognition grows.

The number of searches by 31 police forces against the passport database rose from two in 2020 to 417 by 2023, while scans using immigration database photos rose from 16 in 2023 to 102 the following year.

Carlo said: “This astonishing revelation shows both our privacy and democracy are at risk from secretive AI policing, and that members of the public are now subject to the inevitable risk of misidentifications and injustice. Police officers can secretly take photos from protests, social media, or indeed anywhere and seek to identify members of the public without suspecting us of having committed any crime.

“This is a historic breach of the right to privacy in Britain that must end. We’ve taken this legal action to defend the rights of tens of millions of innocent people in Britain.”

Maligned technology

It’s no secret that UK police have steadily increased their use of facial recognition technology in recent years, despite ardent pushback from the pro-privacy crowd.

There are three types of facial recognition (FR) tech used across the UK: retrospective FR, live FR, and operator-initiated FR.

RFR and OIFR are generally seen as the less intrusive uses of the technology, wheeled out only when officers are aware that a crime has been committed and used to scan images of specific people of interest.

LFR is different: it involves setting up a camera in a fixed location that scans every face it captures, which means the vast majority of its subjects will be innocent people.
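
To make the distinction concrete, here is a minimal sketch contrasting the two approaches, assuming a generic face-embedding model and cosine-similarity matching. The function names, threshold, and toy data are hypothetical stand-ins, not the systems or settings used by any UK force.

```python
# Minimal illustration of retrospective vs live matching.
# All names, thresholds, and data are hypothetical; this is not any
# force's or vendor's actual system.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (higher = more alike)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrospective_search(probe: np.ndarray,
                         reference_db: dict[str, np.ndarray],
                         threshold: float = 0.6) -> list[str]:
    """RFR: one probe image of a specific person of interest is compared
    against a reference database after an offence is already known about."""
    return [name for name, ref in reference_db.items()
            if cosine_similarity(probe, ref) >= threshold]

def live_scan(frames, watchlist: dict[str, np.ndarray],
              threshold: float = 0.6):
    """LFR: every face the camera captures is checked against a watchlist,
    so the faces of uninvolved passers-by get scanned as well."""
    for frame_idx, face_embeddings in enumerate(frames):
        for face in face_embeddings:
            for name, ref in watchlist.items():
                if cosine_similarity(face, ref) >= threshold:
                    yield frame_idx, name

# Toy usage: random vectors stand in for real face embeddings.
rng = np.random.default_rng(0)
db = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
probe = db["person_a"] + rng.normal(scale=0.05, size=128)
print(retrospective_search(probe, db))        # ['person_a']
print(list(live_scan([[probe]], db)))         # [(0, 'person_a')]
```

The campaigners’ objection is visible in live_scan: the watchlist comparison runs against everyone the camera sees, match or not.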

The Home Office insisted in its LFR factsheet, which has not been updated since 2023, that LFR deployments are targeted, intelligence-led, time-bound, and geographically limited. The Home Office also told The Register that passport and immigration databases are used only for RFR, not LFR, and the police must request access from the Home Office before being allowed into the passport database.

Efforts are made to inform the public when a camera is due to be set up in a given location, and the government said the technology has been used to arrest wanted sex offenders in densely packed crowds, as well as other violent offenders.

These kinds of examples are often used by the government to defend its more controversial measures, in much the same way it invokes the threat of terrorists and child sexual abuse offenders to justify its anti-encryption agenda and the Investigatory Powers Act.

The technology is pitched as a way to make the day-to-day jobs of police officers more efficient, freeing up their time for other things. Officers are often briefed daily with images of people of interest – LFR simply lightens the load of this manual scouting, the Home Office says.

Whichever way the government positions LFR, it doesn’t appear to be allaying the concerns held by many about the way it is used.

Authorities insist the accuracy is increasing, and the racial biases and false positives generated by FR scans are decreasing, despite early rollouts being plagued by such issues.
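
A back-of-the-envelope calculation shows why false positives remain a sticking point even as match accuracy improves. The false-match rate and crowd size below are illustrative assumptions, not published figures from any force.

```python
# Back-of-the-envelope sketch: the rate and crowd size are illustrative
# assumptions, not published figures from any UK police force.
def expected_false_alerts(faces_scanned: int, false_match_rate: float) -> float:
    """Expected number of innocent people wrongly flagged in one deployment."""
    return faces_scanned * false_match_rate

# Example: scanning 100,000 faces at an assumed false-match rate of
# 1 in 6,000 still wrongly flags roughly 17 innocent passers-by.
print(round(expected_false_alerts(100_000, 1 / 6_000)))  # 17
```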

And the Home Office’s claim that the technology is time-bound no longer holds true, after the announcement earlier this year that the UK’s first permanent LFR camera will be installed in Croydon, South London.

Recent data from the Met attempted to instil confidence in facial recognition: the number of arrests facilitated by the technology passed the 1,000 mark, the force said in July.

However, privacy campaigners were quick to point out that this accounted for just 0.15 percent of the total arrests in London since 2020. They suggested that despite the shiny 1,000 number, this did not represent a valuable return on investment in the tech.

Alas, the UK has not given up on its pursuit of greater surveillance powers. Prime Minister Keir Starmer, a former human rights lawyer, is a big fan of FR, having said it was the answer to preventing a repeat of the riots that broke out across the UK last year following the Southport murders. ®

Updated Aug 8 at 1830 GMT to reflect the Home Office’s explanation that it uses passport and immigration databases only for RFR, not LFR, and that police must request access to the passport database.

