King’s Cross facial recognition cameras under investigation


The devices pose a “potential threat to privacy that should concern us all”, the information commissioner says.

An investigation has been launched into the use of facial recognition cameras in London’s King’s Cross.

The Information Commissioner’s Office says it is “deeply concerned” by reports that the controversial technology was being used at King’s Cross Central.

The major rail stations King’s Cross and St Pancras – as well as restaurants, cafes, homes and a Google campus – are based at the 67-acre site.

It is not clear how long the system has been active, nor how many cameras are in use. Thousands of people pass through the area on a daily basis.

Information Commissioner Elizabeth Denham said: “Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all.

“That is especially the case if it is done without people’s knowledge or understanding.

“My office and the judiciary are both independently considering the legal issues and whether the current framework has kept pace with emerging technologies and people’s expectations about how their most sensitive personal data is used.”

Ms Denham added that “detailed information” is being requested on how the technology is used, so the ICO can determine whether it complies with data protection law.

Earlier this week, London Mayor Sadiq Khan wrote to the chief executive of the development to raise his concerns about the use of facial recognition.

Argent, the property developer for the area, has told the Financial Times that “sophisticated systems [are] in place to protect the privacy of the general public”.

The Canary Wharf area of London is said to be considering a similar scheme.

Last month, Sky News revealed the results of an independent evaluation into the Metropolitan Police’s use of facial recognition technology.

It found that the system had an error rate of 81%, meaning four in five matches were incorrect.

The evaluation also found issues with the force’s “watchlists” – lists of people who have had photos of their faces entered into the system by police – warning of “significant ambiguity” and an “absence of clear criteria for inclusion”.
