New research argues that police should be prohibited from deploying live facial recognition (LFR) technology in all public locations, finding that current deployments violate ethical norms and human rights standards.

In LFR systems, cameras are linked to watchlists of previously captured images of individuals. Faces picked up in the live camera feed are compared against those stored images to identify potential matches.
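
Under the hood, that comparison step is typically done by converting each face into a numerical embedding and measuring how similar two embeddings are. The Python sketch below is a minimal illustration of the matching idea only, assuming the embeddings have already been produced by some face recognition model (not shown); the watchlist names, vector size, and similarity threshold are made-up values for demonstration, not details of any police system.

```python
import numpy as np

# Minimal sketch of the matching step in an LFR-style pipeline.
# Assumes face embeddings (fixed-length vectors) were already produced
# by a separate face recognition model, which is not shown here.

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(live_embedding, watchlist, threshold=0.6):
    """Return (identity, score) of the best watchlist match above the
    threshold, or (None, best_score) if nothing clears it."""
    best_id, best_score = None, -1.0
    for identity, stored in watchlist.items():
        score = cosine_similarity(live_embedding, stored)
        if score > best_score:
            best_id, best_score = identity, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)

# Illustrative data: random vectors stand in for real face embeddings.
rng = np.random.default_rng(0)
watchlist = {"person_A": rng.normal(size=128), "person_B": rng.normal(size=128)}
live_face = watchlist["person_A"] + rng.normal(scale=0.1, size=128)  # noisy re-capture

print(match_against_watchlist(live_face, watchlist))
```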

According to a report by The Guardian, British police forces have tried out the technology in the hope that it would aid their fight against crime and terrorism.

However, in a few instances, courts have ruled against the police over their use of LFR, finding that it violated the privacy rights of pedestrians in areas where the technology was deployed. Racial bias is also a cause for concern.

Banning the Use of LFR

Even though police believe LFR would be most helpful in public places like streets and airports, a new study from the University of Cambridge's Minderoo Centre for Technology and Democracy recommends banning its use everywhere.

The research analyzed one LFR deployment by the Metropolitan Police and two by South Wales Police. Both forces told The Guardian that they had since improved their practices and now saw the value of LFR.

According to the report's author, Evani Radiya-Dixit, the study found that all three of these deployments failed to meet minimum ethical and legal standards for police use of facial recognition.

"To protect human rights and improve accountability in how technology is used, we must ask what values we want to embed in technology and also move from high-level values and principles into practice," Radiya-Dixit added.

Unsafe and Unethical Use of Data Systems

The research claims that many best practices for the safe and ethical use of large-scale data systems are not being followed in police forces' use of facial recognition technology.

The issue here extends well beyond the potential for discrimination caused by face recognition software.

Human rights advocates are concerned that it might lead to widespread violations of rights, including the rights to protest and assemble freely.

LFR as a Tool to Combat Crime

Within UK policing, LFR is seen as a potential game-changer, comparable to the advent of fingerprinting in reducing crime, because it may improve officers' ability to track down and locate suspects.

The technology has been employed as part of authoritarian governments' arsenal of coercive weapons in places like China.

The Met Police claimed the algorithm had improved greatly through work with the National Physical Laboratory and the Defence Science and Technology Laboratory, achieving a false alert rate of less than 0.08%.

The Met had engaged Pete Fussey of the University of Essex to evaluate its past LFR trials, and his assessment was critical: by 2020, the Met projected a 70% success rate, while Fussey said it was only 19%.

Written by Trisha Kae Andrada
