Government criticised for 'racially biased' facial image database

21 May 2019 Consultancy.uk

The UK Home Office has come under pressure not to pursue the use of a facial recognition database in the policing of Britain. UK police forces have been trialling a controversial facial image database, managed by IT consultancy CGI, but human rights campaigners have warned it will lead to black and minority ethnic people being falsely accused of crimes, while some have called for it to be “dropped immediately.”

Following a decade of spending cuts, Britain’s police force – like the rest of the public sector – is facing an uphill struggle to do more with less. With the total number of knife-related offences alone having spiked dramatically in recent years to approach the 40,000 mark in 2018, the Government has been relentlessly criticised for its perceived lack of results when it comes to keeping the country safe.

With a meaningful end to austerity in the UK still not in sight, as the Government looks to save what it can before the completion of Brexit shunts the nation’s economy into unknown territory, the resources it is willing to commit toward fighting crime remain sparse. As with every other facet of the state, then, the blanket solution is “technology”.

Facial recognition technology was first introduced into the Police National Database (PND) in 2014, and several UK police forces have since been trialling controversial new automated systems which attempt to identify the faces of people in real time, as they pass a camera. Speaking to the BBC earlier in May, a spokesperson for the Home Office said the facial recognition technology developed by Cognitec “continues to evolve”, and could be an "invaluable tool" in fighting crime in the near future.

For the last ten years, IT consultancy CGI – which is currently also working on a vote counting e-solution for London’s 2020 elections – has been contracted to build and operate the United Kingdom’s PND, last having its contract extended by three years in 2016. The PND holds over 3 billion searchable records, with over 4 million searches performed each year by officers licensed to access the database. CGI also manages the database of some 13 million faces, and the number of face-match searches run against it has grown from 3,360 in 2014 to 12,504 in 2017.

Despite the Home Office’s keenness to push ahead with facial recognition, however, it is being met with resistance by human rights campaigners, who have pointed out a number of issues with the technology. Privacy rights group Big Brother Watch claimed the method "must be dropped immediately," amid fears that the system is unable to accurately identify BAME individuals – who are already disproportionately stopped and searched by police – leading them to be falsely accused of more crimes.

According to the BBC, despite being aware of this potential pitfall, the Home Office has failed on three separate occasions over the past five years to assess how well the systems deal with ethnicity. The same investigation also found that the UK police's former head of facial recognition knew that skin colour was a potential issue. Minutes from a meeting in April 2014 showed Durham Police Chief Constable Mike Barton noted "that ethnicity can have an impact on search accuracy," before he asked CGI, which manages the police's facial image database, to investigate it. Subsequent minutes from the working group show no follow-up on the issue.

Lack of diversity

A lack of ethnic diversity in datasets seems to be at the heart of the problem. An interim report by a biometrics advisory group to the Government recently highlighted concerns that the under-representation of certain types of faces, particularly those from ethnic minorities, could mean bias "feeds forward" into the use of the technology.
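To illustrate the kind of disaggregated accuracy testing campaigners say was never carried out, the sketch below computes a false-match rate separately for each demographic group; the threshold, group labels and data are entirely hypothetical and do not reflect the PND's actual system.

```python
# Illustrative sketch only: disaggregated accuracy testing of a face matcher.
# All names, thresholds and data are hypothetical.
from collections import defaultdict

THRESHOLD = 0.6  # hypothetical similarity score above which a "match" is declared

def false_match_rate_by_group(trials):
    """trials: iterable of (group, score, is_same_person) tuples,
    where score is the matcher's similarity output for one comparison."""
    counts = defaultdict(lambda: [0, 0])  # group -> [false matches, impostor trials]
    for group, score, is_same_person in trials:
        if not is_same_person:            # impostor comparison: different people
            counts[group][1] += 1
            if score >= THRESHOLD:        # system wrongly declares a match
                counts[group][0] += 1
    return {g: fm / n for g, (fm, n) in counts.items() if n}

# If one group's false-match rate is materially higher, that disparity
# "feeds forward": members of that group are wrongly flagged more often.
trials = [
    ("group_a", 0.72, False), ("group_a", 0.41, False),
    ("group_b", 0.65, False), ("group_b", 0.70, False),
]
print(false_match_rate_by_group(trials))  # e.g. {'group_a': 0.5, 'group_b': 1.0}
```

A gap between the per-group rates of this kind is exactly what testing on an ethnically unrepresentative dataset can hide: if one group barely appears in the evaluation data, its error rate is never reliably measured.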

Commenting on the issue, Silkie Carlo, Director of Big Brother Watch, said, "The police's failure to do basic accuracy testing for race speaks volumes. Their wilful blindness to the risk of racism, and the risk to Brits' rights as a whole, reflects the dangerously irresponsible way in which facial recognition has crept on to our streets."

A spokesperson for the National Police Chiefs Council defended the technology’s potential to disrupt criminals; however, they also admitted that any roll-out must demonstrate its effectiveness within "sufficient safeguards". The source also told the press that work was already being done to improve the system's accuracy.

The UK is not the only territory to have been pilloried for deploying potentially faulty technology in the name of security. In San Francisco, local agencies – such as the city’s transport authority and law enforcement – will not be allowed to use facial recognition. The ban follows a vote by the city’s legislators, during which opponents of facial recognition argued the systems are error-prone, particularly when dealing with women or people with darker skin.

Meanwhile, last year experts accused the European Union of promoting pseudoscience, after it announced plans for a “smart lie-detection system” at its busiest borders, in an attempt to identify illegal migrants. The system, which features animated interviews aimed at gauging the intentions of arrivals, was developed by a selection of professional services firms in partnership with a number of universities and government entities across Europe.
