Some UK Stores Are Using Facial Recognition to Track Shoppers 

“The data is then held, stored and shared proportionally with other retailers, creating a bigger watchlist where all benefit,” a spokesperson for Facewatch says. Its website claims it is the “ONLY shared national facial recognition watchlist,” which works by essentially linking up multiple private facial recognition networks. It adds that since the Southern Co-op trial it has started a trial with another division of Co-op.

Facewatch refuses to say who its clients are, citing confidentiality, but its website includes case studies from petrol stations and other shops in the UK. Last year, the Financial Times reported that Humber prison is using its tech, as are police and retailers in Brazil. Facewatch said its tech was going to be used in 550 stores across London. This can mean huge numbers of people have their faces scanned: in Brazil during December 2018, 2.75 million faces were captured by the tech, with the company’s founders telling the FT it reduced crime “overall by 70 percent.” (The report also said one Co-op food store near London’s Victoria station was using the tech.)

However, civil liberties advocates and regulators are wary of the expansion of private facial recognition networks, raising concerns about their regulation and proportionality.

“Once anyone walks into a Co-op store, they’ll be subject to facial recognition scans… that might deter people from entering the stores during a pandemic,” says Edin Omanovic, an advocacy director focusing on facial recognition at the NGO Privacy International. The group has written to Co-op, regulators, and law enforcement about the use of the tech. Beyond this, his colleague Ioannis Kouvakas says the use of the Facewatch technology raises legal concerns. “It’s unnecessary and disproportionate,” says Kouvakas, a legal officer at Privacy International.

Facewatch and Co-op both rely on legitimate interests under the GDPR and data protection law as their legal basis for scanning people’s faces. They say that using the facial recognition technology allows them to minimize the impact of crimes and improve safety for staff.

“You still need to be necessary and proportionate. Using an extremely intrusive technology to scan people’s faces without them being 100 percent aware of the consequences and without them having the choice to provide explicit, freely given, informed and unambiguous consent, it’s a no go,” Kouvakas says.

It’s not the first time Facewatch’s technology has been questioned. Other legal experts have cast doubt on whether there is a substantial public interest in using the facial recognition technology. The UK’s data protection regulator, the Information Commissioner’s Office (ICO), says companies must have clear evidence that there’s a legal basis for these systems to be used.

“Public support for the police using facial recognition to catch criminals is high, but less so when it comes to the private sector operating the technology in a quasi-law enforcement capacity,” a spokesperson for the ICO says. The ICO is investigating where live facial recognition is being used in the private sector and expects to report its findings early next year.

“The investigation includes assessing the compliance of a range of private companies who have used, or are currently using, facial recognition technology,” the ICO spokesperson says. “Facewatch is amongst the organizations under consideration.”

Part of the ICO’s investigation into private sector facial recognition use includes where police forces are involved. There is growing concern around how police officials and law enforcement may be able to access images captured by privately run surveillance systems.

In the US, Amazon’s smart Ring doorbells, which include movement tracking and face recognition, have been set up to provide data to police in some circumstances. And London’s Met Police was forced to apologize after handing images of seven people to a controversial private facial recognition system in King’s Cross in October 2019.
