
What can Australia learn from the UK Government’s new guidelines on deployment of face recognition technology?



The UK Home Office has just published guidelines on how face recognition technology is used by UK police to stop crime and protect their citizens.

In July 2020 the Office of the Australian Information Commissioner (OAIC) and the UK Information Commissioner’s Office (ICO) announced that they would work together to investigate Clearview AI, Inc. 

The ICO and OAIC worked together on the evidence-gathering stage of the investigation. As both data protection authorities operate under their own country’s legislation, any outcomes were considered separately. 

However, following the collaboration, Australian Information Commissioner and Privacy Commissioner Angelene Falk said: “The joint investigation with the ICO has been highly valuable and demonstrates the benefits of data protection regulators collaborating to support effective and proactive regulation.”

Each authority has also been looking separately at their respective police forces’ use of the technology. However, guidelines have now been published on the UK Home Office website on how UK police will use face recognition going forward.

The OAIC had been expected to report this year on the results of its investigations and deliberations into the use of face recognition in retail in Australia. It now looks unlikely that the authority will report until the first quarter of next year at the earliest.

Critics of the OAIC have pointed out that in the meantime crime is escalating and abuse of stakeholders continues to rise, and that every day that passes sees more victims of crimes which arguably could have been prevented, or their consequences mitigated, if clear guidance were available on how face recognition could be used by businesses in Australia.

So, in the absence of their report, the UK Home Office publication does perhaps throw some light on what the OAIC might conclude about the use of face recognition to fight crime…

Whilst the guidelines, which appear on the UK Government website, are designed to codify police use of face recognition, it is easy to see how this best practice could be applied to the use of the technology to protect Australian citizens and businesses.

The guidelines identify three scenarios in which face recognition is used:  

  1. Retrospective Facial Recognition (RFR)

  2. Live Facial Recognition (LFR)

  3. Operator Initiated Facial Recognition (OIFR) 


Retrospective Facial Recognition

RFR is used after an event or incident as part of a criminal investigation. Images are typically supplied from CCTV, mobile phone footage, dashcam or doorbell footage or social media. These images are then compared against images of people taken on arrest to identify a suspect.  

A South Wales Police study found that without RFR identifications take around fourteen days, whereas with it they typically take minutes.   

All police forces use the Police National Database facial search facility.  

Real life examples of how forces used this technology to catch criminals and keep people safe include:  

  • Craig Walters was jailed for life in 2021 after attacking a woman he followed off a bus. He was arrested within 48 hours of the incident thanks to South Wales Police using facial recognition on images captured by CCTV, including on the bus. 

  • Images taken by a member of the public inside a Coventry nightclub where a murder had taken place were quickly matched on the police national database to a known individual. The victim’s blood was found on his clothing; he was charged and sentenced to life imprisonment.

  • A two-year operation was undertaken to disrupt, deter and build intelligence of drug operations within communities in Cardiff. RFR identified those involved and generated an intelligence picture of drug dealing in the city. This led to the arrest of 69 drug dealers, with 64 being charged. 44 people are serving custodial sentences totalling 117 years.


Live Facial Recognition

Live Facial Recognition (LFR) enables police to identify wanted people, a core part of policing. Every day, police officers are briefed with images of suspects to look out for. Over many years, officers have spotted individuals from these briefings and taken action to keep the public safe.  

LFR does what the police have always done but much more quickly and with greater accuracy.  

The technology uses live video footage of crowds passing a camera and compares their images to a specific list of people wanted by the police.  The technology can precisely pick a face out of a dense crowd, something which would be impossible for an officer to do. It means the police can quickly and accurately identify wanted criminals and take them off the streets. 

If the LFR system does not make a match with the watchlist, a person’s biometric data is deleted immediately and automatically.  The watchlist is destroyed after each operation.   

Real life examples include:   

  • At the Arsenal v Tottenham soccer match on 24 September 2023, it led to three arrests, including a suspected sex offender.

  • A wanted sex offender was sent back to jail after being identified at the Coronation of King Charles. An image of his face matched that of a wanted suspect. He was arrested and sent back to prison for breaching the terms of his release.

  • Over two busy Friday nights in Soho, London in August 2023 the Metropolitan Police used it to help find high harm offenders.   Across the two deployments there were six accurate alerts and no false alerts.  It led to the police engaging with six people, five of whom were arrested including a man wanted for possession of a bladed article and a woman wanted for breach of bail in relation to robbery.    


Operator Initiated Facial Recognition (OIFR)  

OIFR is a mobile app that allows officers, after engaging with a person of interest, to photograph them and check their identity where they are not sure, without having to arrest them and take them into custody.

It is at the early trial stage but has been showing positive results.  

It is interesting to note that face recognition is NOT confined to identifying criminals. The official position also embraces using the technology to find missing or vulnerable people, noting that doing so “…also frees up police time and resources, meaning more officers can be out on the beat, engaging with communities and carrying out complex investigations.”

Also of great interest is the note regarding the police experience of face recognition accuracy. 

The point is made that accuracy depends on the algorithm used. This is one of the reasons that the independent annual rankings of face recognition systems conducted by the US National Institute of Standards and Technology (NIST) are so important – and why we are proud that our Vix Vizion Imagus software has been in the top 3 in the world on key test benchmarks every year since its inception.

In the case of the UK, the National Physical Laboratory (NPL) tested the algorithms that South Wales Police (SWP) and the Metropolitan Police Service (MPS) have been using and found that:

  • There were no statistically significant differences in performance based on gender or ethnicity in the way they use it

  • In practice, at the time of publication, there have been no false alerts this year

  • RFR and OIFR had demonstrated 100% accuracy in identifying a correct match with no false matches   

The finding that there are no statistically significant differences in performance based on gender or ethnicity will be encouraging to other police forces around the world, and unwelcome to claimants seeking to establish an “ethnic bias” in face recognition systems.


For example, a 32-year-old Detroit woman named Porcha Woodruff claimed that, as a result of a bogus facial recognition match, she was falsely arrested while eight months pregnant and spent 11 hours on a concrete jail bench before being released. She filed a lawsuit against the City of Detroit and Detroit Police seeking US $25M in damages, stating: “Face recognition technology has long been known for its inherent flaws and unreliability particularly when attempting to identify black individuals, such as Porcha Woodruff.” The NPL findings would seem to refute this.


Ms Woodruff’s case has yet to come to court.


A full copy of the guidelines can be found at Police use of Facial Recognition: Factsheet - Home Office in the media.

For more information about Vix Vizion’s Imagus system go to

Nov 22, 2023

Vix Vizion Team
