Victoria Police has admitted to trialling the controversial facial recognition tool Clearview AI after previously denying that officers had used the software.
Documents released under freedom of information this week confirm at least five police officers signed up to use the software as part of a trial from late 2019.
The FOI request was made by IT consultant and analyst Justin Warren after the force declined to confirm its use of the software following news reports earlier this year.
Victoria Police was one of some 2200 law enforcement agencies worldwide identified by BuzzFeed News as having had personnel use Clearview AI at some point.
Leaked records showed that more than five users had run more than 10 searches using the tool, which is capable of matching images with billions of others scraped from the internet.
FOI documents reveal correspondence between Clearview AI and several officers, including two intelligence analysts, a detective senior constable and a sergeant, who were invited to sign up.
The emails pitch Clearview as a “Google search for faces”, with officers able to upload photos to “instantly get results from mugshots, social media, and other publicly available sources”.
“Our technology combines the most accurate facial identification software worldwide with the single biggest proprietary database of facial images to help you find the suspects you’re looking for,” one of the emails reads.
Once the five officers had signed up, they were encouraged to “search a lot” and to refer colleagues to the tool.
“Investigators who do 100+ Clearview searches have the best chances of successfully solving crimes with Clearview in our experience,” another email reads.
“It’s the best way to thoroughly test the technology.”
The documents do not indicate how many of the officers actually used the software to perform searches, though one user applied to have their password changed after Clearview’s client list was exposed in a data breach.
Several officers did not take up invitations from Clearview to sign up.
In FOI correspondence, Victoria Police said the documents identified related to a “trial that was completed by the [Joint Anti-Child Exploitation Team (JACET)] which has now ceased”.
“The trial involved open-source images uploaded to the system for testing purposes,” the force said.
“The use of ClearView AI is not endorsed by Victoria Police. ClearView AI has not been utilised for investigative purposes at Victoria Police.”
Victoria Police stressed that “no commercial or formal agreements are in place with the organisation as the trial was conducted at no charge”.
“iFACE is the official Victoria Police facial recognition system and is used as an investigative tool only.”
The Australian Federal Police has also been forced to admit in recent months that nine of its officers piloted Clearview AI to test the tool’s suitability for use in child exploitation investigations.
It has similarly stressed that the platform has not been adopted as “an enterprise product” and that it has “not entered into any formal procurement arrangements”.
A number of large tech firms have recently followed IBM’s lead and pledged to stop offering facial recognition software in the US.