The Australian Federal Police on Monday morning launched a new artificial intelligence research lab at Monash University to detect exploitative or abusive digital media faster without unnecessarily over-exposing staff to confronting images.
The Artificial Intelligence for Law Enforcement and Community Safety (AiLECS) Lab builds on work by AFP agent Dr Janis Dalins, who last year developed an AI image classifier tool with the university and Data61 to automatically identify and classify explicit material during investigations.
Dubbed the Data Airlock, the system will also be used to make data from disturbing images available to academics developing further machine learning algorithms, without exposing them directly to the confronting material.
Initially, the research has focused on child exploitation material, but over time the image analysis software will be expanded to cover content from terrorism cases, which can cause significant psychological distress for investigators.
Material is mainly sourced from devices seized during investigations, and the work furthers Dalins’ PhD research, which used The Onion Router (Tor) to crawl the dark web, testing a classification model to capture dark web user behaviour and motivation for application in law enforcement.
“The ultimate goal of this initiative is to ethically research the use of machine learning and data analytics in advancing law enforcement and community safety,” Dalins said in a statement ahead of the AiLECS Lab launch in Melbourne.
Dalins’ co-director at the lab, and associate dean (international) in Monash’s Faculty of Information Technology, Dr Campbell Wilson, added that the Data Airlock and other work at the lab will allow data to be copied and automatically reviewed faster and more easily than investigators can currently manage.
“What machine learning algorithms do is give us speed and portability – think hundreds of images per second. But machine learning won’t outperform the accuracy of experienced human investigators, who are essential to each case,” Wilson said.
However, any time saved during investigations could lead to faster identification of victims and the source of illegal material, and the removal of images from digital platforms.
The lab is supported by $2.5 million in funding, part of the new Monash Data Futures initiative to train data scientists in the use of AI for social good.
The AFP and Monash also plan to roll out the Data Airlock to international researchers, contributing to global efforts to reduce harm to children and counter terrorism.