Social media giants are forging ahead with plans to introduce end-to-end encryption on their platforms “without regard” for public safety, according to the Department of Home Affairs.
Fronting a parliamentary inquiry into online safety, departmental representatives raised ongoing concerns that platforms are prioritising security and privacy over safety, while calling for greater visibility into their algorithms.
The comments come despite passage of the Surveillance Legislation Amendment (Identify and Disrupt) Act, which aimed to tackle serious crime “enabled” by anonymising technology.
Law enforcement agencies also already have potential ways to access encrypted services under the controversial Telecommunications and Other Legislation Amendment (Assistance and Access) Act passed in 2018.
Digital and technology policy first assistant secretary Brendan Dowling told the inquiry that platforms were rolling out encryption “without the associated consideration of safety features”.
“We are deeply concerned that innovation in digital tools, including anonymising technology like end-to-end encryption, is not striking the right balance between the benefits and the risks of harm,” he said.
While Dowling did not refer to any specific platform, the department called out Meta for its “seeming indifference to public safety imperatives” in its submission [pdf] to the inquiry in January.
Meta is planning to introduce end-to-end encryption on Facebook and Instagram in 2023, having recently delayed the proposal following concerns that it would shield child abusers from detection.
“We recognise there are substantial benefits, particularly to cyber security and privacy, from the use of encryption,” Dowling said on Tuesday, adding that the department isn't “anti-encryption”.
“But we do see that the adoption of ubiquitous encryption across more and more platforms will have serious and real implications for safety, particularly around the proliferation of child abuse.”
Dowling added that while mechanisms to identify known child abuse material in an encrypted environment existed, doing so would become harder with end-to-end encryption.
“There are technical ways to achieve the identification of that deeply troubling material,” he told the inquiry.
“But what we’re seeing is platforms looking to roll out further encryption to deal with privacy issues or security issues without regard to how they’re going to prioritise public safety [or] child safety assistance to law enforcement.”
Law enforcement policy first assistant secretary Ciara Spencer told the inquiry “transparency” from platforms is needed so the department can understand the steps platforms are taking to remove content.
“That requires an understanding not only of safety by design, but [also of] what platforms are actually doing and how they’re addressing those threats, and what mechanisms they’ve put in place,” she said.
Spencer added that while platforms had introduced “safety and mitigation measures”, child abuse is an area where offending continues to increase at “really disturbing and unprecedented levels”.
In its submission to the inquiry, the department said it has “significant concerns about the far-reaching consequences that pervasive design and algorithms have for both individual users and social cohesion more broadly”.
It said a number of solutions are being considered in other jurisdictions, including “improving the transparency and oversight of the use of pervasive design and algorithms” and “regulatory frameworks that prioritise child and community safety over business”.