The Australian government’s insistence that its planned decryption laws can be safely applied to a small number of users without weakening security for all users is, at best, “public-relations puffery”, cryptographic experts warn.
The experts are among a growing wave of opposition to the government bill, the draft text of which was released mid-August.
The Department of Home Affairs is currently consulting on the draft text, and although submissions to that process have not been published, a number have leaked out over the past 24 hours.
Notably, Riana Pfefferkorn, cryptography fellow at the Stanford Center for Internet and Society, and University of Melbourne researchers Dr Vanessa Teague and Dr Chris Culnane, have laid out significant concerns with how the planned laws would work in reality.
One of the main problems is that the law would require technology companies and providers to target a single device or small number of devices, but only in a way that does not introduce a “systemic weakness” that impacts all users.
Pfefferkorn argues this will not work in practice.
“If a provider is forced to enable access to a ‘particular service, particular device or particular item of software’, there is a significant chance that the provider’s ‘one-off’ solution in fact will not be limited to the specific device,” she said.
“Australian law enforcement and security agencies will foreseeably amass a large number of devices to which they will require providers to grant them access.
“Since the Bill forswears a systemic backdoor requirement, it follows that Australian investigators will instead repeatedly importune providers for ‘one-off’ access to every single device.
“Consequently, to render prompt, efficient access to numerous ‘particular’ devices at scale, providers will need to come up with a solution that is effectively ‘systemic’.”
Pfefferkorn said that if Australia passed the decryption bill, other jurisdictions would follow suit, multiplying the number of “one-off” requests aimed at providers to break security protections.
“Providers are unlikely to build from scratch, and then dispose of, a custom, tailored solution to access each particular device, every time they are served with a government access demand,” she said.
Instead, she argued, given the future demand for device or communications access, the provider would keep the ‘one-off’ solution on hand and either modify it slightly each time, “or, more likely, create a solution that does not tie the access capability to one particular device.”
That would “largely vitiate” the bill’s insistence that no backdoors or “systemic weaknesses” be created in products and services to comply with law enforcement demands.
Pfefferkorn suggested the limit on creating backdoors is, at best, “simply public-relations puffery intended to mollify the Australian public’s well-founded concerns about the Bill.
“At worst, it reflects a serious misunderstanding of computer security on behalf of the government,” she said.
New bugs and other vectors
Pfefferkorn also raised concerns that “one-off” code used to circumvent encryption could contain bugs that create “unanticipated new paths for bypassing security and exploiting the device (or service or item of software).”
Further, she said there appeared to be no provision that allowed a buggy compliance measure to be revoked, even if it led “to a widespread negative security impact”.
And she said that if a circumvention was called upon frequently by law enforcement, it could also become valuable to malicious actors and potentially fall into the wrong hands.
Drs Teague and Culnane similarly warned in a submission of their own that “any improvement in law enforcement access needs to take into account the likelihood that criminals will use the same access vector.”
“The main risk of this legislative program is, by focusing solely on the law enforcement aspect, to underestimate the consequences of undermining cybersecurity for the millions of ordinary Australians who are much more likely to be the target of cybercriminals than of a police investigation,” they said.
Dr Teague suggested that one-off workarounds, such as the one the FBI used to unlock an iPhone in the now-infamous San Bernardino mass shooting case, would be disabled by device makers once the vector became clear.
“Dr Teague’s prediction for the resolution of the Apple/FBI controversy is that Apple will (if they haven’t already) design a phone that does not accept firmware updates without both the user’s pincode and online evidence that the same update is being sent to all users,” Drs Teague and Culnane said in their submission.
“Other device manufacturers will quickly follow. This will defend users against sophisticated targeted malware (unless it is sent to everyone) and will also have the side effect of rendering any court order against Apple for a targeted firmware update moot.”
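The update policy Dr Teague predicts can be sketched in miniature: a device refuses a firmware image unless the user approves it *and* the image’s hash appears in a public log as evidence that the same update is going to every user. The sketch below is purely illustrative, assuming a simple set as a stand-in for a real append-only transparency log; no vendor’s actual update mechanism works exactly this way.

```python
import hashlib

# Stand-in for a globally auditable, append-only log of update hashes.
PUBLIC_UPDATE_LOG = set()

def publish_update(firmware: bytes) -> str:
    """Vendor publishes the hash of an update shipped to all users."""
    digest = hashlib.sha256(firmware).hexdigest()
    PUBLIC_UPDATE_LOG.add(digest)
    return digest

def device_accepts_update(firmware: bytes, pin_ok: bool) -> bool:
    """Device-side check: user consent AND evidence the update is universal."""
    digest = hashlib.sha256(firmware).hexdigest()
    return pin_ok and digest in PUBLIC_UPDATE_LOG

# A normal, publicly logged update is accepted once the user enters a PIN.
update = b"firmware v2.1"
publish_update(update)
assert device_accepts_update(update, pin_ok=True)

# A targeted image never published to all users is refused, even if signed
# by the vendor under compulsion -- and so is any update without the PIN.
targeted = b"firmware v2.1-unlock-one-device"
assert not device_accepts_update(targeted, pin_ok=True)
assert not device_accepts_update(update, pin_ok=False)
```

Under this policy, a court order compelling a targeted firmware build fails the universality check by construction, which is the sense in which the submission calls such orders “moot”.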
The two academics indicated that providers might not even need to create new systemic weaknesses in their products, but could simply exploit existing ones.
“One popular suggestion is to add a surreptitious participant to an end-to-end encrypted group communication,” they said.
“Many end-to-end encrypted services such as Skype or Zoom allow groups to communicate together; the security of this process relies heavily on a non-cryptographic user interface that shows participants who has joined in their chat.
“The software could easily be tweaked to suppress some participants, so that the members of the group didn’t even know that their encrypted communications were also being sent to another party.”
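The risk the academics describe can be illustrated with a toy model: where group membership is enforced only by what the interface displays, a hidden entry on the fan-out list still receives a copy of every message. This is a minimal sketch, not any real messenger’s protocol; `encrypt_for` is a placeholder for genuine per-recipient encryption, and all names are invented for illustration.

```python
def encrypt_for(member: str, message: str) -> str:
    # Placeholder for real per-recipient encryption.
    return f"enc[{member}]:{message}"

class GroupChat:
    def __init__(self, members, hidden=()):
        self.members = list(members)   # the full fan-out list
        self.hidden = set(hidden)      # suppressed from the display only

    def visible_roster(self):
        """What participants' UI shows -- the only 'membership check'."""
        return [m for m in self.members if m not in self.hidden]

    def send(self, message):
        """Each message is encrypted for *every* member, hidden or not."""
        return {m: encrypt_for(m, message) for m in self.members}

chat = GroupChat(["alice", "bob", "agency"], hidden=["agency"])
assert chat.visible_roster() == ["alice", "bob"]   # users see only each other
assert "agency" in chat.send("meet at 6")          # hidden party gets a copy
```

The cryptography here is never broken: the tweak is entirely in which participants the software chooses to display, which is exactly why the submission stresses that group security “relies heavily on a non-cryptographic user interface”.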
Technical review and informed choice
Drs Teague and Culnane also argued that “open, independent review” of methods used to break security should be allowed.
They are firmly against parts of the proposed bill that would criminalise the disclosure of weaknesses used to comply with law enforcement requests; the bill tries to make the nature of these weaknesses a matter for only law enforcement and technology providers, leaving regular users in the dark.
“Any proposal for exceptional access should mandate the release of enough public detail about technical mechanisms being required to allow independent analysis and user choice based on as accurate as possible an understanding of the consequences for the security of ordinary users,” Drs Teague and Culnane said.
They said it was not enough to trust technology companies not to do things that endangered the interests of all users.
“Ordinary users should have the opportunity to walk away based on their understanding of their risks, even if the corporation consents to the risks they are being asked to put their users’ data to,” Drs Teague and Culnane said.
“Public awareness of the extent or usage of surveillance tools is critical to allowing ordinary consumers to make appropriate risk-management decisions about the trust they place in technology.”