Google Australia alarmed by breadth of online safety laws, govt haste


Wanted bill refined before being put to parliament.

Google Australia is concerned the breadth of proposed online safety laws could snare unsuspecting cloud infrastructure services and see companies over-remove content in a bid to comply.

The internet giant made the comments in a submission to the exposure draft of the Online Safety Bill before the bill was introduced to federal parliament last week, just 10 days after an earlier consultation process finished.

If passed, the bill will establish a cyber abuse takedown scheme for adults, requiring social media services, designated internet services and hosting services to remove material within 24 hours.

It will also give the eSafety Commissioner the power to require ISPs to block access to domain names, URLs or IP addresses containing abhorrent violent material, formalising an existing ad-hoc scheme.

Google Australia used another submission [pdf] - this time to the senate committee reviewing the bill - to highlight its alarm at the rate at which the legislation is moving through parliament.

“This bill was introduced into the house of representatives a mere 10 days after the public consultation period on the exposure draft of the bill closed,” head of government affairs Samantha Yorke said.

Yorke said that Google Australia had previously made “several constructive suggestions for amendments (designed to enhance the law)” in its submission to the exposure draft, which – like more than 370 other submissions – is yet to be published by the government.

In its original submission, Google Australia raised concerns with the scope of the bill, which it said should be limited to content sharing services like social media and video sharing services.

“Any expansion to the scope of services subject to both the cyber bullying and cyber abuse schemes should be carefully limited and tailored, recognising relevant differences between services,” it said.

“Rules that make sense for social networks, for instance, do not necessarily make sense for other types of platforms or services.”

Google Australia said that, in its current form, the scheme appeared to extend to “cloud-based infrastructure platforms that third-party businesses use to provide services to their clients”.

But as cloud providers “typically do not have visibility into customers’ content”, it would be “challenging if not impossible for Google Cloud’s business” to comply with certain obligations.

“Even if something was flagged by an external observer, it is often impossible for a cloud provider to remove individual pieces of content,” the submission states.

“Therefore, a request from the eSafety Commissioner to remove one single piece of content could result in a [cloud] provider being mandated to remove a customer’s entire website.”

The company said “similar challenges” could arise for its app distribution platform Google Play, which it said “does not have the ability to remove individual pieces of content from within an app”.

Google Australia also wants to understand the “perceived need” to reduce the turnaround time for removing content from the current 48 hours to 24 hours when compliance is not in question.

“The eSafety Commissioner has made repeated reference to the fact that most platforms remove content upon receiving a request from her office very promptly,” its submission states.

The company noted that “some takedown requests can be complex and necessarily take time to assess thoroughly”, while others may not provide sufficient information, at least initially.

“Specifically, an exact turnaround time, regardless of complexity of case, provides an incentive for companies to over-remove, thereby silencing political speech and user expression,” Google said.

“In addition, quick and prescriptive turnaround times and unexpected spikes in volume place a significant pressure on content reviewers or moderators (who are already looking at difficult content) to make quick decisions about content that in some cases are incredibly nuanced and complex.”

Google Australia has suggested that a “more workable standard would be one that instructed online platforms to remove content ‘with all due speed’, ‘without undue delay’, or ‘expeditiously’ upon receipt of a clear and specific notice”.
