Australia could be a leader in safe metaverse: Standards Australia

Findings from a new whitepaper.

Australia has the potential to be a global leader in the safe deployment of the metaverse, paving the way for innovation, shaping industries and setting a benchmark for the world in this space, a new report has found.

A Standards Australia whitepaper noted that for Australia to meet its potential and prevent targeted influence and manipulation in the metaverse, it must build upon existing work in the areas of online safety and safety by design.

It recommended that standards be developed in the following areas: the right to experiential authenticity, the right to emotional privacy, the right to behavioural privacy, and the right to human agency.

Standards Australia has launched its Metaverse and Standards whitepaper, which was produced in collaboration with the Responsible Metaverse Alliance (RMA).

The whitepaper, developed with support from the federal government, explores the opportunities and risks stemming from the metaverse, and its potential impact on society and the economy. 

The report highlighted that there is an opportunity for Australia to be a global leader in the safe deployment of the metaverse.

Kareen Riley-Takos, general manager of operations at Standards Australia, said there is a need for standards in the metaverse.

“The implementation of standards is a crucial step towards the safe deployment of the metaverse for all groups, especially young people under 16 years old, who make up two-thirds of the users of the metaverse,” she said.

“With Australia having one of the highest online penetration rates in the world - at 91 percent in 2022 - it is critical that the Australian government and Standards Australia execute standards and regulations to keep Australians safe.” 

Dr Catriona Wallace, founder of the Responsible Metaverse Alliance, explained the dark side of the metaverse and why Australia needs to get ahead of these issues.

She said: “Generative AI used to operate avatars in the ‘dark metaverse’, also known as the ‘darkverse’, has been used to groom and blackmail children, exposing the serious dangers of an unpoliced and unregulated metaverse.

"Immediate regulation and monitoring are necessary to prevent such crimes."

Wallace said dangers like these, particularly those that put children at risk, amplify the need for standards and guidelines to safeguard vulnerable groups.

"There are currently no specific regulations in place for the metaverse aside from existing data protection and privacy laws, intellectual property laws and criminal laws," she ended. 
