The meltdown at the Fukushima nuclear power plant in Japan and the crash of Air France Flight 447 were examples of technological "Frankenstein" monsters created by chaotic computer systems, a US security expert told a Gold Coast computer conference today.
Verizon vice president of national security policy Marcus Sachs warned that society was still dependent on 19th and early 20th century processes when the complexity of modern computer systems demanded fresh thinking to avoid such tragedies recurring.
He pointed to high-profile "accidents" that could be ascribed to complex systems spinning out of the control of their human handlers.
"Have we got this creature out there roving around, we call it the internet, we call it the air traffic control system, we can call it the financial banking network, but is it truly a Frankenstein?" Sachs said.
"It not only has muscles and fingers [that] can do mechanical things, it also has an artificial brain and it is out of its cage and it is out of control.
"Have we released a monster or is there some way to regain control?"
Sachs pointed to emergence as the root cause of many recent disasters, a list that also included the Dow stock market crash, a Russian hydroelectric dam malfunction and possibly Stuxnet.
Emergence theory attempts to explain how complex behaviours arise from simple rules, such as how fish school, birds flock or pedestrians cluster to avoid a hole in the footpath. The simple actions of individuals, multiplied across a crowd, produce patterns that may have unintended, unmanageable and possibly deadly consequences, he said.
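The fish-schooling example can be sketched in a few lines of code. In this illustrative toy model (not from Sachs's talk; all names and parameters are invented for the sketch), each agent follows one simple local rule, drifting toward the average position of nearby neighbours, yet the group as a whole converges into a tight school that no individual rule describes.

```python
import random

def step(positions, radius=2.0, rate=0.2):
    """Apply one simple local rule: each agent drifts a fraction of the
    way toward the mean position of neighbours within `radius`."""
    new_positions = []
    for i, x in enumerate(positions):
        neighbours = [y for j, y in enumerate(positions)
                      if j != i and abs(y - x) <= radius]
        if neighbours:
            target = sum(neighbours) / len(neighbours)
            x += rate * (target - x)
        new_positions.append(x)
    return new_positions

random.seed(1)
# Twenty "fish" scattered along a line.
school = [random.uniform(0, 10) for _ in range(20)]
spread_before = max(school) - min(school)
for _ in range(200):
    school = step(school)
spread_after = max(school) - min(school)
print(f"spread before: {spread_before:.2f}, spread after: {spread_after:.2f}")
```

No agent is told to "form a school", yet the spread of the group shrinks dramatically; the schooling is an emergent property of the local rule. Sachs's point is that the same mechanism, running across power grids or financial networks, can just as easily produce harmful patterns nobody designed.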
"We can see this emergence happening everywhere; can we understand it and control it and make it work for us?" Sachs said.
He said the West needed to "re-learn management" if it was to cope with the networked systems it had created: "We need to change our linear thinking. We need to become chaotic ourselves."
"With large, chaotic systems we will have to take a different view to secure them, to make them safe.
"Your organisation is probably designed to be very linear, very predictable. You have hierarchies, bosses and subordinates, but what you're trying to manage is completely different. It's chaotic, it's ad hoc, it's changing, but our organisations are still built as they were in the industrial age.
"How do we control Frankenstein if our own organisations aren't even built that way? We're still stuck in this era of large mechanical devices rather than this era of large artificial intelligences."
Sachs said industrial-era controls or "simple levers" were insufficient to cope with modern systems such as the power grid that had computer network overlays to control their operations. And he held out the prospect that the unscrupulous may be faster to adapt to changes than civil society or government.
"Are the criminals of the underworld retooling themselves and going after emergent behaviour? I don't know if that type of thinking is there quite yet but that's also emerging."
But for all the care that systems administrators and policy wonks may apply to handling complex systems, it was impossible to reach zero risk, Sachs said.