Unresolved privacy issues may be limiting technological and scientific progress, according to a prominent U.S. computer scientist.
Tom M. Mitchell of Carnegie Mellon University believes the rise of machine-learning algorithms during the past decade has created compelling potential for mining and analysing real-time data.
Mitchell heads Carnegie Mellon's Machine Learning Department, where he is researching uses for real-time location data from smart phones.
In a commentary published in the journal Science this week, he noted that an individual's smart phone could provide information on personal activities, conversations and movements.
Such information could be mined and analysed for the purpose of reducing traffic congestion and limiting the spread of disease, he wrote.
"If your phone company and local medical centre integrated GPS phone data with up-to-the-minute medical records, they could provide a new kind of medical service," Mitchell explained.
"If phone GPS data indicate that you have recently been near a person now diagnosed with a contagious disease, they could automatically phone to warn you."
But Australian privacy expert Roger Clarke is sceptical.
While the technology may be useful, Clarke said any benefits have to be weighed against costs that include: intrusions into people's lives by warnings; unnecessary trauma caused by false positives; and the uncertainty that surrounds the epidemiology of infectious diseases.
Clarke is a visiting professor at the University of NSW and the Australian National University, and was awarded the 2009 Privacy Medal by the Australian Privacy Commissioner.
"As far as the Federal public sector is concerned, Australia has weak law, which has been progressively undermined by bureaucrats squeezing extra exceptions through," Clarke told iTnews.
"As far as the private sector is concerned, Australian law is atrocious," he said.
Mitchell suggests that the right application of technology might actually limit threats to privacy and misuse of data, rather than enable privacy abuses.
One approach could be for each organisation to analyse its own data set, and share the results - but not the raw data - with other organisations performing other parts of the research, he said.
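The approach Mitchell outlines can be illustrated with a minimal sketch. All names and data here are hypothetical, and the statistic computed (a simple exposure rate) stands in for whatever analysis the organisations actually perform; the point is only that each party computes aggregates locally and shares those, never its raw records.

```python
# Hypothetical sketch: each organisation summarises its own records,
# and only the aggregates - not the raw rows - cross organisational lines.

def local_summary(records):
    """Compute an aggregate (record count and flagged count) locally."""
    flagged = sum(1 for r in records if r["exposed"])
    return {"n": len(records), "flagged": flagged}

def combine(summaries):
    """Merge per-organisation aggregates into an overall exposure rate."""
    n = sum(s["n"] for s in summaries)
    flagged = sum(s["flagged"] for s in summaries)
    return flagged / n if n else 0.0

# Each party holds its own (invented) data set; only summaries are shared.
phone_company = [{"exposed": True}, {"exposed": False}, {"exposed": False}]
medical_centre = [{"exposed": True}, {"exposed": True}]

shared = [local_summary(phone_company), local_summary(medical_centre)]
print(round(combine(shared), 2))  # combined rate from aggregates alone
```

In this toy setup neither party ever sees the other's individual records, which is the property Mitchell argues could limit misuse of the underlying data.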