‘Big Data’ is emerging as one of the Big Issues of the knowledge economy.
These immense datasets have enormous potential to deliver economic gain, offer individuals free, customised services, drive innovation and much more.
But there is a catch or two: Can we gain from the enormous economic benefits of Big Data while maintaining privacy? And is it time for an ethical approach to search and personalisation?
Last December, I was privileged to attend an OECD roundtable on ‘The Economics of Personal Data and Privacy’, where we discussed how personal data is being used, and the economic and social values generated by these uses.
A report on the Roundtable is available and should be read widely.
Arguments put forward in favour of Big Data were impressive; the most famous is the oft-quoted story of Google Flu Trends, which studies search terms to identify flu outbreaks even before health authorities can spot them.
While there is the need to protect individuals’ privacy, OECD panellists identified a need to enable companies to share information to add context and value to data.
GigaOM writer Derrick Harris raised similar concerns in a blog post, titled ‘Will a Crackdown on Privacy Kill Big Data Innovation’, on 16 May.
Harris quotes ‘Big Data: The next frontier for innovation, competition, and productivity’ by the McKinsey Global Institute, which identified six issues facing policymakers:
- Build human capital for big data.
- Align incentives to promote data sharing for the greater good.
- Develop policies that balance the interests of companies wanting to create value from data and citizens wanting to protect their privacy and security.
- Establish effective intellectual property frameworks to ensure innovation.
- Address technology barriers and accelerate R&D in targeted areas.
- Ensure investments in underlying information and communication technology infrastructure.
Of these, he identifies the third as the critical issue. I would agree.
Customisation, or subtle censorship?
At the TED conference this year, Eli Pariser spoke about how tailored experiences on the internet are leading to censorship and filtering so subtle that most of us don't even notice it.
Importantly, Pariser goes on to show that this is not a new phenomenon; it is just an expression of previous behaviour of media barons, journalists and editors, in a new medium.
Pariser notes that traditional media censorship was a human process for which a human solution was developed: an ethical framework. We may argue about its efficacy, but at least it has made things better.
So, he asks: When will we develop an ethical framework for the algorithms that deliver us our custom-made services based on Big Data?
Pariser proposes that algorithms in search, social networks and elsewhere should ensure that we are also shown other points of view, not just results filtered to match our existing preferences.
This is an important contribution. It suggests that there is more to the appropriate and inappropriate use of Big Data than privacy and the right to be let alone, made popular by Warren & Brandeis in 1890 in The Right to Privacy.
There is also a huge risk that we won't even know what we don't know, as we live our lives in a Filter Bubble.
Pariser was given a standing ovation for his talk. Deservedly so.
Can we gain from the enormous economic benefits of Big Data while maintaining privacy and without falling into its pitfalls?
Is it time for an ethical approach to the algorithms behind search and personalisation?
Malcolm Crompton is the managing director of Information Integrity Solutions (IIS), and a former Privacy Commissioner of Australia. His post, ‘Big Data: Our Future or Censor?’, first appeared on OpenForum.com.au.