We are generating a huge amount of data from sensors, credit card transactions, and communications, and the rush to exploit this valuable new resource – celebrated as the "Oil of the 21st Century" – is just starting.

The more data is generated about us, the better companies will be able to know us – better than our friends and families, and better than any secret service in the world could ever hope to.
Today, our individual motivations are analysed in order to manipulate our behavior. For example, we are now influenced by personalised search and advertising, or by the recommendations and decisions of our Facebook friends.
Because users can't see how underlying data is processed, it is hard to know how much we are being manipulated by web services and social media. But given the great economic potential, it is pretty clear that manipulation is happening.
Soon, Google will drive our cars, drones will eliminate people classified as dangerous, and computers will increasingly decide how much we have to pay for financial products based on our behavioural data as well as that of our friends, neighbours and colleagues.
Many people will be discriminated against in unjustified ways due to obscure "machine learning" algorithms, which are neither sufficiently transparent nor legally regulated in terms of their quality standards.
Designing for social diversity
Why should a company decide what is good for us? Why can we not choose the recommendation algorithms ourselves? Why do we not get access to the data?
If we only adjusted to what others expect from us, many new ideas would never originate or spread. Social diversity would decrease, and with it our society's ability to adapt.
Innovation, an engine of the economy, requires the protection of minorities and new ideas. Social diversity also promotes happiness, social well-being, and the ability of our society to recover from shocks – its "resilience".
Strongly variable, highly complex systems – such as societies – cannot be properly managed by planning, optimisation and top-down control.
Instead, societal decision making and economic production processes should be run in a much more complex, participatory way, much like the decentralised self-organisation principles that drive the economy and organisation of the internet.
One day, advanced collaboration platforms will allow anyone to set up projects with others to create their own products, for example with 3D printers. Classical companies and political parties may then increasingly be replaced as institutions by project-based initiatives.
But to ensure that this participatory market society will work well and create jobs on a large scale, we need to make the right decisions and develop a much better understanding of our techno-socio-economic-ecological systems and their interdependencies.
If a system is designed in the wrong way, then it will get out of control sooner or later, even if all actors involved are highly trained, well equipped and highly motivated to do the right things.
The Information Age has fueled the dream that God-like omniscience and omnipotence can be created by man, but our society and legal systems are not well prepared for this.
Appropriate institutions and rules for our highly networked world must be found; if we do not pay sufficient attention to these developments, we will suffer the fate of a car driving too fast on a foggy day.
Dirk Helbing is a Professor of Sociology at ETH Zurich. He is coordinating the FuturICT initiative, which focuses on the understanding of techno-socio-economic systems using Big Data. See the original version of his essay, ‘Google as God’, here.