Financial services organisations need to move more quickly when it comes to the practical implementation of big data analytics.
Many of the larger banks and insurance companies are simply taking too long to make this happen, and there are no excuses: as an industry, we have been talking about it for a long time.
Although many institutions may be working on big data initiatives, customers are not seeing much come out of it.
Banks and insurance companies have access to vast amounts of data that they have accumulated through customer interactions and transactions.
Banks have the benefit of knowing hard data about their customers' earnings, transaction histories and much more. The best part is that this information is gleaned from authenticated customers, unlike some of the big retailers who capture a lot of data but can’t always authenticate the person making the transaction.
The insurance industry should be the leaders in the use of ‘new world analytics’. For hundreds of years insurers, through their actuaries, have been analysing data in relation to risk assessment and prevention.
Examples include analytics to predict when we are likely to die and how many home break-ins we might experience during our lifetime, with the predicted risks then factored into the premiums customers pay.
My educated guess is that while some progressive insurers are starting to see the opportunities big data presents, many need to move faster to embrace this new world of predictive analytics.
So what exactly is holding the financial services industry back from progressing?
I think the problems for the industry are multi-fold:
- There is a view in some banks and insurance companies that you should perfect the strategy before investing in big data projects. The fact is that we are all still learning. True, some are just learning more quickly than others, but the trick is to be adaptive and to start, then regularly test and review your approach.
- I suspect some organisations feel it would be best to have their core platforms in place first. While I understand this would help reduce the complexity and inconsistency of data, as well as the cost of extracting and analysing data from multiple sources, I don't feel waiting is a practical solution. Regardless of the state of core platforms, organisations will have multiple sources of data for a while yet.
- There are just too many options out there for analytics platforms and solutions. Almost every vendor claims full capability and differentiation from its competitors. All of this is very confusing for IT decision makers. I am not personally convinced that any vendor has nailed this completely. All the solutions out there have good and bad elements and, in my personal experience, there is no single technology to date that can cope with all the characteristics of big data, certainly not all at once.
- There could be some concern about what to do with all this new information and insight on customer behaviour. There is certainly a fine line between using data to genuinely help customers versus going too far and encroaching upon their privacy. While this is not a problem specific to the financial services industry, it is even more important to banks and insurance companies because of the inherent element of trust that is a part of their value proposition.
- Lastly, some organisations and vendors are focusing too much time and effort on analytics around unstructured data. Organisations are interrogating the chatter on Twitter, Facebook and LinkedIn hoping this will provide a key differentiator against competitors. Banks and insurance companies need to get the basics right first before embarking on analysing unstructured data. Getting volume and velocity right with structured data would be a huge leap forward from an end customer’s perspective.
So what should organisations do?
Firstly, organisations need to move more quickly and build big data capability.
Progressive organisations will already be working on their SMAC (social, mobile, analytics and cloud) convergence strategies, where analytics is a key component.
Organisations need to create a combined business and IT team to tackle the problem. When considering vendor solutions, don’t wait for the perfect “silver bullet” IT platform. Pick a vendor you trust and a platform that provides the majority of what you need, and then start to build out the capability.
To reduce complexity in your IT environment, consider a platform or vendor you already have within the business. At the same time, focus on the capability you need to build; if that means selecting a couple of IT platforms as you experiment, that may be OK for now.
Of course, with the concept of SMAC convergence in mind, you should consider the opportunity to consume analytics as a service through the cloud rather than building a high-cost internal solution. Most of the bigger providers will offer this as an option.
While this may not be your preference because of the sensitivity of customer data, my view is that this will soon be the norm, so I suggest planning for it at some stage.
Lastly, make sure you put in place the right governance process across all your big data analytics activities. While you need to be agile, the last thing you want is to create complexity that will cause more issues at a later stage.
Jeff Jacobs is a Sydney-based technology executive and IT consultant with over 25 years’ experience in senior IT positions with AMP, Zurich, CBA and, most recently, as the CTO of Westpac.