When asked late last year where their organizations would be focused this year, Chief Information Officers (CIOs) told Gartner they wanted to spur growth through digital initiatives (22%), revenue/business growth (22%), operational excellence (13%), customer experience (8%), and cost optimization (8%).
As you think about technology and digital efforts at your bank, one area that often surfaces is the value and promise of big data. After all, banks have tons of data, so why not figure out how to use it better and faster to drive growth?
In theory, big data can be tapped to give banks' customers exactly what they need at the time they need it. The question is whether we are there yet. In short, we are not, but we are making progress.
Consider, for instance, a survey by ZS in conjunction with The Economist Intelligence Unit. It polled 450 US-based senior executives and found that while 40% have put cloud-based big data infrastructure in place to analyze both structured and unstructured data from social media and other sources, only 8% have actually been able to integrate that infrastructure with their analytics capability. Moreover, a mere 2% feel that their predictive analytics capabilities, particularly for sales and marketing purposes, have had a "broad, positive impact."
Digging a bit deeper into why this is the case, we see the primary reason is that the information needed to make the best predictive decisions still rests predominantly within siloed databases. Many of these databases also sit in legacy systems, which bogs things down even more.
Community banks, like all banks, are slowly making progress in collecting data from their myriad systems. Once collected and cleaned, the data can be better analyzed and relied upon. Of course, this comes with many challenges.
Work done by MasterCard and the Harvard Business Review examined the obstacles executives said were preventing them from making the best use of their data. The issues included: a lack of integration (53%), staff lacking skills (45%), siloed analytics (41%), ineffective deployment (39%) and data coming in too late to make decisions (31%). These and other obstacles are typical at community banks, and indeed at pretty much all banks at this point, so constant work is required to improve.
For banks working to do so but saddled with legacy systems, the effort to put data to work often begins with an application programming interface (API). APIs are like cords that plug one system into another to move data back and forth as it is analyzed, updated or otherwise modified.
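To make the "cord" analogy concrete, here is a minimal sketch of the kind of translation an API integration does. The field names, system name and payload are invented for illustration and do not reflect any real core-banking vendor's API; the point is simply that one side emits data in its own (often legacy) format and the other side reshapes it for analysis.

```python
import json

def parse_balance_response(raw_json: str) -> dict:
    """Translate a hypothetical core-system JSON payload into the
    record shape an analytics platform might expect."""
    payload = json.loads(raw_json)
    return {
        "customer_id": payload["custId"],    # legacy system's field name
        # Store money as integer cents to avoid floating-point drift
        "balance_cents": int(round(float(payload["bal"]) * 100)),
        "as_of": payload["asOfDate"],
    }

# A made-up response, as a legacy API might return it over HTTP:
sample = '{"custId": "C-1001", "bal": "2543.17", "asOfDate": "2024-05-31"}'
record = parse_balance_response(sample)
```

In a real integration the raw JSON would come back from an HTTP call to the core system rather than a hard-coded string, but the reshaping step looks much the same.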
Of course, all community bankers know that getting and controlling decent pools of data is very challenging. There are many different "owners" of the databases already in use at the bank, and all must agree to communicate their respective data sources in the same "language" to enable true integration. This is often an ugly process, but once it is nailed down, you can get a decent, fairly clean flow of data going.
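What "speaking the same language" means in practice is agreeing on one canonical record shape and mapping each silo into it. The sketch below is purely illustrative, with invented field names standing in for, say, a loan system and a deposit system that describe the same customer differently.

```python
# Two hypothetical silos describe the same customer in different "languages".
# Each mapping function translates its source's fields into one agreed,
# canonical schema: customer_id, name, zip.

def from_loans_system(row: dict) -> dict:
    return {
        "customer_id": row["borrower_no"],
        "name": row["borrower_nm"].title(),  # normalize ALL-CAPS names
        "zip": row["zip5"],
    }

def from_deposits_system(row: dict) -> dict:
    return {
        "customer_id": row["cust_key"],
        "name": row["full_name"].title(),
        "zip": row["postal"][:5],            # trim ZIP+4 to five digits
    }

loans_row = {"borrower_no": "C-1001", "borrower_nm": "JANE DOE", "zip5": "94105"}
deposits_row = {"cust_key": "C-1001", "full_name": "jane doe", "postal": "94105-2211"}

# Once both silos emit the canonical shape, records can be matched reliably.
unified = [from_loans_system(loans_row), from_deposits_system(deposits_row)]
```

The hard part at a real bank is not the code but getting each database's owner to agree on the canonical fields in the first place; once they do, mappings like these are straightforward to maintain.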
All things data sure seem to take a long time and they need consistent nurturing along the way. Don't get discouraged though. We will eventually get there.