
Big Data for Banking: Depends on a Scalable, Extensible Information Foundation

July 26, 2013

The promise of achieving significant, measurable business value from big data can only be realized if organizations put into place an information foundation that supports the rapidly growing volume, variety and velocity of data.

As part of the recently published global research study, “Analytics: the real world use of big data in financial services,” my colleagues David Turner, Michael Schroeck and Rebecca Shockley asked respondents with active big data projects to describe the state of their big data infrastructures. Only slightly more than half of banking and financial markets companies reported having integrated information, although 87 percent said they have the infrastructure required to manage this growing volume of data (see figure below).

[Figure: State of big data infrastructure reported by banking and financial markets respondents]

The inability to connect data across organizational and departmental silos has been a business intelligence challenge for years, especially in banks, where mergers and acquisitions have created countless, costly silos of data. This integration is even more important, yet far more complex, with big data. About a third of bankers reported Hadoop and stream computing pilots underway, and market activity suggests the pace continues to pick up. Where bankers lag, such as in the use of NoSQL engines and analytic accelerators, the gap reflects the strong skills already in place from the industry’s long history with business intelligence (e.g. SQL programmers) and quantitative modeling.
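To make the kind of workload behind those pilots concrete, here is a minimal, purely illustrative Python sketch of the batch aggregation a Hadoop-style pilot might target: a map step that emits key/value pairs from raw trade records and a reduce step that sums them by symbol. The record format and field names are assumptions for illustration, not details taken from the study.

```python
# Illustrative only: a toy map/reduce-style aggregation of trade volume by
# symbol, sketching the kind of batch workload a Hadoop pilot might target.
# The record format and field names are assumptions, not from the study.
from collections import defaultdict

trades = [
    {"symbol": "ABC", "shares": 500},
    {"symbol": "XYZ", "shares": 1200},
    {"symbol": "ABC", "shares": 300},
]

# "Map" phase: emit (symbol, shares) pairs from each raw trade record.
pairs = [(t["symbol"], t["shares"]) for t in trades]

# "Reduce" phase: sum the shares for each symbol.
volume_by_symbol = defaultdict(int)
for symbol, shares in pairs:
    volume_by_symbol[symbol] += shares

print(dict(volume_by_symbol))  # {'ABC': 800, 'XYZ': 1200}
```

In a real pilot the map and reduce steps would run in parallel across a cluster over billions of records; the point here is only the shape of the computation.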

In other key big data infrastructure components, such as high-capacity warehouses, columnar databases, security, governance and optimization engines, banking and financial markets companies are mostly on par with their cross-industry peers.

NYSE Euronext, a prominent global stock exchange company, employed big data analytics to detect new patterns of illegal trading. It implemented a new markets surveillance platform that both sped up and simplified the processes by which its experts analyzed patterns within billions of trades. (Read the NYSE Euronext case study.)

“Everything we do is about analyzing information and looking for a ‘needle in a haystack’,” says Emile Werr, head of Enterprise Data Architecture and vice president of Global Data Services for NYSE Euronext. “We currently process approximately two terabytes of data daily, and, by 2015, we expect to exceed 10 petabytes a day. So we must select the appropriate technologies to analyze these huge volumes in near real time.”[1]

NYSE Euronext reports the new infrastructure has reduced the time required to run markets surveillance algorithms by more than 99 percent and decreased the number of IT resources required to support the solution by more than 35 percent, all while improving the ability of compliance personnel to detect suspicious patterns of trading activity and to take early investigative action, thus reducing damage to the investing public.[2]
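As a rough illustration of the “needle in a haystack” idea, the sketch below scans a stream of trade records and flags trades that are unusually large relative to an account’s own history. It is a toy stand-in, not NYSE Euronext’s actual surveillance logic; the threshold, field names and record layout are all assumptions made for the example.

```python
# Illustrative only: a toy "needle in a haystack" check over a trade stream.
# It flags trades far above an account's running average trade size, a
# deliberately simplified stand-in for real surveillance algorithms.
from collections import defaultdict

def surveil(trade_stream, threshold=5.0):
    """Yield trades whose size exceeds `threshold` times the account's
    running average trade size seen so far."""
    totals = defaultdict(float)   # running sum of trade sizes per account
    counts = defaultdict(int)     # number of trades seen per account
    for trade in trade_stream:
        account, size = trade["account"], trade["shares"]
        if counts[account] and size > threshold * (totals[account] / counts[account]):
            yield trade           # suspiciously large relative to history
        totals[account] += size
        counts[account] += 1

stream = [
    {"account": "A1", "shares": 100},
    {"account": "A1", "shares": 120},
    {"account": "A1", "shares": 5000},   # flagged: far above A1's average
    {"account": "B7", "shares": 80},
]
for alert in surveil(stream):
    print("review:", alert)
```

Production surveillance works on far richer signals and at far larger scale, but the pattern is the same: maintain lightweight state per entity and evaluate each new event against it in near real time rather than re-scanning history.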

In our next blog, we’ll look at how most early big data initiatives focus on internal data.

To learn more

Read the research report “Analytics: the real world use of big data in financial services”

Part 1 of Bob's series: Looking at New Research on Big Data in Financial Services

See more blog posts, videos, podcasts and reports on banking



[1] IBM Software: Smarter Commerce. “OCBC Bank nets profits with interactive, one-to-one marketing and service.” July 2012.

[2] Ibid.