The evolution of complex event processing

March 27, 2014

A shift from queries to real-time actionable insight

Complex event processing: Roots and origins

The need for speedy action and a timely response is of paramount importance across all industries, but the financial industry is in a league of its own. The stakes are high for traders on Wall Street, where fortunes can be lost in milliseconds.

To keep up in a highly competitive industry, financial firms needed to expand their data processing capabilities beyond the traditional database approach, in which questions are asked, schemas are designed, databases are populated and, finally, reports are delivered. Thus, complex event processing (CEP) emerged. Complex event processing is a general category of technology designed to analyze streams of data flowing from live sources to identify patterns and significant business indicators.
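
To make the idea concrete, here is a minimal sketch, in Python, of the kind of pattern a CEP engine might watch for: an alert when a symbol's price rises on several consecutive ticks. The tick fields, the threshold and the rule itself are illustrative assumptions, not tied to any particular CEP product.

```python
from collections import deque, defaultdict

CONSECUTIVE_RISES = 3  # hypothetical rule: alert after 3 rising ticks in a row

def detect_rising_streaks(ticks):
    """Yield an alert whenever a symbol's price rises on N consecutive ticks."""
    history = defaultdict(lambda: deque(maxlen=CONSECUTIVE_RISES + 1))
    for tick in ticks:  # each tick is a dict, e.g. {"symbol": "ABC", "price": 10.2}
        prices = history[tick["symbol"]]
        prices.append(tick["price"])
        if len(prices) == prices.maxlen and all(
            later > earlier for earlier, later in zip(prices, list(prices)[1:])
        ):
            yield {"symbol": tick["symbol"], "pattern": "rising_streak",
                   "last_price": tick["price"]}

# Example: a small in-memory "stream" of ticks
ticks = [
    {"symbol": "ABC", "price": 10.0},
    {"symbol": "ABC", "price": 10.1},
    {"symbol": "ABC", "price": 10.2},
    {"symbol": "ABC", "price": 10.3},  # third consecutive rise -> alert
]
for alert in detect_rising_streaks(ticks):
    print(alert)
```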

Around the year 2000, CEP vendors were rapidly entering a hungry market because financial executives were quickly realizing that machines could outperform humans at recognizing profitable transactions. By 2001, 20 percent of trades were computer assisted, and by 2013, 66 percent of trades were being handled electronically, surpassing human traders!

Until recently, the growth of CEP outside of the financial industry was relatively moderate, with strong activity coming from airlines for applications such as baggage handling. However, times are changing: business models are radically shifting, and the volume, variety and velocity of data to be mined for business insight are growing exponentially across a broad spectrum of industries.

With the onset of the big data era and the demand for new systems of engagement with big data, CEP solutions must evolve. Leaders across all industries are looking for ways to extract real-time insight from their massive data resources and act at the right time. They need to update business rules dynamically as market conditions, such as client sentiment or weather patterns, change.

Big data ushers in the next market opportunity

Faced with this growing volume of constantly changing data, organizations are challenged to make informed, real-time business decisions and stay ahead of the competition. From an architecture perspective, new technologies must:

  • Handle data asynchronously, which means the architecture should allow multiple processes to operate simultaneously on a single message or trigger (a sketch follows this list)
  • Analyze data in memory for high-speed applications and scale linearly up or down on the fly, based on memory requirements
  • Shift toward co-locating data processing with the data sources
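
As a rough sketch of the first requirement, the following Python snippet fans a single incoming message out to several independent analytics and runs them concurrently. The analytic names and their logic are hypothetical placeholders, not part of any specific product.

```python
import asyncio

# Hypothetical analytics; names and logic are assumptions for illustration.
async def score_fraud_risk(message):
    await asyncio.sleep(0.01)          # stand-in for a model evaluation
    return ("fraud_risk", len(message["payload"]) % 10)

async def update_dashboard(message):
    await asyncio.sleep(0.01)          # stand-in for a metrics push
    return ("dashboard", f"updated for {message['source']}")

async def archive_event(message):
    await asyncio.sleep(0.01)          # stand-in for a write to cold storage
    return ("archive", "ok")

async def handle(message):
    # Fan the same message out to several analytics and run them concurrently,
    # rather than processing it in a single sequential pipeline.
    results = await asyncio.gather(
        score_fraud_risk(message),
        update_dashboard(message),
        archive_event(message),
    )
    return dict(results)

message = {"source": "sensor-42", "payload": "temperature=71.3F"}
print(asyncio.run(handle(message)))
```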

To address these requirements, IBM introduced a new paradigm for real-time data processing called stream computing. Rather than storing data and querying it later, stream computing pushes data through highly sophisticated analytics as it flows, delivering real-time analytic processing on constantly changing data in motion. It enables descriptive and predictive analytics to support real-time decisions, allowing you to capture and analyze all data, all the time, just in time.
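
The difference from the query-driven model can be sketched in a few lines: instead of storing records and asking questions later, each record is pushed through a chain of operators the moment it arrives. The operators below are hypothetical examples, not InfoSphere Streams code.

```python
def pipeline(*operators):
    """Compose operators into a single function that data is pushed through."""
    def push(record):
        for operator in operators:
            record = operator(record)
            if record is None:          # an operator may drop a record
                return None
        return record
    return push

# Hypothetical operators, assumed for illustration.
def drop_incomplete(reading):
    return reading if "value" in reading else None

def convert_to_celsius(reading):
    return {**reading, "value_c": (reading["value"] - 32) * 5 / 9}

def flag_anomaly(reading):
    return {**reading, "anomaly": reading["value_c"] > 40}

process = pipeline(drop_incomplete, convert_to_celsius, flag_anomaly)

# Each reading is processed the moment it "arrives"; nothing is stored for later querying.
for reading in [{"value": 120.0}, {"sensor": "s1"}, {"value": 98.6}]:
    result = process(reading)
    if result is not None:
        print(result)
```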

Imagine a business decision that combines all available information sources to render an action. That information could include details of the current event, stable information about the entities involved in the event, information about past events correlated with the current event and entity, contextual information relating the entity to the current event, and information about likely futures derived from predictive models. This complex analysis is possible with stream computing.
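
A minimal illustration of such a decision, with invented reference data, history and scoring logic standing in for the real information sources, might look like this:

```python
from collections import defaultdict

# Hypothetical reference data and model; all names and thresholds are assumptions.
CUSTOMER_PROFILES = {"c-100": {"home_country": "US", "credit_limit": 5000}}
recent_amounts = defaultdict(list)      # rolling history of past events per customer

def predicted_fraud_probability(event, profile, history):
    """Stand-in for a predictive model scored against the combined context."""
    abroad = 1.0 if event["country"] != profile["home_country"] else 0.0
    spike = 1.0 if history and event["amount"] > 3 * (sum(history) / len(history)) else 0.0
    return min(1.0, 0.5 * abroad + 0.5 * spike)

def decide(event):
    profile = CUSTOMER_PROFILES[event["customer_id"]]             # stable entity information
    history = recent_amounts[event["customer_id"]]                # correlated past events
    risk = predicted_fraud_probability(event, profile, history)   # likely futures
    recent_amounts[event["customer_id"]].append(event["amount"])
    return "REVIEW" if risk >= 0.5 else "APPROVE"

print(decide({"customer_id": "c-100", "amount": 40.0, "country": "US"}))   # APPROVE
print(decide({"customer_id": "c-100", "amount": 900.0, "country": "FR"}))  # REVIEW
```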

Use stream computing to address the following requirements:

  • Latency must be low: typically less than a few milliseconds, and sometimes less than one millisecond, between the time an event arrives and the time it is processed
  • The volume of input events per second is high: typically hundreds or a few thousand events per second, but sometimes ranging into millions of events per second
  • The event patterns to be detected are complex, such as patterns based on temporal or spatial relationships (a sketch follows this list)
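
One hypothetical example of a temporal pattern (invented here for illustration, not drawn from the post) is a "place-then-cancel" sequence, where a large order is cancelled within a few tens of milliseconds of being placed:

```python
WINDOW_MS = 50          # hypothetical temporal window
LARGE_ORDER = 10_000    # hypothetical size threshold

def detect_place_then_cancel(events):
    """Yield an alert when a large order is cancelled within WINDOW_MS of being placed."""
    pending = {}  # order_id -> timestamp of the large "place" event
    for event in events:  # events are assumed to arrive in timestamp order
        if event["type"] == "place" and event["size"] >= LARGE_ORDER:
            pending[event["order_id"]] = event["ts_ms"]
        elif event["type"] == "cancel":
            placed_at = pending.pop(event["order_id"], None)
            if placed_at is not None and event["ts_ms"] - placed_at <= WINDOW_MS:
                yield {"order_id": event["order_id"], "pattern": "place_then_cancel",
                       "gap_ms": event["ts_ms"] - placed_at}

events = [
    {"type": "place",  "order_id": "o-1", "size": 50_000, "ts_ms": 1_000},
    {"type": "cancel", "order_id": "o-1",                 "ts_ms": 1_030},  # 30 ms later -> alert
]
for alert in detect_place_then_cancel(events):
    print(alert)
```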

Analyze. Decide. Act.

InfoSphere Streams is IBM’s stream computing platform. It allows user-developed applications to quickly ingest, analyze and correlate information as it arrives from thousands of real-time sources, and it can handle very high data throughput rates, up to millions of events or messages per second.

Can’t wait to get started? Try InfoSphere Streams Quick Start Edition today. This no-charge offering is available for experimentation and has no time or data limits. If you want to learn more about the evolution from CEP to stream computing, listen to this podcast with Roger Rea, product manager of InfoSphere Streams.