
How to deliver actionable insights from data streams


Sometimes two minutes is too late. For time-sensitive processes such as thwarting fraud, mitigating security threats or responding to natural disasters, time is of the essence. Stream computing enables continuous processing of data streams and can be used to maximize the time value of data.

Plenty of organizations are seeing tangible return on investment (ROI) when using solutions to address the low-latency requirements of their big data challenges. For example:

  • Healthcare firms can realize 95 percent faster insight into patient health by accelerating execution of complex algorithms. Flagging the risk of serious medical conditions sooner saves lives, and it enables more effective targeting of patient care, optimizing healthcare resources.
  • Telecommunication companies can improve marketing campaign results by more than 70 percent by using an enterprise-wide analytics platform that has become the foundation for contextual marketing.
  • Utilities can save more than 700,000 gallons of fuel. CenterPoint Energy, for example, powers 2.3 million smart meters with IBM InfoSphere Streams and has lowered costs for consumers by $24 million.

Advances in stream computing shift the conversation from how to manage big data to how to make sense of, analyze and act on it at high velocity. The Internet of Things (IoT) will drive much of this worldwide shift toward low-latency streaming analytics. Underscoring how rapidly the IoT is reaching widespread adoption, Gartner has predicted that by 2020 there will be 25 billion connected IoT endpoints in use worldwide. Stream computing can help make sense of IoT and other fast data in this new cloud-centric world.

Stream computing for industries

Stream computing is unique in many ways. It is a processing paradigm that brings the analytics to the data, rather than storing the data first. In-memory processing helps speed up response times.

Most stream computing platforms include two core components: an application development environment for building applications that ingest and process data streams, and a runtime designed to process those streams with low latency at massive scale, seamlessly across infrastructure. Data engineers and developers use these tools to build streaming analytics applications that often become a competitive differentiator. Why? Because when companies can analyze all their available data, rather than a subset of it, they gain a powerful advantage. Organizations already using stream computing to address specific business problems have seen rapid ROI.
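The ingest-and-process pattern described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration of windowed in-memory processing, not IBM's SPL API; the `sensor_stream` source, the readings and the alert threshold are all invented for the example:

```python
from collections import deque

def sensor_stream():
    """Hypothetical unbounded source; stands in for a real ingest adapter."""
    for reading in [10.2, 10.4, 10.1, 45.0, 10.3, 10.2]:
        yield reading

def sliding_average(stream, window_size=3):
    """Process each tuple as it arrives, keeping only a small in-memory
    window rather than storing the full stream first."""
    window = deque(maxlen=window_size)
    for value in stream:
        window.append(value)
        yield value, sum(window) / len(window)

# Flag readings that deviate sharply from the recent average.
alerts = [v for v, avg in sliding_average(sensor_stream()) if v > 2 * avg]
print(alerts)  # → [45.0]
```

Because the analysis travels with the data, the spike is flagged the moment it arrives, without a round trip to disk.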

Many industries, including government, telecommunications, healthcare, energy and utilities, finance, insurance and automotive, are now discovering the potential to harvest all data, all the time. Stream computing analyzes data in motion, supporting immediate and accurate decision making across these industries.

Simplicity for IT administrators and developers

A key difference between stream computing and online analytical processing (OLAP) is that the latter requires data to be at rest before analytics can run. By contrast, IBM stream computing supports ultralow-latency analysis of data in motion; unlike OLAP, it does not need to store data on disk first.

Real-time analytics with stream computing can involve a complex environment, so ease-of-use capabilities matter: new features take developers less time to learn. Stream computing offers a development platform built on a scale-out architecture, with comprehensive tools for developing and managing the environment, along with a set of toolkits that provide high-level functionality to accelerate development. Developers can also define consistent regions so that all data is processed, helping applications avoid data loss.
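The "all data is processed" guarantee can be illustrated with a toy at-least-once replay loop. This Python sketch is a generic stand-in for what a streaming runtime's recovery regions provide, not IBM's implementation; `process_with_replay` and `flaky_double` are invented names, and the transient failure is simulated:

```python
def process_with_replay(source, handler, max_retries=3):
    """Toy at-least-once delivery: re-attempt a tuple on failure so data
    is not silently dropped, rather than moving on past an error."""
    results = []
    for item in source:
        for _attempt in range(max_retries):
            try:
                results.append(handler(item))
                break
            except Exception:
                continue  # replay the same tuple
        else:
            raise RuntimeError(f"tuple {item!r} failed after {max_retries} attempts")
    return results

calls = {}
def flaky_double(x):
    """Doubles its input, but fails once on one tuple to simulate a glitch."""
    calls[x] = calls.get(x, 0) + 1
    if calls[x] == 1 and x == 2:
        raise IOError("transient failure")
    return 2 * x

results = process_with_replay([1, 2, 3], flaky_double)
print(results)  # → [2, 4, 6]
```

The failed tuple is retried rather than lost, so the output is complete despite the mid-stream error.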

Data streams for continuous analytics

Streaming analytics toolkits improve the productivity of developers and data scientists in crafting complex analytics, such as natural language processing, voice analytics and facial recognition. For example, InfoSphere Streams comprises several toolkits covering telecommunications event data, time series, text, messaging, databases, geospatial data and more. Many of these toolkits are part of the InfoSphere Streams open source project.

Stream computing platforms process data in motion, and InfoSphere Streams provides open analytics toolkits to help jumpstart applications and reduce the complexity of data science. To learn more, check out this InfoSphere Streams data sheet, "A new paradigm for information processing."

This post was coauthored by Kimberly Madia, worldwide product marketing manager for InfoSphere Streams at IBM.