
Distinguishing Data’s Latency From Its Actionability

Does in-memory persistence remove technological barriers that prevent business outcomes at any latency?

Big Data Evangelist, IBM

Doing nothing is an action, just as sitting still or moving with careful deliberation can be powerfully mindful responses in an otherwise frenetic world. Change isn’t intrinsically valuable in all circumstances. Where data analytics enters the picture, people often wield the concept of actionability to refer to circumstances in which this resource can serve as the sufficient trigger of human-instigated change.

In general, the action to be triggered is presumed to fall into the don’t-do-nothing category, and there’s a presumption that such action belongs at the real-time, urgent end of the latency spectrum. However, this presumption isn’t always accurate. Business decisions play out over many time frames, ranging from in the moment to monthly, quarterly, yearly, and beyond. Likewise, many business processes involve maintaining a steady state of established practices, repetitive processes, and reproducible outcomes. And the need for innovative, disruptive actions doesn’t consume every minute of every business day at every level in organizations.

The actionability of most operational business intelligence (BI) data in a data warehouse consists of its affirmation that the organization is continuing to balance the books, deliver product consistent with contractual terms, and keep the customer satisfied. My LinkedIn blog post from a few years ago discusses the actionability of data in real-time decision and action scenarios.1

Technological barrier removal

As the industry inflection point approaches in the trend toward all-in-memory architectures, we need to keep the distinction between data’s actionability and its latency clear. When discussing the value of in-memory architectures, the focus needs to be on the removal of technological barriers that have historically prevented end users from achieving value at any latency, not just in real time.

In an otherwise excellent recent report on the in-memory data platform market, Wikibon’s Jeff Kelly blurred that distinction. Unfortunately, he framed his discussion with this misleading sentence: “In order to derive business value from big data, practitioners must also have means to quickly (as in sub-milliseconds) analyze data, derive actionable insights from that analysis, and execute the recommended actions.”2

Really? Sub-millisecond data and decision latencies are the essence of big data’s actionability as a strategic business resource? What about the skyrocketing popularity of batch-oriented Apache Hadoop platforms? Are Hadoop implementers not deriving ample business value from their operational big data clusters? Are the world’s data scientists utterly wasting their time if they don’t have a real-time component to their boundary-pushing projects in predictive analysis, natural-language processing, and other areas?

I’m sure Kelly doesn’t fundamentally disagree with me on these points—I know because I read his research. But his decision to characterize in-memory, real-time, low-latency analysis as the essence of big data’s actionability is just wrong. Not only that, it fosters the false impression that any big data platform lacking in-memory persistence is behind the times and on the verge of obsolescence. Furthermore—and this point is a minor scoping quibble—I take issue with his decision to place in-memory databases into a separate bucket from big data platforms. In-memory processing is a key enabler for the velocity V in the big data order of things, so it makes no sense to stick high-velocity platforms into a separate category from high-volume platforms. Doing so especially doesn’t make sense considering that many commercial platforms combine both of these features, as well as the high-variety V, in a unified architecture.

In spite of all that, I enjoyed Kelly’s overview of today’s in-memory data platform market. To his credit, he discusses IBM® DB2® with BLU Acceleration data management within the correct market context.3 However, in the course of describing BLU’s value proposition, he inadvertently contradicts his thesis regarding the paramount role of in-memory databases in the big data value story. Specifically, he discusses the DB2 with BLU Acceleration performance and cost-effectiveness advantages within hybrid big data architectures. Kelly further observes that BLU enables performance-intensive applications to tap into frequently accessed data held in memory while also facilitating access to colder, less frequently accessed data residing on solid-state drives (SSDs) or rotating storage media.
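The hybrid tiering pattern described above—hot data served from memory, colder data falling back to slower media—can be sketched in a few lines. To be clear, this is a hypothetical illustration of the general pattern, not how BLU Acceleration is actually implemented; the tier names and promotion logic are my own assumptions.

```python
class TieredStore:
    """Illustrative sketch of tiered data placement: fast in-memory tier
    backed by progressively slower tiers. Purely hypothetical; not an
    actual BLU Acceleration or DB2 mechanism."""

    def __init__(self):
        self.memory = {}  # hot tier: frequently accessed data in RAM
        self.ssd = {}     # warm tier: stand-in for SSD-resident data
        self.disk = {}    # cold tier: stand-in for rotating media

    def put(self, key, value, tier="disk"):
        # Place new data in the named tier (cold by default).
        getattr(self, tier)[key] = value

    def get(self, key):
        # Probe the fastest tier first, then fall back to slower ones.
        for tier in (self.memory, self.ssd, self.disk):
            if key in tier:
                return tier[key]
        return None

    def promote(self, key):
        # Pull frequently accessed data up into the in-memory hot tier.
        value = self.get(key)
        if value is not None:
            self.memory[key] = value
```

The point of the sketch is simply that a single store can serve both latency profiles: lookups that hit the memory tier run at RAM speed, while everything else remains reachable—just more slowly—without living in a separate platform.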

And I agree with his contextualization of BLU Acceleration and rival in-memory platforms within a market that’s inching toward a technological tipping point. The plummeting cost of RAM is undeniably causing every platform on the market to evolve inexorably toward all-in-memory architectures for increasing numbers of use cases. However, that tipping point is still a few years in the future.4

Technological tipping point endurance

And it can be risky to assume that technology push is a straightforward process with a predictable outcome. Technological tipping points don’t immediately revolutionize the enterprise architectures in which the game-changing platforms are adopted. Likewise, disruptive shifts in enterprise architectures don’t immediately transform the business culture that has taken root on older platforms. When considering all-in-memory architectures, their widespread adoption probably won’t cause companies to push real-time, decision-and-action cycles into the heart of every single business process.

Bureaucracy’s delays are not inherently a bad thing. Long decision-and-action cycles exist and can endure for many valid business reasons. They ensure that organizations follow repeatable processes, gain buy-in from important stakeholders, and don’t inadvertently rush those functions—for example, strategic planning, campaign orchestration, process checkpoints—that require sufficient time and careful deliberation. Real-time, quick-turnaround decisions aren’t necessarily the best decisions in all these circumstances.

Just because action can be taken at RAM speed doesn’t mean it should be. Catch your breath. Call a meeting. Exchange a few emails. And don’t feel compelled to check updates to the business-performance dashboard every waking moment. Resolve to do nothing at all, at least for the next few milliseconds.

Please share any thoughts or questions in the comments.

1 “Speed of Thought? Real Time Is Really Relative to Your Decision/Action Cycle,” by James Kobielus, LinkedIn blog, September 2012.
2 “In-Memory Databases Put the ‘Action’ in Actionable Insights,” by Jeff Kelly, Wikibon, December 2014.
3 DB2 for Linux, UNIX, and Windows website, Features and Benefits, BLU Acceleration, at IBM.com.
4 “Storage Optimization? The Tipping Point Where SSDs Prevail Grows Blurrier By the Day (Part 1),” by James Kobielus, LinkedIn blog, January 2015.
