What Makes In-Memory Computing So Next Generation?

IBM DB2 with BLU Acceleration intelligently and rapidly processes data at the speed of thought

Director of Offering Management, IBM Analytics, IBM

“Real time” used to be the demand, but today’s organizations want everything now. Information has become so critical to operations and key decisions that businesses want answers at the speed of thought. If they can access vital information even faster than is currently possible, they can up the ante on gaining a competitive edge, doing more, and delivering greater value than ever before. And who doesn’t want that?

For this reason, in-memory computing has become the new go-to technology for analytics. In-memory computing accelerates data analysis by putting the active data set in memory and eliminating the latency of moving data across slow-spinning disk storage. The first generation of in-memory solutions required that the entire data set fit into memory, which is a costly proposition when the data set is typically sized in the range of multiple terabytes.

Agile in-memory operations


IBM envisioned in-memory technology on a grand scale: in-memory speed without the limitations, in-memory functionality with a high degree of efficiency, and in-memory power with unparalleled ease of use. BLU Acceleration is the next generation of in-memory computing that marks the end of coffee break analytics because it's fast, simple to use, and agile. Organizations no longer have to wait for answers they need to advance the business. They can determine risk faster than past-generation solutions, identify trends ahead of their competition, and set a course for the future. (For more information, visit the IBM BLU Acceleration hub.)


Four key principles distinguish BLU Acceleration as a next-generation, in-memory computing technology.

Lightning-fast processing without requiring the entire data set in memory

BLU Acceleration is designed to process data at lightning speed even though it does not require the entire data set to be placed in memory. Instead, it uses a series of advanced algorithms that nimbly handle in-memory data processing. It can also anticipate and prefetch data before it is needed and automatically adapt to keep necessary data in or close to the processor. Additional processor-acceleration techniques enable incredibly fast, highly efficient in-memory computing, and an agile design allows an organization to leverage in-memory processing without constraints.

Cost-effective, efficient operations on compressed data

To enhance efficiency, BLU Acceleration operates cost-effectively on compressed data. Rather than wasting time and processor resources on the extra steps of decompressing data, analyzing it, and recompressing it, BLU Acceleration preserves the order of data during encoding and performs a broad range of operations, including joins and predicate evaluations, directly on the compressed data. This next-generation technique speeds processing and avoids resource-intensive steps. After all, why cancel out the savings derived from compression if the data must be decompressed to obtain its benefits? BLU Acceleration does not force a trade-off between cost-effective compression and time efficiency.
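To illustrate the general idea (this is a minimal sketch of a well-known technique, order-preserving dictionary encoding, not the actual BLU Acceleration internals), consider how a range predicate can be evaluated on encoded values without ever decompressing them:

```python
# Sketch: order-preserving dictionary encoding. Codes are assigned in
# sorted value order, so comparisons on codes mirror comparisons on the
# original values. Illustrative only; names here are hypothetical.

def build_dictionary(values):
    """Map each distinct value to an integer code, in sorted order."""
    return {v: code for code, v in enumerate(sorted(set(values)))}

def encode(values, dictionary):
    """Replace each value with its (smaller) integer code."""
    return [dictionary[v] for v in values]

def filter_encoded(codes, dictionary, low, high):
    """Find rows where low <= value <= high, comparing only codes."""
    low_code, high_code = dictionary[low], dictionary[high]
    return [i for i, c in enumerate(codes) if low_code <= c <= high_code]

prices = [40, 10, 30, 20, 50, 30]
d = build_dictionary(prices)
codes = encode(prices, d)
# Rows with a price between 20 and 40, found without decoding any value:
matches = filter_encoded(codes, d, 20, 40)
```

Because the encoding preserves order, the predicate runs entirely on the compact codes; only the few qualifying rows would ever need to be materialized.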

Intelligent data processing that skips data not needed to generate answers

When processing massive data sets, chances are good that an organization doesn't need all of that data to answer a particular query. BLU Acceleration can intelligently skip unnecessary data to generate answers rapidly. It employs a series of metadata management techniques to automatically determine which data cannot qualify for a particular query, enabling large chunks of data to be skipped. This agile computing technique improves hardware efficiency, and because the metadata is kept up to date in real time, data changes are continually reflected in the analytics. Analyzing only the relevant subset of the data translates to quick, simple, and flexible in-memory computing. Intelligent processing helps save what is likely the most precious resource in any organization: time. Time-saving in-memory processing lets organizational resources focus on the business rather than waiting on query responses.
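A common way to implement this kind of data skipping (sketched below under the assumption of simple per-chunk min/max metadata, sometimes called a synopsis or zone map; this is not the actual BLU Acceleration code) is to record the value range of each chunk and scan only chunks whose range could satisfy the predicate:

```python
# Sketch: data skipping with per-chunk min/max metadata. A chunk whose
# recorded range cannot overlap the predicate is never scanned.
# Illustrative only; function names are hypothetical.

def build_synopsis(chunks):
    """Record the min and max value of each chunk of column data."""
    return [(min(c), max(c)) for c in chunks]

def scan_with_skipping(chunks, synopsis, low, high):
    """Return matching values and how many chunks were actually read."""
    matches, scanned = [], 0
    for chunk, (lo, hi) in zip(chunks, synopsis):
        if hi < low or lo > high:   # range cannot overlap -> skip chunk
            continue
        scanned += 1
        matches.extend(v for v in chunk if low <= v <= high)
    return matches, scanned

chunks = [[1, 5, 3], [40, 45, 42], [7, 9, 8], [90, 95, 99]]
synopsis = build_synopsis(chunks)
result, chunks_scanned = scan_with_skipping(chunks, synopsis, 6, 10)
# Only one of the four chunks is actually read for this query.
```

Keeping the synopsis up to date as data changes is what lets the skipping decision stay correct without rescanning everything.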

Easy-to-use data management from a single system

Because business users demand more analytics faster than ever before, they need in-memory computing that can keep pace. BLU Acceleration is designed to be simple to use and helps deliver optimal performance out of the box; there is no need for indexes, tuning, or time-consuming configuration efforts. Row-based data is simply converted to columns, and queries can be run. And because BLU Acceleration is seamlessly integrated with IBM® DB2® data management, both row-based and column-based data can be managed from a single system, which helps reduce complexity. When technical teams spend less time on routine maintenance, they can take on innovative projects that add value to the business.
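The row-to-column conversion mentioned above can be pictured with a tiny sketch (a generic illustration of columnar layout, not DB2's storage format; the table values are made up):

```python
# Sketch: pivoting row-organized records into a column layout, so a
# query that touches one field reads a single dense column instead of
# every field of every row. Illustrative only.

rows = [
    ("2014-01-02", "widgets", 120),
    ("2014-01-02", "gears",    75),
    ("2014-01-03", "widgets",  98),
]

# Transpose the rows into one contiguous list per column.
dates, products, quantities = (list(col) for col in zip(*rows))

# A query over quantities now scans just that column.
total_quantity = sum(quantities)
```

In the columnar layout, an aggregate over one column never touches the dates or product names at all, which is a large part of why analytic scans speed up.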

Next-generation in-memory computing

BLU Acceleration delivers fast, simple, agile, next-generation, in-memory computing that can equip organizations to use actionable data for growing revenue, identifying efficiencies, spotting opportunities, and pinpointing risks. And now client organizations that deploy DB2 can take advantage of BLU Acceleration technology by upgrading to DB2 10.5. For more information, please visit the IBM BLU Acceleration hub. And please share any thoughts or questions in the comments.
