Rachel Bland, senior product manager for IBM Business Analytics, describes one of the many analytics sessions at Information On Demand 2013. The session will focus on IBM's latest in-memory technologies and how IBM is making them easier to use. Key points of this session are:
One of the biggest problems posed by big data is separating the signal from the noise, or cutting through all the data to find insight and value. The 2013 IBM Institute for Business Value study surveyed 900 business and IT executives from 70 countries to assess how they’re converting data into
How do you find diamonds in your data? That’s a major focus of most big data and analytics projects. Everyone wants to more effectively sift through piles of dusty data and find the value hidden within. You know the type: gems that help answer key questions about your customers. You’re searching
Traditionally, the amount of available memory limited how businesses could – or couldn’t – run complex queries or analytics. Disk space was a constant challenge, and architects and DBAs performed a continual balancing act, trying to put the most valuable information in the most accessible, fastest
The sheer amount of data your business deals with continues to grow. And you have to make sure that data is handled properly through a solid data governance program to keep it clean, accurate and accessible to the right people.
Maybe you’re lucky enough to hire a couple more employees to help
Quick: how fast do you need to understand your data?
There’s really no good, universal answer. As big data becomes more about using data to actually generate insights and understanding, we’ve begun talking about “speed of thought analytics.”
BLU Acceleration helps make this possible by integrating
Paul Zikopoulos talks with Master Inventor Sam Lightstone in this video called "Actionable Compression in DB2 10.5 with BLU Acceleration." BLU compression allows you to work on the data while the data is still compressed. This changes the game! Find out more by watching the video.
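To make the idea of working on still-compressed data concrete, here is a minimal Python sketch of order-preserving dictionary encoding, one general technique behind this style of "actionable" compression: because the codes preserve sort order, a range predicate can be evaluated directly on the compressed codes without decompressing each value. This is an illustration of the concept only, not DB2's actual encoding.

```python
# Order-preserving dictionary compression: distinct values get codes
# assigned in sorted order, so comparisons on codes mirror
# comparisons on the original values.

def build_dictionary(values):
    """Map each distinct value to a code, in sorted order."""
    return {v: code for code, v in enumerate(sorted(set(values)))}

def compress(values, dictionary):
    """Replace each value with its dictionary code."""
    return [dictionary[v] for v in values]

def range_filter_compressed(codes, dictionary, low, high):
    """Return row positions where low <= value <= high,
    comparing only the encoded codes -- no decompression."""
    lo_code, hi_code = dictionary[low], dictionary[high]
    return [i for i, c in enumerate(codes) if lo_code <= c <= hi_code]

prices = [30, 10, 50, 20, 40, 10]
d = build_dictionary(prices)
codes = compress(prices, d)
print(range_filter_compressed(codes, d, 20, 40))  # -> [0, 3, 4]
```

Because the predicate runs on small integer codes rather than full values, the scan touches less memory per row, which is part of why compressed-domain processing pays off for analytics.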
In this FAQ, IBM Fellow Tim Vincent and IBM VP Paul Zikopoulos discuss the technical genius of DB2 with BLU Acceleration. They discuss how columnar storage, simplicity, actionable compression, core-friendly parallelism, vector (SIMD) processing, cache-friendly design, and data skipping work together to
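Of the techniques listed above, data skipping is easy to sketch: keep a small min/max summary per block of rows, and a scan can skip whole blocks that cannot possibly contain a matching value. The Python below is an illustrative toy (the block size and structure names are assumptions, not DB2's internals).

```python
# Data skipping via per-block min/max summaries: blocks whose
# [min, max] range excludes the search value are never scanned.

BLOCK_SIZE = 4  # toy block size for illustration

def build_synopses(column):
    """Split a column into fixed-size blocks and record (min, max) per block."""
    blocks = [column[i:i + BLOCK_SIZE] for i in range(0, len(column), BLOCK_SIZE)]
    return blocks, [(min(b), max(b)) for b in blocks]

def scan_equal(column, target):
    """Find positions equal to target, counting how many blocks were read."""
    blocks, synopses = build_synopses(column)
    hits, blocks_read = [], 0
    for bi, ((lo, hi), block) in enumerate(zip(synopses, blocks)):
        if target < lo or target > hi:
            continue  # skip: this block cannot contain the target
        blocks_read += 1
        hits.extend(bi * BLOCK_SIZE + j for j, v in enumerate(block) if v == target)
    return hits, blocks_read

col = [1, 2, 3, 4, 100, 101, 102, 103, 5, 6, 7, 8]
hits, read = scan_equal(col, 101)
print(hits, read)  # -> [5] 1  (only one of three blocks read)
```

The synergy point from the FAQ shows up even here: data skipping works best when data is columnar (summaries are cheap to keep per column) and clusters naturally, so the surviving blocks are few.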
This episode features two experts with very different perspectives talking about the intersection of big data, analytics and the databases that make it all possible. IBM Champion David Birmingham is a Senior Principal with Brightlight Consulting. He has more than 25 years of experience in very –
As your data grows, how’s your data footprint? Have you outgrown the widest shoes you can find?
Compression offers plenty of methods to help rein in your data, controlling your data footprint and all the associated costs. But compression techniques have evolved. Have you kept up?
Martin Wildberger, IBM VP of Information Management Development, takes us through the extraordinary innovation behind the creation of BLU Acceleration. He kicks off a new video series that will introduce you to key members of the BLU Acceleration development team and share insider tips on improving
Big data need not mean bigger complexity. But we often hear from people dealing with sprawling, messy infrastructures who fret that each new system or data set adds more complexity and chaos.
In many cases, the very BI tools that promise to shed light on your data really just
IBM recently announced DB2 10.5 with BLU Acceleration, which uses a combination of columnar data storage, memory optimization and hardware exploitation to improve performance for analytic workloads. Today, we have Phil Downey, worldwide program director for DB2 for Linux, UNIX and Windows, joining us
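The columnar-storage part of that combination can be shown in a few lines: an analytic query that aggregates one column only needs to touch that column's contiguous array, not every full row. A minimal Python sketch of the contrast (illustrative only, not DB2's storage format):

```python
# Row store vs. column store for an analytic aggregate.

rows = [
    {"id": 1, "region": "EU", "sales": 120},
    {"id": 2, "region": "US", "sales": 300},
    {"id": 3, "region": "EU", "sales": 250},
]

# Row layout: SUM(sales) must walk every complete row.
row_total = sum(r["sales"] for r in rows)

# Column layout: each column is its own contiguous array,
# so the aggregate reads only the "sales" array.
columns = {k: [r[k] for r in rows] for k in rows[0]}
col_total = sum(columns["sales"])

print(row_total, col_total)  # -> 670 670
```

Both layouts give the same answer; the columnar one simply reads far less data per query, which is what makes it attractive for the analytic workloads BLU Acceleration targets.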