Optimality is the new nirvana. The promise of "next best action" is that, somehow, we can program the optimal automated response into every business scenario. Of course, this dream presupposes that someone in your organization can specify the optimal response for any scenario that your personnel
Here are the quick-hit ponderings that I posted on the IBM Netezza Facebook page this past week. Clearly, I was focused on the "big" side of big data, on the "statistics" DNA of the analytics that power big data, and on the limits of what you can in fact "optimize" with big data and analytics.
Game-changing analytics applications don't spring spontaneously from bare earth. You must plant the seeds through continuing investments in applied data science and, of course, in the big data analytics platforms and tools that bring it all to fruition.
If there’s more and more data arriving and time isn’t expanding, then data must be arriving at greater and greater velocity.
In my last post I talked about Variety in the Volume, Variety, Velocity triumvirate. There’s more to be said about that, but first I’d like to take a run at Velocity. We’ve
Multiple sclerosis (MS) is a chronic neurologic disorder that afflicts many people in the prime of their lives. The biomedical research community has ramped up its use of big data analytics to illuminate the myriad factors that contribute to the onset and progression of MS.
On April 26, IBM announced
Further to news of SUNY’s exploration of big data to understand possible causes of multiple sclerosis, I spoke with David Smith, VP of Marketing at Revolution Analytics, for a briefing on some advantages of R for analysis of large data sets.