We all want to implement smarter analytics - to turn information into insights that we can use to improve both our personal and professional lives. Yet we are often faced with a fundamental, often nagging, problem of getting the right data for whatever analytics we are trying to implement. It’s
Optimality is the new nirvana. The promise of "next best action" is that, somehow, we can program the optimal automated response into every business scenario. Of course, this dream presupposes that someone in your organization can specify the optimal response for any scenario that your personnel
Game-changing analytics applications don't spring spontaneously from bare earth. You must plant the seeds through continuing investments in applied data science and, of course, in the big data analytics platforms and tools that bring it all to fruition.
If there’s more and more data arriving and time isn’t expanding, then data must be arriving at greater and greater velocity.
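That intuition can be made precise with a small illustrative sketch (the exponential growth curve here is an assumption for illustration, not a claim from the post): if the cumulative volume of data $D(t)$ grows faster than linearly while time advances at a fixed rate, then the arrival velocity, the derivative of volume with respect to time, must itself keep rising.

```latex
% Illustrative sketch (assumed exponential growth, rate constant k > 0):
\[
v(t) = \frac{dD}{dt}, \qquad
D(t) = D_0\, e^{k t} \;\Longrightarrow\; v(t) = k\, D_0\, e^{k t},
\]
% so as long as volume compounds and t ticks along linearly,
% velocity v(t) grows right along with it.
```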
In my last post I talked about Variety in the Volume, Variety, Velocity triumvirate. There’s more to be said about that, but first I’d like to take a run at Velocity. We’ve
The human condition is an unfathomable mystery, a complex stew of biological, genetic, behavioral, cultural, environmental, psychological, and spiritual factors.
But fathom it we must. When our personal condition stumbles from wellness to illness, we will use any resources at our disposal,
Multiple sclerosis (MS) is a chronic neurologic disorder that afflicts many people in the prime of their lives. The biomedical research community has ramped up its use of big data analytics to illuminate the myriad factors that contribute to the onset and progression of MS.
On April 26, IBM announced
Further to news of SUNY’s exploration of big data to understand possible causes of multiple sclerosis, I spoke with David Smith, VP of Marketing at Revolution Analytics, for a briefing on some advantages of R for analysis of large data sets.