Saving Lives at 1,000 Data Points Per Second

Ph.D., Big Data Solutions, Healthcare and Life Sciences

The University of California, Los Angeles (UCLA), recently announced its work to develop a bedside early-warning system for brain pressure in traumatic brain injury patients (see video below).

At the core of this system is IBM’s streaming analytics software, InfoSphere Streams, which can ingest and analyze, in real time, huge volumes of fast-moving data – the very kind of data that streams off patient monitoring equipment.
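The core idea of streaming analytics can be illustrated in a few lines of generic Python: analyze each reading as it arrives, keeping only a small, fixed amount of state rather than storing the whole stream. This is a minimal sketch of the concept, not the InfoSphere Streams API; the function and parameter names are illustrative.

```python
from collections import deque

def rolling_mean_stream(readings, window=1000):
    """Yield a running mean over the last `window` readings.

    State is constant-size (one deque and one running total), so this
    keeps up with a live feed no matter how long the stream runs.
    """
    buf = deque(maxlen=window)
    total = 0.0
    for r in readings:
        if len(buf) == buf.maxlen:
            total -= buf[0]  # drop the oldest reading's contribution
        buf.append(r)
        total += r
        yield total / len(buf)

# Stand-in for a live sensor feed. At 1,000 data points per second,
# a one-second window would be window=1000.
stream = iter(range(10))
means = list(rolling_mean_stream(stream, window=4))
```

The point is that each incoming value is processed once and then summarized away, which is what makes analyzing 1,000 data points per second tractable at the bedside.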

Why is this so cool?

I wrote a while back about how the University of Ontario Institute of Technology (UOIT) developed a system to predict infections in newborns in the intensive care unit (ICU) 24 hours earlier than previous methods.

One of the questions I had was: what other physiological parameters can we measure in real time, as the data flows off the instruments, and analyze on the fly to provide better healthcare? Could we care better for patients with brain development issues, stroke, Parkinson's, or perhaps Alzheimer's?

UCLA’s work is one answer to my question. Their system measures changes in intracranial pressure in real time, before the pressure becomes lethal. Current protocols for measuring intracranial pressure are manual and may yield only one calculation per hour, or fewer – far too slow to capture a critical situation as it develops. This system could therefore significantly change the way UCLA cares for patients with traumatic brain injury.

It’s not about thresholds

At HIMSS a few weeks back, I saw an interoperability showcase with a mock-up of multiple ICU monitors being fed into a single system, much like the UCLA and UOIT setups. But I was shocked to find that the system showcased was only recording threshold alerts, nothing more.

Threshold alerts fire only when a measurement goes above or below a set value. Thresholds cannot tell you whether a measurement is trending in a way that suggests the patient is improving or deteriorating. Such a system cannot produce complex, critical alerts in real time, like those calculated by UCLA, or predict physiological conditions, like the system at UOIT does.
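The difference is easy to see in code. Here is a small sketch contrasting the two approaches on a stream of readings; the threshold value, window size, and slope limit are all hypothetical, chosen for illustration only.

```python
from collections import deque

THRESHOLD = 20.0   # hypothetical alarm limit for a vital sign
WINDOW = 5         # number of recent readings used for the trend

def threshold_alert(reading):
    """Fires only when a single reading crosses the set limit."""
    return reading > THRESHOLD

def trend_alert(window, slope_limit=1.0):
    """Fires when recent readings are rising fast, even below the limit.

    Uses a simple least-squares slope over the window (units: reading
    change per sample).
    """
    if len(window) < 2:
        return False
    n = len(window)
    mean_x = (n - 1) / 2
    mean_y = sum(window) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(window))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return (num / den) > slope_limit

# A patient whose readings are rising steadily but never cross 20:
readings = [12.0, 13.5, 15.2, 16.8, 18.9]
window = deque(readings, maxlen=WINDOW)

print(threshold_alert(readings[-1]))  # False: no single value crosses 20
print(trend_alert(window))            # True: the upward trend is caught
```

The threshold alert stays silent on this patient, while the trend alert flags the deterioration – which is exactly the gap between the showcased system and what UCLA and UOIT are doing.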

Patients are giving this data away freely, and we owe it to them to learn from that data and do something with it to improve the care we give them.

This is not a technology story but a human one: we can now spare patients pain and suffering in ways that were previously impossible.

What's more, what UCLA and UOIT are doing is on the forefront of a larger wave in medicine around devices, consumer electronics, and health monitoring. Indeed, in his keynote at HIMSS, Eric Topol (author of "Creative Destruction of Medicine") spent some time on consumer health monitoring devices, and how the ability to aggregate and analyze all that data will be critical for improving outcomes, reducing costs and increasing access to care.

What’s next?

Other hospitals are using similar real-time systems in other areas, and we hope to tell you about them soon. I truly believe this is leading-edge work. Sensors are going to be a part of our care, and these hospitals are showing how.

What do you think? Have you seen a predictive system that can analyze ICU monitor data in real time and help patients? What do you think of the fusion of sensors, big data and healthcare?

Let us know in the comment section below!

And don’t forget to read the full news release, “UCLA Relies on Breakthrough ‘Big Data’ Technology from IBM To Help Patients with Traumatic Brain Injuries”. And watch the video:

For more examples of big data in healthcare: