
Making a Federal (Use) Case out of Big Data

April 23, 2013

Lately, many news outlets have been rehashing the same old stories about how our government agencies are drowning in paper, with pictures of Veterans Affairs file rooms looking like outtakes from the “Hoarders” TV program, complete with floors sagging (literally and figuratively) under the weight of floor-to-ceiling stacks of precariously piled files. Snarky pundits posited that the VA should help stem the rising tide of veterans entering the ranks of the unemployed by hiring some of them to digitize the files, many of which contain healthcare claims forms submitted by, you guessed it, recent veterans like themselves.

Oh, if only it were all that easy.

As if every thirty-second news story posing a provocative question (“Why are these files sitting in six-foot piles and not digitized already?”) were somehow entitled to a thirty-second answer with a neat and easy explanation. While the technology to digitize the files upon arrival certainly exists, how best to process them depends on a multitude of variables, ranging from accuracy and validation to interoperability with constantly shifting and evolving codes, indexes, standards and platforms, along with integration with related systems and personnel across agencies, payers, providers and patients.

And that’s just the paper files, which make for newsworthy photo-op visuals at a glance (“Wow, how can a person even walk in that room?” “Looks like a fire hazard.” Or, in my case, like my tax return receipts.) But the broader issue isn’t just paper files, of course; it’s the explosion of all types of information and how best to address and manage it.

So we have to ask, how can government agencies best manage and govern enormous volumes and types of information, while protecting citizen privacy and civil liberties? And how do we determine, measure and act on the value of that information? These and other vitally important questions are addressed in the report “Demystifying Big Data: A Practical Guide to Transforming the Business of Government,” by TechAmerica Foundation's Big Data Commission.

Within the immense volume, variety, velocity and veracity of data lies valuable information that previously either did not exist or was virtually undiscoverable, and that, if effectively managed, could deliver truly transformational outcomes. Consider the key mission imperatives that government agencies are tasked with, the related opportunities and challenges that the current data explosion presents, and the value of how big data can best be applied both immediately and long-term.

In healthcare, for instance, 80% of medical data is unstructured: provider notes and correspondence, diagnostic device data, EMRs, claims, financial records and more. The clinical value of organizing, analyzing and acting on that data to drive better outcomes, preventive behaviors and greater care efficiency represents a uniquely compelling opportunity at this moment in history. Take a look at this video, in which Wendy Soethe, Manager of Enterprise Data Warehouse & Business Intelligence at Seattle Children's Hospital, describes how the hospital saw analytics performance gains of 10 to 1,000 times over its previous solution, and how the speedy query times from IBM PureData System for Analytics delivered answers faster. The result was a tremendous improvement in physicians’ access to timely and relevant data while caring for patients.

Healthcare organizations are leveraging big data for greater insight into measurable metrics for care coordination, population health management and wellness programs, including patient outreach for early intervention and preventive measures, not only to improve care but also to prevent disease onset.

Once big data is defined within the proper context, a combination of real-world use cases, technical underpinnings and public policy reveals highly impactful best practices, lessons learned and relevant starting points for moving forward. For instance, the TechAmerica big data report cites the University of Ontario Institute of Technology's use of streaming analytics on data from bedside monitoring devices in the neonatal intensive care unit to detect infections in premature infants up to 24 hours before they exhibit symptoms. Another use case, from CMS, applies streaming data and analytics, distributed information management and warehouse optimization to medical records while ensuring the security and privacy of patient information.
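To make the streaming idea concrete, here is a minimal sketch of the general technique: scoring each new vital-sign reading against a per-patient rolling baseline as it arrives, rather than batching data for later analysis. This is an illustration only, not the actual UOIT/IBM system; the VitalReading type, the window size and the z-score threshold are all assumptions chosen for the example.

```python
from collections import deque
from dataclasses import dataclass
from statistics import mean, stdev


@dataclass
class VitalReading:
    """One sensor reading from a bedside monitor (illustrative)."""
    patient_id: str
    heart_rate: float  # beats per minute


class StreamingAnomalyDetector:
    """Flags readings that deviate sharply from a per-patient rolling baseline."""

    def __init__(self, window_size: int = 60, z_threshold: float = 3.0):
        self.window_size = window_size
        self.z_threshold = z_threshold
        self.windows = {}  # patient_id -> deque of recent readings

    def process(self, reading: VitalReading) -> bool:
        """Return True if the reading looks anomalous against the rolling window."""
        window = self.windows.setdefault(
            reading.patient_id, deque(maxlen=self.window_size)
        )
        anomalous = False
        if len(window) >= 10:  # need a minimal baseline before scoring
            mu, sigma = mean(window), stdev(window)
            if sigma > 0 and abs(reading.heart_rate - mu) / sigma > self.z_threshold:
                anomalous = True
        window.append(reading.heart_rate)
        return anomalous


# Usage: feed readings as they arrive; alert when one falls outside the baseline.
detector = StreamingAnomalyDetector()
for hr in [120, 122, 119, 121, 120, 118, 123, 121, 120, 119, 122, 165]:
    if detector.process(VitalReading("patient-001", hr)):
        print(f"ALERT: abnormal heart rate {hr} for patient-001")
```

In a production system, the simple z-score would be replaced by clinically validated models and the stream would run on a dedicated analytics platform, but the shape of the computation is the same: ingest each reading, score it against recent history, and alert in real time instead of waiting for a batch report.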

IBM is making a significant investment in healthcare IT, including big data technology, backed by the largest commercial research organization in the world, more than $16B in analytics acquisitions since 2005, and global Analytic Solutions Centers that have been visited by over 4,000 organizations worldwide. According to Steve Mills, Senior Vice President and Group Executive at IBM: “With the right technology and skills, together with our government, we can better address issues from health care to public safety to fraud detection, ultimately leading to improved outcomes for our country and citizens.”

Visit here for more healthcare posts.