Measuring Confidence in Data
An IBM-commissioned survey provides insight into information governance implementation
How has the explosion of data from new and existing sources affected data governance plans? Are organizations advancing in data governance maturity? What level of confidence do managers have in the data available to run their businesses and support their decisions? Questions such as these prompted IBM to commission Unisphere Research to conduct a survey of over 300 IT and business managers.1
The results of the survey indicate that as big data and analytics proliferate, information governance—with its potential for helping organizations realize the benefits of big data initiatives—is often inhibited by organizational factors. As a result, organizations are missing opportunities both to grow their businesses and to reduce costs through operational efficiencies.
Information governance inhibitors
Asked about the maturity of information governance within their organizations, 44 percent of survey respondents said it was immature or somewhat immature, compared with only 28 percent who said it was mature or somewhat mature. One factor holding back governance initiatives appeared to be a piecemeal approach: only 25 percent of organizations indicated that they developed an annual data governance plan spanning multiple projects. Individual projects in areas such as data quality or data security clearly can fit under a governance umbrella, but they were not being addressed as part of a bigger plan.
The survey revealed other obstacles that were preventing implementation of information governance projects. The top barrier, not surprisingly, was competing priorities—probably the top barrier to doing almost anything new in any organization. The second barrier, named by one-third of all respondents, was lack of a business sponsor. That reason, too, shouldn’t be surprising. Many governance-related projects—in areas such as data lifecycle management and data security and privacy—are enablers to multiple business initiatives rather than essential elements for just one aspect of the business with one clear executive owner.
The employee who is in the appropriate position to advocate for those projects is someone who has a broad view of data issues and opportunities across the whole enterprise. That individual may be the chief data officer (CDO) or someone without that title but with some of the associated responsibilities, such as a chief data scientist. However, only 16 percent of respondents came from organizations that have a CDO. Another 17 percent expected to add one in 2014, leaving two-thirds of respondent organizations with no clear data chief and no apparent plans to add one.
Too little analysis and too much maintenance
The study went beyond data governance itself to examine organizations' ability to adopt new technologies such as cloud computing, social media, mobile applications, big data, and real-time analytics. Respondents believed all of these technologies would have a positive impact on their businesses, but they identified a wide range of barriers to implementing new projects: organizational issues were a major concern for 57 percent of respondents, followed by technology issues (50 percent) and process or change management (46 percent).
Although real-time analytics was the advanced technology the organizations expected to have the most positive impact on their organizations, 36 percent of respondents said they were spending too little time actually analyzing data. Instead, they were spending too much time and too many resources finding data, validating it, and sometimes defending the data and the related analysis.
Organizations were not able to spend as much time as they would like moving ahead with new technologies, in part because they were spending so much time and money maintaining their existing applications. In fact, 30 percent of respondents indicated that maintenance was consuming more than 50 percent of their resources. Reducing the resources required for application maintenance appears to be key to freeing organizations to pursue new growth initiatives and meet their business objectives.
A shortage in data confidence
Despite the importance of advanced analytics to their organizations, respondents reported a shortage of confidence in the data on which analytics would be based. There was general confidence in structured data from internal systems—63 percent of respondents gave it a high rating. However, there was significantly reduced confidence in unstructured information, data from social media, and data stored in a public cloud, which received no-confidence votes from about half of the respondents (see figure).
Categories of significantly reduced confidence in data
Given the changing mix of available information and the rapid increase in information other than structured, internally sourced data, the overall confidence level might be expected to decrease over time. This decrease could leave both IT leaders and business decision makers wondering how to tap into the new data sources and determine when it is and is not appropriate to use the new data for critical insights.
Data confidence calculation
In the past, organizations rarely determined what level of confidence was required for data to be used in a particular business activity and then mapped that requirement against actual confidence in the data. When they did, the result was more a seat-of-the-pants guess than the output of a repeatable process. A new Information Confidence Calculator2 introduces an advanced level of rigor into the process. The Aberdeen Group report, “The Information Confidence Calculator: Measuring Trust in Big Data,”3 provides details about this calculator, which helps organizations score and visualize trust in customer data used for different types of decisions. As discussed in the report, the calculation takes into account seven key factors:
- Quality – Accuracy of the data
- System integrity – Number of systems and level of data consistency
- Completeness – Percentage of records without missing fields
- Currency – Timeliness of data
- Lineage – Whether the data comes from an externally validated source or a trusted internal system
- Security – Number of recent data breaches
- Governance – Adherence to policies for data access and control
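To make the idea of a repeatable confidence calculation concrete, here is a minimal sketch of how the seven factors above could be combined into a single score. This is purely illustrative: the Aberdeen report does not publish its formula, so the equal default weights and the 0.0–1.0 scoring scale are assumptions, not the calculator's actual method.

```python
# Hypothetical sketch of a seven-factor confidence score.
# Factor names come from the article; weights and scale are assumptions.

FACTORS = ["quality", "system_integrity", "completeness",
           "currency", "lineage", "security", "governance"]

def confidence_score(scores, weights=None):
    """Combine per-factor scores (each 0.0-1.0) into one confidence value.

    scores  -- dict mapping every factor name to its score
    weights -- optional dict of relative weights; equal weighting by default
    """
    if weights is None:
        weights = {f: 1.0 for f in FACTORS}
    missing = [f for f in FACTORS if f not in scores]
    if missing:
        raise ValueError(f"missing factor scores: {missing}")
    total_weight = sum(weights[f] for f in FACTORS)
    return sum(scores[f] * weights[f] for f in FACTORS) / total_weight

# Example profile: strong internal data quality, but weak lineage
# and a recent history of security issues pull the score down.
example = {"quality": 0.9, "system_integrity": 0.8, "completeness": 0.85,
           "currency": 0.7, "lineage": 0.4, "security": 0.5,
           "governance": 0.6}
score = confidence_score(example)
```

A real implementation would calibrate the weights per decision type, since (for example) lineage may matter far more for regulatory reporting than for marketing analytics.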
Survey results alignment
Are organizational issues creating barriers to effective information governance in your organization? Is the organization spending too much time finding data rather than analyzing it? Does the level of confidence in the expanding sources of data available to the organization align well with the proposed uses for the new data? Please share any thoughts or questions in the comments.
1 “Governance Moves Big Data from Hype to Confidence,” by Elliot King, Unisphere Research, a division of Information Today, Inc., sponsored by IBM, June 2014.
2 Information Integration and Governance, IBM.com microsite with information on and a link to the Information Confidence Calculator.
3 “The Information Confidence Calculator: Measuring Trust in Big Data,” by Peter Krensky, Aberdeen Group, IML14423USEN, April 2014.