The Role of Data Quality in a Big Data Environment

How can we ensure the quality of big data? Big data environments grow constantly, ingesting massive volumes of data from inconsistent sources with ambiguous lineage and uncertain currency. This has become one of the greatest challenges facing big data today.

Integrating structured and unstructured data presents new challenges for consumers of business analytics. Can conventional data quality practices meet the needs of the analytical community? How can we overcome these challenges and ensure data quality in the ever-changing world of big data?

On Wednesday, November 28, IBM's Tom Deutsch will join David Loshin of Knowledge Integrity to present a webinar on the role of data quality in big data and how it can be achieved. In this TDWI webinar, attendees will learn how data quality can deliver value in big data environments, and gain insights into the critical data quality dimensions for big data, information streams, quality assessment, metadata, concept variation and more.

Hear from two experts on best practices for meeting the needs and challenges of the analytical community. Start delivering value in your big data environment and register today!

Get details and register for the TDWI webinar

Also, be sure to follow both Tom and David on Twitter: @thomasdeutsch and @davidloshin.