Customer experience data is too important to foul with shoddy governance
Customer experience is deceptively qualitative. Your customer engagement channels continuously produce big data that describes every aspect of that experience. To the extent that you don't measure, manage and govern that data as a key business resource—and analyze it with every quantitative tool in your portfolio—you risk losing touch with customers.
Data governance is the price you pay to maintain the value of experience data, or any other subject domain, as a precious business resource. When big data is everywhere, the negative business impacts from poor governance are also ubiquitous. How should you measure the bottom-line consequences of slipshod stewardship? Or do you need to measure it at all? Are dirty data's downsides so self-evident that there's no need to quantify the return on investment (ROI) from good governance?
Experience data tends to get overlooked when the discussion turns to governance, largely because it's so diffuse and is not traditionally linked to the customer data record. Where this, or any other subject domain, is concerned, you can never take quality for granted. The governance practices to track and enforce it aren't automatically instituted by some invisible hand. Also, the proliferation of new data platforms can wreak havoc on governance practices that may have worked well in the tidier, better-defined world of enterprise data warehousing.
As experience-focused big data initiatives take hold in your organization, a solid, quantifiable business case can help you justify the necessary investments in personnel, processes and tools for strong governance. To build support for this case, it might become necessary to invoke the specter of unfortunate "compelling events." According to Henry Olson, "events such as security breaches, regulatory actions, financial misstatements or systems failures can have a significant and measurable cost after they occur." In such circumstances, the ROI of good governance is usually framed in reactive damage-control terms.
As emergencies fade from memory, organizations, like individuals, tend to let their newfound data-governance discipline fray around the edges. Under those circumstances, says Olson, justifying big data governance as a continuing investment demands ROI metrics that proactively address what he calls "value creation." Though he spells out several important ROI metrics for making the case—accuracy, speed, efficiency, effectiveness, agility, quality and reputation—he oddly omits metrics that are directly linked to quality of customer experience.
Satisfaction, sentiment and loyalty are the key quantitative metrics of experience, and they are often key to big data business initiatives. These same metrics can even give you the justification you need to tighten governance around otherwise governance-lite, experience-relevant sources, such as social listening, smartphone and portal clickstream data. As I have stated before, you should apply "strong governance to data that has a material impact on how you engage with customers." Social sentiment—or geospatial, mobile, event and other new data sources—will factor directly into your multichannel engagement initiatives. If nothing else, they'll provide the raw coordinates you need to continuously recalibrate and guide the customer experience.
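To make the point that these experience metrics are genuinely quantitative, here is a minimal sketch computing Net Promoter Score, a widely used loyalty metric derived from 0–10 survey ratings. The function name and sample data are my own illustrative assumptions; the promoter/detractor thresholds follow the standard NPS definition:

```python
def net_promoter_score(ratings):
    """Compute Net Promoter Score from 0-10 survey ratings.

    Promoters rate 9-10, detractors rate 0-6; NPS is the percentage
    of promoters minus the percentage of detractors (range -100..100).
    """
    if not ratings:
        raise ValueError("no ratings supplied")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Hypothetical survey sample: 4 promoters, 2 detractors out of 8 responses
scores = [10, 9, 8, 7, 6, 10, 3, 9]
print(net_promoter_score(scores))  # -> 25.0
```

A metric this simple is only as trustworthy as the survey data feeding it, which is precisely why governance over the underlying sources matters.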
Where experience is concerned, you need to read between the lines in Olson's article to find a suitable ROI metric. For example, if you let experience data go bad, you'll face consequences in the "operational efficiency and effectiveness" metrics that he discusses. Also, high-quality social and mobile data will help online retailers to tune their engagement analytics to address Olson's "elimination of inaccurate pricing and offers," "improvement in demand management" and "improved campaign targeting and delivery" metrics.
Around the same time I came across Olson's blog, IBM announced our new consulting practice to help customers address experience management within a comprehensive engagement strategy. The IBM news release highlights three key experience-focused algorithms that would benefit greatly from high-quality social, mobile and other new customer data sources:
- Life event detection: analyzes unstructured social media data to detect important events in customers’ lives. For example, this technology can identify specific life events, like a marriage, and then correlate them with a range of financial decisions.
- Behavioral pricing: combines behavioral models of consumer response to pricing (such as the surprise and thrill of a deal) with historical transaction data to help retailers design personalized pricing strategies that guide consumers' purchasing decisions and improve their experience.
- Psycholinguistic analytics: combines the psychology of language with social media data to understand the inherent personality traits of individuals and identify their preferences. This technology goes beyond generalizations, recognizing individuals as individuals to identify how they prefer to receive and consume information and offers.
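To make the first of these concrete, here is a toy sketch of the life-event-detection idea. The event labels and keyword lists are my own illustrative assumptions, not IBM's actual method; a production system would apply trained NLP models to unstructured social data rather than literal keyword matching, which is exactly why the quality of that source data matters so much:

```python
import re

# Illustrative keyword lists only; a real system would use trained
# language models, not hand-built dictionaries like these.
LIFE_EVENT_KEYWORDS = {
    "marriage": {"engaged", "wedding", "fiancee", "married"},
    "new_home": {"mortgage", "closing", "homeowner", "moving"},
    "new_child": {"pregnant", "newborn", "baby"},
}

def detect_life_events(post):
    """Return the set of life events whose keywords appear in a post."""
    # Tokenize to lowercase alphabetic words, stripping punctuation
    words = set(re.findall(r"[a-z]+", post.lower()))
    return {event for event, keywords in LIFE_EVENT_KEYWORDS.items()
            if words & keywords}

print(detect_life_events("Just got engaged! Wedding planning starts now"))
# -> {'marriage'}
```

Even in this toy form, the fragility is obvious: noisy, stale or mis-linked social data would trigger false events, and every downstream personalization decision would inherit the error.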
All of these analytics tie directly to the notion of a "720-degree view" of the customer's inner journey. Bad experience data can cause these algorithms to "mis-personalize" badly, and thereby worsen the very experiences you're trying to improve. Experience is optimized in the particulars of each customer's specific journey; hence, it is exquisitely sensitive to the quality of the 720-degree-view data linked to each customer's record and profile.
Customer churn, actual or potential, can be the compelling event you highlight in order to make your case for stronger stewardship over the 720-degree view. Or you can focus on the value creation associated with improved satisfaction, upsell and renewals.
Remember: always use whatever compelling experience metrics help you make the best case for strong stewardship.