
Enhancing risk data management with utility and refinery models

WW Lead, Wealth & Asset Management Investment Analytics, IBM Risk Analytics

Does your organization need a chief reputation officer? Jennifer Janson thinks so. In her recently published book, The Reputation Playbook (Harriman House, 2014), Janson advises CEOs on how to protect corporate reputation in the digital economy. She suggests that the chief reputation officer, perhaps in conjunction with the chief risk officer, should own the responsibility for identifying gaps between corporate values and behaviors. In other words, if an element of the organization is not performing as expected, someone must determine whether the behavior is an anomaly or representative of a larger problem within.

Managing corporate reputation bears many similarities to how risk is managed in an organization. Distinguishing between factors that are acceptable within the context of the overall business model and systemic problems that require a course correction is crucial for risk managers. More narrowly, there are many parallels with how data is managed for financial risk management. When an analytic changes on the surface, drilling down to the source and identifying the cause is vital. Given the growing collaboration between the risk function and business units, and regulators' expanding expectations around the use of data, the ability to quickly assess risk analytics across business units is increasingly important.

A risk data management platform with a centralized utility function at its core, wrapped within an agile refinery function, delivers a data management system designed around the unique nature of risk data. The utility function provides a home for those standardized processes used on an ongoing basis, while the refinery function spins off products for more specialized inquiries further downstream. Processes assigned to the utility and refinery functions will vary, based on the unique IP, legacy data models and strategic focus of individual firms. All organizations, however, can benefit from the ability to quickly get to the source of surface changes and respond to inquiries from stakeholders within and outside the organization at Internet speed.

Risk data, reputation and regulation

Following the financial crisis, Reto Kohler, managing director and head of strategy for investment banking at Barclays, became aware of how different stakeholders could impact the bank’s reputation. In Deloitte’s global survey of reputation risk, Kohler notes: “I would put customers and politicians and the press on one side and regulators on the other. I think with regulators it’s almost easier because … you know what you need to do to comply with the regulations; whereas, reputation risk with customers and politicians and the press is much more fluid.”

Yet organizations have learned that knowing what regulators want doesn’t always make it easier to meet their requirements. With the Basel Committee’s BCBS 239 principles on risk data aggregation and the Fundamental Review of the Trading Book (FRTB), the expectations regarding risk data, data integrity, analytical capabilities and the ability to fulfill requests for information quickly are clear. Many organizations struggle with these expectations because their data management systems were built around and on top of legacy processes and systems, often resulting in a spaghetti of bespoke processes, with data copied across systems under little governance. The more such self-contained processes exist within an organization, the harder it is to locate and drill down to the underlying data.

Managing data for risk management is not a single process. It is a combination of standardized functions and more specific processes that may require a high level of customization and engagement downstream, with shared access to unaltered source data upstream. When a data management system is constructed to fit these requirements, as in the utility-and-refinery model, the risk function can quickly get to the source of changes in analytics while maintaining the highest levels of data integrity. When these inquiries are addressed reliably and in a timely fashion, the chief risk officer gains the confidence of risk takers in the organization. The reputation of the risk function is elevated and can be leveraged to establish and evolve an appropriate risk culture.

Separate yet equal

The first step in establishing a utility-and-refinery data management system is to identify and thoroughly understand the processes and business requirements within the risk management function. Tasks that can be standardized and handled in a more rigid fashion are processed as part of the centralized utility function. Such processes include retrieving, validating and mapping raw market data, and product and pricing information. All stages and data layers of the utility function are linked across a single architecture to ensure easy access to products at each stage of the refinery process. A risk report produced for regulatory purposes, a time-series statistical analysis and a set of portfolio construction analytics are all products in their own right, accessed at various stages of the refinery process.

The utility-and-refinery model also takes into account the importance of time series data. Because the risk function operates on an ongoing basis, and time series data may be drawn upon intraday, the way risk data is managed becomes crucial to ensuring the system operates as efficiently and accurately as possible. Centralizing the data utility and associated standardized processes, while preserving the correctness of time series data, can close the loop between the investment decision process and the data management process. Exposure analytics would, for example, highlight the major investment exposures faced by an organization and the related risk factor impact, so that fund managers could match investments to strategic mandates with greater confidence.

Linking different business functions also allows for more consistent and efficient reporting. While regulations need to be addressed independently, they often overlap greatly in their underlying data requirements. By clarifying where various data layers come from and how they interact with each other, the centralized utility helps avoid the reproduction of existing reporting objects for different regulations. The utility function may present reporting options to end users as a series of widgets or applications, leading to more efficient and consistent reporting.

When internal stakeholders, regulators or public inquiries raise questions about changes to a particular analytic or the accuracy of results, the risk function can get to the source quickly. With the ability to link different data layers, users can validate and correct data elements with in-depth, transparent and auditable results. The framework also lends itself to proactive inquiries. A summary of data changes and potential impacts can be accessed before an investment strategy is initiated, enabling business users to gain a better appreciation of end results and thereby increasing efficiency.
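To make the idea of linked data layers concrete, here is a minimal, hypothetical sketch in Python. The Record, LAYERS and drill_down names are illustrative assumptions rather than any specific product's API; the point is simply that parent links between layers allow an analytic to be traced back to unaltered source data.

```python
# Illustrative sketch only: linked data layers that allow drilling down from an
# analytic back to its raw source. All names here are hypothetical examples.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Record:
    value: float
    parent: Optional[Tuple[str, str]]  # (upstream layer, upstream record id); None for raw data

# Three toy layers: raw vendor data -> validated data -> derived analytic
LAYERS = {
    "raw":       {"px_001": Record(101.2, None)},
    "validated": {"px_001": Record(101.2, ("raw", "px_001"))},
    "analytic":  {"var_eq": Record(2.35, ("validated", "px_001"))},
}

def drill_down(layer: str, record_id: str):
    """Follow parent links upstream and return the full, auditable trail."""
    trail = []
    while True:
        record = LAYERS[layer][record_id]
        trail.append((layer, record_id, record.value))
        if record.parent is None:          # reached unaltered source data
            return trail
        layer, record_id = record.parent

# Example: explain a change in an equity VaR figure by tracing it to its raw price.
print(drill_down("analytic", "var_eq"))
```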

The always-on factory

The utility function is intended to work in the background, predictably and reliably completing processes higher up the data stream. As an always-on factory in the data management chain, the utility function runs automated processes that require minimal intervention. These processes may include extracting, normalizing and correctly categorizing data. Users would not be expected to interact with utility function processes directly. However, they can benefit from access to a set of refined, value-add functions, made available through a visualization-based interface to initiate high-level investigations, validations and corrections.
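As a rough illustration of what such an always-on sequence of standardized steps could look like, the following Python sketch chains extract, normalize and categorize stages over a toy in-memory feed. The field names, categories and step functions are assumptions for illustration only, not a description of any particular platform.

```python
# Minimal, illustrative pipeline of standardized utility steps: extract,
# normalize, categorize. Field names and categories are hypothetical examples.
raw_feed = [
    {"ric": "IBM.N", "px": "151.20", "ccy": "usd"},
    {"ric": "VOD.L", "px": "228.75", "ccy": "gbp"},
]

def extract(feed):
    # In practice this would pull from vendor feeds; here it just yields records.
    yield from feed

def normalize(records):
    # Standardize types and units so downstream consumers see one convention.
    for r in records:
        yield {"id": r["ric"], "price": float(r["px"]), "currency": r["ccy"].upper()}

def categorize(records):
    # Tag each record with an asset-class bucket used by later refinery products.
    for r in records:
        r["asset_class"] = "equity"   # toy rule; real mapping data would drive this
        yield r

validated_layer = list(categorize(normalize(extract(raw_feed))))
print(validated_layer)
```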

When a new business requirement or regulation appears, the relevant processes tied to the utility function can be updated without having to relocate or re-scrub risk data. As a result, process modifications can be made more accurately, reliably and quickly than with solutions built around legacy systems and the layers of processes put in place by successive generations of personnel.

The trade-off between the always-on availability of the utility and customization is one an organization will have to weigh for each of its processes. Relinquishing some degree of customization is often a fair price for having the utility own and operate tasks in a stable, predictable way. Should the organization’s IP, data or client requirements preclude a process from being standardized, it can be moved further down the data stream.

The agile refinery

Risk management processes support multiple business functions, each requiring different levels of frequency, insight and justification. In the agile world of the refinery, users from these individual business units access centralized risk data through highly customized outputs. These outputs can be shaped by the business function, client requirements or the organization’s IP.

A disparate group of end users can safely and simultaneously leverage risk data from its source. Quants, risk teams, fund and portfolio managers, and performance and ex post analysis teams can analyze the same data from the specific context of their own customized outputs. These analytics, although specific to each business unit’s demands, would be based on a consistent underlying data utility model, supporting quick responses to queries and on-demand decision support.
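A minimal sketch of this pattern, assuming a toy in-memory position set, might look like the following. The view functions and field names are hypothetical, intended only to show several business units reading the same underlying utility data through their own customized outputs.

```python
# Illustrative only: different business units consume one shared, validated
# position set through their own customized "refinery" views.
positions = [
    {"desk": "rates", "instrument": "UST10Y", "exposure": 5_000_000, "var_1d": 42_000},
    {"desk": "equities", "instrument": "IBM", "exposure": 2_000_000, "var_1d": 65_000},
]

def risk_team_view(data):
    # Risk managers want firm-wide one-day value-at-risk aggregated by desk.
    totals = {}
    for p in data:
        totals[p["desk"]] = totals.get(p["desk"], 0) + p["var_1d"]
    return totals

def portfolio_manager_view(data, desk):
    # A fund manager only wants the exposures on their own desk.
    return [p for p in data if p["desk"] == desk]

print(risk_team_view(positions))                  # {'rates': 42000, 'equities': 65000}
print(portfolio_manager_view(positions, "rates")) # the rates desk's own exposures
```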

Visualization tools are also used to support the refinery function, in which users seek answers to specific questions that may involve drilling through multiple layers of data. Visualization tools help users explore and share this data in a meaningful way by turning abstract information into a graphic format. These at-a-glance findings are helpful for encouraging exploration, raising awareness and sharing results that are easily understood across multiple business lines.

Cost and confidence benefits of data ownership

Within many organizations, it remains unclear who ultimately owns risk data. This lack of clarity adds uncertainty each time a risk factor is introduced or changed, because its associated raw data components need to be tracked down. In the utility-and-refinery model, existing data components are easy to locate, with a clear picture of how the data is used by the centralized functions and customized outputs. This structure brings cost certainty to evolving requirements. Should specific raw data components not be available internally, they may be acquired from third-party data vendors. Acquiring data in this fashion can appear to be an appealing solution, but it introduces additional factors that could result in misrepresented exposures and analytics.

Risk data can be compromised by how it is entered, by exposure to certain models and by a host of other variables that may affect the analytics and processes of customized outputs. In many cases, third-party data providers cannot confirm which processes their raw data has been exposed to, preventing organizations from locating the source of a change. The data provider may have dedicated staff to answer questions, but not necessarily in a risk-management context.

Potential misinterpretations can be avoided by feeding third-party data into the centralized utility function, with personnel in place who understand the specific nature of risk data. Through this workflow, when a user of a refinery process has questions about the source data and needs to get to its source quickly, that user can immediately reach someone who speaks in a risk context and understands all the ways the data is used throughout the organization.

Further cost certainty can be gained through the availability of cloud-based services. Clouds can seamlessly integrate new data requirements, easily scale up and down to accommodate changing storage needs, and are as close to a platform-agnostic option as one can imagine. As the volume and complexity of data management for the risk function continue to grow, cloud-based storage, whether as a hybrid option or a purely offsite service, remains one of the more future-proof technology options available.

Data has a reputation

According to a study by the World Economic Forum, on average more than 25 percent of a company’s market value is directly attributable to its reputation. Given that direct impact on market value, it is no surprise that firms take reasonable steps to ensure they are perceived in an accurate light. Like other industries, financial services firms are investing in social media monitoring and analytics tools to keep watch over their public reputation. Often, however, what happens beneath the surface influences how an organization is perceived more than an errant tweet or email does.

Nearly every business and strategic decision a financial services organization makes is in some way reliant on assumptions that have been validated with risk data. Through the integrated efforts of a rigid utility function and an agile refinery function, organizations can centralize the management of this data in a way that best suits its nature, while delivering accurate, transparent and actionable results in a reliable time frame. Products output through the refinery can be used as end products in their own right, as inputs to specialized production processes or as intermediate stages in the transformation to a standard end product. These separate but integrated functions enable firms to gain an enterprise-wide view of risk with greater cost certainty, more consistent and reliable analytics and reconcilable results for regulatory purposes.

Individually, these benefits contribute to a holistic, enterprise-wide view of exposures that benefits the day-to-day running of the business. Together, they provide financial services organizations with another layer to protect and inform their reputation, at a time when it is more valuable, and scrutinized by more stakeholders, than ever.

Empower your bank to make risk-informed decisions and meet regulatory requirements.