
6 Keys to Real-Time Analytics

Are you getting all you can out of your analytics initiatives?

The need for information in the 21st century continues to intensify—and shows no sign of abating. Today’s decision makers need to make sense of a tremendous volume and variety of information, leading more enterprises to deploy analytics that not only help them sense and respond to key business issues, but also help them make predictions and act based on real-time data.

Business analytics derive their value from the ability to extract specific and changing data from a wide variety of heterogeneous sources for smarter decision-making. Potential sources extend far beyond the classic IT portfolio of enterprise resource planning (ERP) transaction systems, databases, and data warehouses to include information from external sources, such as customer surveys, market research, and buyer behavior trends. Analytics applications transform this information in real time (or near real time) to deliver fresh insights.

For example, business analytics can help organizations monitor blood supplies, report on carbon footprints, or increase visibility across their supply chains. Retail outlets can determine the best end-cap product positioning based on customer preferences for Coke or Pepsi. Police services are using analytics to put crime information into the hands of patrol officers so they can quickly identify problems and associate trends and locations of crimes.

How can data management professionals ensure that their performance management and analytics initiatives are set up for success? Here are six best practices that can help you overcome the twin challenges of increasing user demands and more complex data sourcing requirements.

IBM Smart Analytics System: The combo platter

Released in September, the IBM Smart Analytics System is designed to minimize the time, expense, and technical skills that hinder broad adoption of advanced analytics systems. Organizations of all sizes can use the platform to rapidly deploy and operate advanced analytics for solving complex business problems.

The system combines an analytics platform, trusted information platform, and system platform. The analytics platform provides cubing services, data mining, text analytics, intuitive business intelligence (BI) reporting, analysis, dashboards, and scorecards. The trusted information platform offers high-performance data warehouse management and storage optimization. The system platform provides scalable server and storage resources. The Smart Analytics System also includes installation services and a single point of support.

The new system requires a small amount of storage to do its work, saving both floor space and energy. It is designed to uncover insights and hidden relationships among massive amounts of data—not just structured information found in databases, but unstructured and incompatible data from such diverse sources as videos, e-mail, Web sites, podcasts, blogs, wikis, archival data, and more.

These intelligent data mining features, combined with speedy analytics and other business-critical capabilities, help make the Smart Analytics System a powerful warehouse-based option for developers facing increased demands for faster and more accurate information access.

1. Cast a wide net

Making decisions and developing processes based on only part of the picture can negatively impact business performance. The first step, therefore, is to make sure that your analytics implementation has direct access to all relevant, available data no matter where it resides. The analytics system should also serve as the authoritative source for all historical and transactional data, so you can properly glean insights from trends and make decisions that will impact future performance. One-off dashboards, custom-developed programs, or stand-alone spreadsheets that don’t connect back to the trusted pool of data are generally not reliable, sustainable, or scalable. Each solution adds its own layer of query and reporting complexity and introduces associated reconciliation and usability challenges.

Analytics solutions need a rich variety of information to yield meaningful insights. With so much data fragmented across any number of systems, you need a broad reach to ensure you can connect to any and all transactional systems, warehouses (relational and online analytical processing [OLAP]), flat files, and legacy systems, as well as XML, Java Database Connectivity (JDBC), Lightweight Directory Access Protocol (LDAP), and Web Services Description Language (WSDL) sources.
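
To make that reach concrete, here is a minimal sketch of a source-agnostic ingestion layer that lands heterogeneous sources in one common tabular shape. It is an illustration, not any particular vendor’s connector layer: the connection string, file paths, and table names are hypothetical, and it assumes the pandas and SQLAlchemy libraries are available.

```python
# Minimal sketch of a source-agnostic ingestion layer. All endpoints
# and paths below are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

def read_relational(conn_str: str, query: str) -> pd.DataFrame:
    """Pull rows from any relational source SQLAlchemy can reach."""
    engine = create_engine(conn_str)
    with engine.connect() as conn:
        return pd.read_sql(query, conn)

def read_flat_file(path: str) -> pd.DataFrame:
    """Ingest a delimited legacy extract."""
    return pd.read_csv(path)

def read_xml_feed(path: str) -> pd.DataFrame:
    """Ingest an XML feed (requires pandas 1.3 or later)."""
    return pd.read_xml(path)

# Every source lands in the same tabular shape, so downstream reporting
# and analysis never need to know where the data originated.
sources = {
    "orders":  read_relational("postgresql://dw-host/sales",
                               "SELECT * FROM orders"),
    "surveys": read_flat_file("exports/customer_surveys.csv"),
    "trends":  read_xml_feed("feeds/market_research.xml"),
}
```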

Casting a wide net helps you break down the data silos that hamper analysis and allows you to deliver a timely and complete enterprise view of relevant information. Plus, when new data sources become accessible, all analytics capabilities can access that data immediately.

2. Plan a caching strategy

Performance optimization is a critical part of fast reporting and interactive analysis. Switching between different backend systems to access data is a familiar requirement, but it can seriously hinder performance if done on the fly.

Instead, create a caching strategy to both improve system performance and minimize any negative impact on the performance of source systems caused by repeated requests for data. Common techniques include enterprise information integration (EII); virtual caching; OLAP caching; caching to disk or local database; event-driven, scheduled, and manual refreshes; and advanced hybrid memory/disk utilization options.
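
To make the refresh options concrete, here is a minimal time-to-live cache sketched in Python. The QueryCache class and its interface are invented for illustration: the TTL plays the role of a scheduled refresh, and invalidate() stands in for event-driven or manual refreshes against a slower source system.

```python
import time

class QueryCache:
    """Minimal result cache with time-based and manual refresh."""

    def __init__(self, fetch_fn, ttl_seconds=300):
        self.fetch_fn = fetch_fn   # callable that hits the source system
        self.ttl = ttl_seconds     # staleness threshold, like a scheduled refresh
        self._store = {}           # query -> (fetched_at, result)

    def get(self, query):
        entry = self._store.get(query)
        if entry and (time.time() - entry[0]) < self.ttl:
            return entry[1]        # fresh enough: spare the source system
        result = self.fetch_fn(query)   # miss or stale: fetch exactly once
        self._store[query] = (time.time(), result)
        return result

    def invalidate(self, query=None):
        """Event-driven or manual refresh: drop one entry, or all of them."""
        if query is None:
            self._store.clear()
        else:
            self._store.pop(query, None)
```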

3. Adopt a common, multilingual business model

Once the IT team has accessed and integrated the data needed to provide a complete view of the organization, modelers must convert it into information that is meaningful to business users. They must also ensure that the right information reaches the right users at the right time and is delivered in the right way.

The key to delivering this information in terms that business users understand is a common metadata business model that applies consistent business rules, dimensions, and calculations to all data regardless of its source. This makes it easier for a business to accurately report and analyze information such as sales invoices, general ledger charges, and order receipts.

A common business model provides the single view of the organization necessary for reliable, cross-enterprise reporting across all roles, locations, and languages. This approach not only supports the information consistency that leads to confident decisions, but also reduces the cost of maintaining the modeling environment. It curbs report proliferation as well, since a single report can be produced for all geographies.
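
One lightweight way to picture such a model is a metadata layer that pairs each business item with a single calculation and per-locale display names. The structure below is purely illustrative, with hypothetical item names; real BI suites store far richer metadata.

```python
from dataclasses import dataclass, field

@dataclass
class BusinessItem:
    """One item in the common business model."""
    physical_expr: str                          # one calculation for all sources
    labels: dict = field(default_factory=dict)  # locale -> display name

# Hypothetical model entry: the rule is defined once, the label varies.
MODEL = {
    "net_revenue": BusinessItem(
        physical_expr="invoice.amount - invoice.discount",
        labels={"en": "Net Revenue", "fr": "Revenu net", "de": "Nettoumsatz"},
    ),
}

def label(item_key: str, locale: str) -> str:
    """Same calculation everywhere; only the display name changes."""
    item = MODEL[item_key]
    return item.labels.get(locale, item.labels["en"])

print(label("net_revenue", "fr"))  # -> Revenu net
```

Because every report resolves net_revenue through the same expression, one report definition can serve every locale.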

4. Model once, package for many

Large data warehouses can overwhelm those trying to produce reports and analyses because there are simply too many data objects to choose from. Instead, build one model and publish sections of it that address the needs of different business users or communities. Whenever possible, create reusable objects and build multi-tier models that separate physical models from business models. This will decrease the downstream effect of changes and enable you to evolve your models more easily, as well as add or change data sources and sourcing strategies.

By publishing sections of a single common business model, you avoid the pitfalls of duplication and divergence. This strategy helps decrease model proliferation, supports consistency across the enterprise, and reduces the time required to deliver different models for different user groups—and it ensures that each user community receives only the specific information it requires.
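
A toy sketch of that packaging step, with made-up item and package names: each community sees only its published subset, but every subset resolves back to the one shared model.

```python
# One shared model; each community gets a published subset of it.
FULL_MODEL = {"net_revenue", "gross_margin", "headcount",
              "inventory_turns", "campaign_response"}

PACKAGES = {
    "finance":   {"net_revenue", "gross_margin", "inventory_turns"},
    "hr":        {"headcount"},
    "marketing": {"campaign_response", "net_revenue"},
}

def publish(package_name):
    """Expose only the slice of the common model a community needs."""
    items = PACKAGES[package_name]
    missing = items - FULL_MODEL
    if missing:
        raise ValueError(f"Package references undefined items: {missing}")
    return items

print(publish("finance"))  # finance users never see the other items
```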

5. Establish role-based security

Similarly, just because you have a single common business model fueling your analytics engine doesn’t mean you want every user to see every analysis or report. Assign role-based user access to avoid the pain and expense of generating separate models or reports. The single model then restricts authorized users to their own view of the data, which may also help you comply with data governance and privacy regulations.
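
The sketch below illustrates the idea with hypothetical roles, each mapped to a simple row-level visibility rule; in practice this would typically be driven from the BI platform’s security layer rather than application code.

```python
# Hypothetical roles, each mapped to a row-level visibility rule.
ROLE_RULES = {
    "sales_east": lambda row: row["region"] == "East",
    "sales_west": lambda row: row["region"] == "West",
    "executive":  lambda row: True,   # unrestricted view
}

def rows_for(user_role, rows):
    """One shared model; each role sees only its slice of the data."""
    rule = ROLE_RULES.get(user_role)
    if rule is None:
        return []                     # unknown roles see nothing by default
    return [row for row in rows if rule(row)]

data = [{"region": "East", "net_revenue": 1200},
        {"region": "West", "net_revenue": 900}]
print(rows_for("sales_east", data))   # only the East row
```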

6. Develop models collaboratively

It’s not easy to quickly build, deploy, and maintain an effective model, so organizations typically employ teams of data modelers to accomplish this task. To maximize productivity, craft processes and deploy tools that enable modeling teams to work collaboratively. For example, data modelers will need the ability to work on different parts of the model simultaneously, without jeopardizing one another’s changes or creating “downstream ripples” before aggregating the segments into a single view.
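
As a schematic example, assume each modeler owns a segment expressed as a dictionary of item definitions; a merge step can then assemble the single view and surface conflicts before they ripple downstream. The function and structures here are hypothetical, standing in for the merge facilities of a real modeling tool.

```python
def merge_segments(*segments: dict) -> dict:
    """Combine independently edited model segments; flag conflicts early."""
    merged = {}
    for item, definition in ((k, v) for seg in segments for k, v in seg.items()):
        if item in merged and merged[item] != definition:
            raise ValueError(f"Conflicting definitions for '{item}'")
        merged[item] = definition
    return merged

# Two modelers work on separate segments, then aggregate them.
finance_segment = {"net_revenue": "invoice.amount - invoice.discount"}
sales_segment   = {"order_count": "count(order.id)"}
single_view = merge_segments(finance_segment, sales_segment)
```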
