Blogs

Face It, Big Data Is the New Normal

As information becomes broadly available, whether to have a big data strategy is no longer the question

Without a doubt, data is king. Big data dominates the conversation in business and IT, evoking new paradigms, fresh insight, and new ways of using data. But has data now reached a level that elevates it to even greater heights?

Data has been a driving force in the business world for quite some time, forming the basis of business decisions and providing insight into customer needs, sales profitability, and operational costs. When the right data was collected and used correctly, it provided tremendous value to organizations. This practice gave rise to entire industries, data warehousing and business intelligence (BI), in the 1990s, and both continue to thrive today. Now, however, the need to handle large volumes of data created at incredible velocity and in a wide variety of formats has left traditional technology tools struggling to keep up.

The reality is that data has not only become pervasive; it is woven into the fabric of everyday business and life. Consider the simple act of going out for a jog. Smart devices and instrumentation can record the speed and distance the person traveled, the path of the jog, when the jog was completed, and the number of calories burned along the way. And these measures can be published for all to see through social media networks and channels.

In the context of business, a company can know when and for how long a person visited its website, which products and services were reviewed, and what the customer experience was like. It can even determine which products were dropped from the shopping cart. Armed with that information, the organization can engage the customer through various interaction channels, perhaps offering a discount.

These examples and many others indicate that a truly data-based economy is at work. The reach of big data is no longer constrained to people in data management and development roles. Today, business and IT leaders in all sectors have to confront both the implications and the opportunities big data offers. They are driven by the need to stay ahead of the competition, differentiating themselves in a saturated market and operating more efficiently than their rivals.


Exploiting widely available data

To achieve these objectives, organizations should take advantage of the opportunities that broadly available data affords them. Data comes from traditional internal corporate systems and customer conversations such as call center recordings, as well as from external sources such as social media, weather, traffic, and network data. Today's digital age allows organizations to exploit new kinds of data to create additional insight and develop compelling business models. Broadly speaking, big data creates value along three dimensions: access to ever-larger volumes of data, data in new and varied structures, and data arriving with greater frequency.

By having access to so much more data, organizations can add context to the data and models they already possess. Many data repositories that held little or no value before the era of big data can now be unlocked by linking disparate data sets, much as the pieces of a jigsaw puzzle interlock.
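To make the jigsaw metaphor concrete, here is a minimal sketch of linking two previously separate data sets on a shared key. All names and values are invented for illustration; in practice the same idea scales up to joining internal records with external feeds such as weather or traffic data.

```python
# Hypothetical illustration: linking two disparate data sets by a shared key
# (the date), so that one adds context to the other. Values are made up.

daily_sales = {
    "2014-03-01": {"store": "Downtown", "revenue": 18250.0},
    "2014-03-02": {"store": "Downtown", "revenue": 22110.0},
}

daily_weather = {
    "2014-03-01": {"condition": "rain", "high_f": 48},
    "2014-03-02": {"condition": "sunny", "high_f": 61},
}

# Join on the shared date key to attach weather context to each sales record.
linked = {
    day: {**sales, **daily_weather.get(day, {})}
    for day, sales in daily_sales.items()
}

for day, record in sorted(linked.items()):
    print(day, record)
```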

A significant proportion of today's information is unstructured, stored as documents, images, audio, tweet streams, and more. By analyzing and mining unstructured content, organizations can tap information that was previously locked away in file servers, document management systems, and call transcripts behind their own firewalls, as well as external data available on the web.

The value of data to any organization also varies with its age. Knowing that a customer has deposited a large sum of money in a bank account early in the day, for example, gives a bank the opportunity to analyze that customer's propensity to accept a cross-sold, high-interest investment product. The bank can tailor the offering before the customer spends the deposit or moves the funds to another financial institution. With so much data being generated at Internet scale and speed, filtering out the noise to find the useful data means decisions can be made quickly and at the right moment.


Taking the information superset approach

Fortunately, technology has caught up with the volume, velocity, and variety of today's data. Practices such as change data capture (CDC) have been around for a while, allowing near-real-time data to be read from database logs for intraday reporting. Stream-based computing takes these technologies a step further, enabling organizations to ingest data in its natural form as it arrives and to apply continuous filters and queries that produce analytics in real time.
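The sketch below illustrates the stream-computing idea in miniature: a standing filter runs continuously over events as they arrive, rather than after they have been loaded into a database. Plain Python generators stand in for a real streaming engine, and the event fields and threshold are invented for the example.

```python
# A minimal sketch of continuous filtering over a stream of events.
# Generators stand in for a streaming engine; fields and thresholds are
# hypothetical.

import random
import time

def sensor_stream(n_events=20):
    """Simulate a feed of raw events arriving one at a time."""
    for i in range(n_events):
        yield {"id": i, "value": random.uniform(0, 100), "ts": time.time()}

def continuous_filter(stream, threshold=80.0):
    """A standing query: pass through only the events worth acting on."""
    for event in stream:
        if event["value"] >= threshold:
            yield event

alert_count = 0
for alert in continuous_filter(sensor_stream()):
    alert_count += 1
    # In a real deployment, this is where an action (an alert, an offer,
    # an update) would fire while the data is still fresh.
    print(f"alert {alert_count}: event {alert['id']} value={alert['value']:.1f}")
```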

By keeping and using all their data, rather than sampling small data sets of known value to build predictive models, many organizations are shifting toward thinking of data as an information superset. A parallel movement favors rapid, ad hoc analytics, where failing fast is an acceptable outcome of hypothesis testing. High-end analytic engines handle complex data processing so business leaders can make decisions faster than ever. And Apache Hadoop technology is augmenting the traditional data warehouse, bringing analytics close to the data and avoiding the need to move, transform, and model it before provisioning it to information-consuming tools.
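As a rough illustration of analyzing raw data without an upfront transform or schema, the toy example below counts product views directly from unmodeled log lines using a map and reduce step. The log format is invented; in a Hadoop environment the same mapper and reducer logic could be expressed as scripts run over files in place rather than over an in-memory list.

```python
# Toy sketch: analytics run directly against raw, unmodeled log lines.
# The log format and product names are hypothetical.

from collections import Counter

raw_log_lines = [
    "2014-03-01T09:12:44 view product=widget-a user=1001",
    "2014-03-01T09:13:02 view product=widget-b user=1002",
    "2014-03-01T09:13:37 view product=widget-a user=1003",
    "2014-03-01T09:14:10 cart_drop product=widget-b user=1002",
]

def map_line(line):
    """Map step: emit (product, 1) for every product view, skip other events."""
    fields = line.split()
    if len(fields) >= 3 and fields[1] == "view":
        product = fields[2].split("=", 1)[1]
        yield (product, 1)

# Reduce step: sum the emitted counts per product.
views = Counter()
for line in raw_log_lines:
    for product, count in map_line(line):
        views[product] += count

print(views.most_common())
```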

Analytics is no longer just about structured data. Text and content analytics software is now broadly available and is used to understand customer experience, sentiment, and behavior, enhancing how organizations profile and serve customers.
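To show the flavor of text analytics on unstructured content, here is a deliberately simple keyword-based sentiment score applied to call-transcript snippets. Real text and sentiment analytics tools are far more sophisticated; the word lists and snippets below are invented purely for illustration.

```python
# A toy sentiment scorer: count positive and negative keyword hits in a
# snippet of unstructured text. Word lists and transcripts are hypothetical.

POSITIVE = {"great", "helpful", "thanks", "love", "easy"}
NEGATIVE = {"frustrated", "cancel", "slow", "broken", "unhappy"}

def sentiment_score(text):
    """Return positive hits minus negative hits for one snippet."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

transcripts = [
    "Thanks, the new statement view is great and easy to use.",
    "I'm frustrated, the app is slow and I want to cancel my plan.",
]

for snippet in transcripts:
    print(sentiment_score(snippet), snippet)
```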

With these capabilities, businesses no longer need to agonize over what their big data strategy should be. Instead, organizations can simply act: glean insight from the information at hand. They can treat information as broadly available for making decisions that increase profits, sharpen customer focus, and improve efficiency.

Consider the term e-business, coined by IBM in 1996. E-business is more than just e-commerce. Today, companies apply information and communications technology to activities across the value chain, whether those activities run internally on an intranet or externally over an extranet, the Internet, or the web.

In the same way e-business has become essential to how many organizations operate, big data is expected to eventually become, well, just data. Big data is not revolutionary; it is evolutionary, widening access to larger data volumes and to new ways of using them. And big data goes well beyond Hadoop. It is an approach to extracting insight that becomes second nature to organizations. The key is to start with the question, and then go to the data.

Please share any thoughts or questions in the comments.