The 2017 Data Governance Winter Conference on 4–8 December covered topics ranging from how to start a data governance program to attaining data governance maturity to how to improve your organization’s information quality. After attending, Brian Mayer, business-ready data practitioner, and Mark
The data lake may be all about Apache Hadoop, but integrating operational data can be a challenge. Learn how to deliver real-time feeds of transactional data from mainframes and distributed environments directly into Hadoop clusters and make constantly changing data more available.
The new Gartner Magic Quadrant (MQ) for Master Data Management has been published, and what you might not notice at first glance is that this year, IBM chose not to participate. Gartner still included IBM in the MQ. However, we did decline to engage in the process and provide detailed data for
What is driving change in the world of data? In his keynote from the Big Data Summit KC 2017, our Making Data Simple podcast host and IBM Analytics VP Al Martin addresses disruption, the data maturity model and the five areas businesses must get right to succeed in the era of cognitive computing.
In most modern-day organizations, external macro forces tend to be very influential, leaving little bandwidth for optimizing data governance. And, as stricter data storage and security compliance regulations come into play, it's becoming more and more critical for organizations to ensure they have the
In today’s energy industry, one of the key priorities is finding new ways to cost-efficiently keep up with insatiable demands for power, while also delivering renewable energy. You must be able to predict when events will occur and make the first move. Being first to respond to customer or market
Typically, ingesting streaming event data, persisting it with low latency and analyzing it alongside historical event data requires integrating multiple analytic systems. IBM Db2 EventStore is purpose-built to simplify the complexity of harnessing event data with a single system. Its unique
Data is already the new currency and is at the heart of everything digital. I like to repeat the adage, “Data becomes information, becomes knowledge, becomes wisdom,” and “It’s all about the data.” So why do we send up probes, sensors or satellites? For the data.
Universal connectivity is fueling streams of event data from a variety of event sources. Increasingly, organizations are developing and deploying event-driven applications to harness the growing volumes of event data. IBM Db2 EventStore offers a scalable integrated system for enterprises to ingest
In the connected world of today’s digital economy, apps, IoT devices, vehicles, appliances and servers are generating endless streams of event data. The stream of events describes what is happening over time and offers the opportunity to track and analyze things as they happen.
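To make the idea of tracking things as they happen concrete, here is a minimal sketch of one common pattern for analyzing an event stream: a sliding-window counter that tallies events by type over a recent time window. This is a generic illustration in plain Python, not the Db2 EventStore API; the class and parameter names are hypothetical.

```python
from collections import deque
from time import time

class RollingEventCounter:
    """Tallies events per type over a sliding time window (in seconds)."""

    def __init__(self, window_seconds=60.0):
        self.window = window_seconds
        self.events = deque()  # (timestamp, event_type), oldest first

    def record(self, event_type, timestamp=None):
        """Append an event; timestamp defaults to the current time."""
        ts = time() if timestamp is None else timestamp
        self.events.append((ts, event_type))
        self._evict(ts)

    def counts(self, now=None):
        """Return {event_type: count} for events inside the window."""
        self._evict(time() if now is None else now)
        tally = {}
        for _, etype in self.events:
            tally[etype] = tally.get(etype, 0) + 1
        return tally

    def _evict(self, now):
        # Drop events that have aged out of the window.
        while self.events and self.events[0][0] < now - self.window:
            self.events.popleft()
```

For example, recording "click" events at t=0 and t=30 and a "view" at t=50 yields both clicks at t=59, but only one at t=70, once the t=0 event has aged out of the 60-second window.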
It seems that we’re reaching the point where the Internet of Things (IoT) is moving from the domain of enthusiastic early adopters to the more challenging, more profitable territory of mainstream enterprise technology. Event-driven architectures are playing a key role in these types of applications
If you read a lot of development blogs nowadays, you’ll probably notice a common theme: developers don’t want to deal with databases. They want to focus on designing, building, testing, and deploying applications that deliver value to the business as quickly as possible.
Big data isn’t just getting bigger. It’s getting more valuable. As companies work to unlock more value from their data, one of the biggest challenges to address is disconnected data silos. Big companies don’t have one data lake; they have data lakes, ponds and pools.
Recently, I had the honor of speaking with a number of the world’s most influential thought leaders in the fields of data science, data analytics, machine learning and digital transformation. This group of prominent data technologists was more than happy to answer a wide variety of questions on