Context-aware stream computing helps you become more responsive to emerging opportunities. By using innovative technologies to understand the context of data and analyze data in real time, you can put data to work.
Dwaine Snow is a Global Big Data and Data Science Technical Sales Manager at IBM. He has worked for IBM for more than 20 years, focusing on relational databases, data warehousing, and the new world of big data analytics. He has written eight books and numerous articles on database management.
Although NoSQL database technology has been around for a long time (predating SQL, in fact), its popularity did not really take off until the advent of Web 2.0, when companies such as Google and Amazon began using the technology. Market Research Media forecasts the NoSQL market to reach $3.4 billion.
IBM Analytics VP of Marketing Jeff Spicer sits down with Data Scientist and evangelist Dez Blanchfield to recap IBM InterConnect 2017 and give his insights into a few of the announcements from this year's event.
Building a data lake is one of the stepping stones toward data monetization and many other advanced revenue-generating and competitive-edge use cases. What are the building blocks of a "cognitive trusted data lake" enabled by machine learning and data science?
Data science is a team sport that involves specialists with complementary skills and aptitudes. Successful data science initiatives depend on high-performance team collaboration. Like the fictional sleuth and his partner, IBM's customers in the data science community must have the right mix of complementary talents.
Quite often, the need for data security and governance makes organizations hesitant about migrating to the cloud. This is perfectly understandable given the types of data gathered and used by businesses today and the regulations they must adhere to at both a local and global level.
With the Geospatial Analytics service in IBM Bluemix, you can monitor moving devices from the Internet of Things. The service tracks device locations in real time with respect to one or more geographic regions, and it can serve as a building block in location-aware applications.
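The core idea the service builds on, deciding whether a device's reported position falls inside a geographic region, can be sketched in a few lines. This is a minimal stand-in, not the Bluemix service's actual API: the region, coordinates, and function name are hypothetical, and a simple ray-casting point-in-polygon test substitutes for the service's own region logic.

```python
def point_in_region(lon, lat, region):
    """Return True if (lon, lat) falls inside the polygon `region`,
    given as a list of (longitude, latitude) vertices."""
    inside = False
    n = len(region)
    for i in range(n):
        x1, y1 = region[i]
        x2, y2 = region[(i + 1) % n]
        # Count crossings of a horizontal ray cast from the point:
        # an odd number of crossings means the point is inside.
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Hypothetical rectangular region roughly bounding lower Manhattan.
manhattan = [(-74.02, 40.70), (-73.97, 40.70),
             (-73.97, 40.75), (-74.02, 40.75)]

print(point_in_region(-74.00, 40.72, manhattan))  # device inside the region
print(point_in_region(-73.90, 40.72, manhattan))  # device outside
```

In a real deployment, a service like Geospatial Analytics would run this kind of test continuously over a stream of device positions and raise events on region entry and exit, rather than evaluating single points on demand.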
This white paper discusses the advantages of using the PySpark API, which enables the use of Python to interact with the Spark programming model. It starts with a basic description of Spark and then describes PySpark, its benefits, and when it is appropriate to use instead of the open source "pandas" library.
This is the second in a series of blogs on analytics and the cloud. We will consider the rise of the Internet of Things (IoT), the analytics applied to IoT data, and how the cloud can be used to drive value from instrumenting a very wide range of 'things'.
There is a growing need for versatile, hybrid architectures that combine the best of data warehousing and big data analytics. The cloud is the perfect solution because it makes it easier to build a robust data warehouse as a central "hub" and then add other environments around it as needed.