Share VSAM data with new environments while reducing operating costs
A survey of end users of Data Integration and Integrity (DII) software conducted by IDC in 2019 found that dynamic data movement, also known as data replication, is best served by stand-alone or platform tools, not custom code. When it comes to replicating data located in Virtual Storage Access Method (VSAM) environments, it’s likely that users would likewise prefer off-the-shelf software over custom code.
Challenges of sharing data from VSAM environments
Much mission-critical data is captured, managed, and stored in VSAM environments, and this data must often be shared with new environments for analytics and integration projects. These projects include feeding a data lake, sharing data with cloud-based applications, detecting events in near real time for compliance, and deriving real-time business insights. Such projects generally improve an organization’s ability to run its business and interact with its clients. For example, an event originally generated on a transactional system can trigger a relevant text or email to a client shortly after the event occurs. In a retail scenario, that text or email could be a follow-up about items related to a recent purchase – increasing the opportunity for cross-selling success and, in turn, revenue.
Sharing such data gives rise to a number of challenges. Foremost are the mainframe operating costs (e.g., CPU consumption) of running “data capture” tools, whether off the shelf or custom coded. Staffing data integration projects with the right skills and expertise can also be difficult: for some organizations, finding employees with the appropriate System Z skills is a challenge in itself. The time required to make projects successful is another hurdle, particularly if custom-code-based solutions are chosen, as they demand more time and a specialized skill set to implement. Compounding all of this, an increasing number of analytics and integration projects demand up-to-the-second, easier, and faster access to the enterprise data that resides in VSAM.
A simple solution
For users interested in a “custom code free,” easy-to-deploy solution for near-real-time replication in support of the aforementioned analytics or integration projects, IBM Data Replication is the answer. It is part of the suite of products that led IBM to be named a Leader in the 2019 Gartner Magic Quadrant for Data Integration Tools. IBM Data Replication provides a comprehensive solution for dynamic integration of z/OS and distributed data via near-real-time, incremental delivery of data captured from database logs to a broad spectrum of database and big data targets, including Kafka and Hadoop. Read this solution brief to learn more.
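To make the idea of incremental, log-based delivery concrete, here is a minimal sketch of how a downstream consumer might apply captured change records to a target store. This is purely illustrative – the record shape, field names, and operation names below are assumptions for the example, not IBM Data Replication’s actual formats or APIs:

```python
# Illustrative sketch only (not the product's actual API): applying a stream
# of change records, as captured from a source log, to a key-value target.
# The record layout ("op", "key", "data") is an assumption for this example.

def apply_change(target, record):
    """Apply one captured change record to a dict-based target store."""
    op = record["op"]
    key = record["key"]
    if op in ("insert", "update"):
        target[key] = record["data"]   # upsert the latest row image
    elif op == "delete":
        target.pop(key, None)          # remove the row if present
    else:
        raise ValueError(f"unknown operation: {op}")
    return target

# A few example change records, in the order a target "apply" might see them:
changes = [
    {"op": "insert", "key": "C100", "data": {"name": "Ada", "balance": 50}},
    {"op": "update", "key": "C100", "data": {"name": "Ada", "balance": 75}},
    {"op": "delete", "key": "C100", "data": None},
]

target = {}
for record in changes:
    apply_change(target, record)
```

The point of the sketch is the contrast with batch extracts: only the changes flow to the target, in order, so the target converges on the source state without full reloads.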
Remote source capture engine
IBM has now introduced a new Data Replication VSAM for z/OS Remote Source capture engine that can be deployed remotely from the mainframe environment. Remote deployment offloads much of the data capture processing to a commodity environment (typically Linux), helping customers reduce mainframe operating costs. Deployment and management of the software shifts to a non-mainframe environment, which will be attractive to clients who want to reduce their reliance on specialized System Z skills. The VSAM for z/OS Remote Source capture engine integrates with the extensive array of “applies” that deliver data to many different target environments, giving clients choices as they plan their integration and analytics projects.
Learn more about your options
With efficiency and cost savings more important than ever, the ability to overcome the challenges of sharing information in VSAM environments is vital. IBM offers a solution focused on delivering simplicity, low operating cost, and performance through the addition of the new Data Replication VSAM for z/OS Remote Source capture engine to the IBM Data Replication portfolio.