CTOvision Special Report: Mainframe Offloading.
Software AG can store your commonly used datasets in memory, reducing MIPS and overall cost.
Many government agencies still operate large environments built on mainframe hardware. This hardware is difficult and expensive to maintain, and it locks future development efforts into decades-old technologies such as JCL and COBOL. Engineers trained in these technologies are scarce and expensive.
Typically, organizations that rely on mainframes see a large portion of their overall IT budgets dedicated to keeping those mainframes running. This is due in part to the way mainframes are traditionally licensed: by MIPS, essentially a measure of how much processing the mainframe performs. The more you use, the more you pay. The inverse also holds, which is why our customers are able to reduce these costs by up to 80%!
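The usage-based pricing model can be sketched with simple arithmetic. The rates and MIPS figures below are hypothetical, chosen only to illustrate how a reduction in MIPS consumed translates directly into a proportional reduction in license cost; they are not actual Software AG or mainframe-vendor pricing.

```python
# Illustrative only: cost-per-MIPS and usage numbers are hypothetical.
def annual_mips_cost(mips_used, cost_per_mips):
    """Usage-based licensing: cost scales linearly with MIPS consumed."""
    return mips_used * cost_per_mips

baseline  = annual_mips_cost(mips_used=2_000, cost_per_mips=3_000)  # before offloading
offloaded = annual_mips_cost(mips_used=400,   cost_per_mips=3_000)  # 80% fewer MIPS

savings = 1 - offloaded / baseline
print(f"Annual savings: {savings:.0%}")  # Annual savings: 80%
```

Because the license fee is linear in MIPS, every unit of processing moved off the mainframe comes straight off the bill, which is the mechanism behind the cost reductions described above.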
Software AG has decades of experience in mainframe integration, paired with some of the most powerful and revolutionary software available. We combine these to provide real-time bi-directional integration with mainframes of all types. This allows our customers to unlock their mainframes, exposing data and business logic to open systems that run on much less expensive commodity hardware.
Using these powerful mainframe integration components as a foundation, our customers can then leverage the Terracotta in-memory data management technology to "move" data off the mainframe. This directly and immediately reduces MIPS consumption and, therefore, cost.
The first step toward reducing your mainframe costs is to establish integration to and from the mainframe. Software AG offers several software adapter alternatives, including a zero-footprint option that requires no change to your existing mainframe. We provide mainframe experts to help our customers choose the appropriate options given their requirements.
Next, you will work with our mainframe and data management experts to determine which datasets to move off the mainframe. Software AG's Terracotta technology allows you to hold this data in ultra-fast system memory (RAM). Inexpensive commodity hardware can provide terabytes of in-memory data, scaling to even the largest datasets, and server arrays can be clustered into highly available distributed data stores.
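The offloading pattern described above can be sketched as a read-through in-memory store: requests are served from RAM when possible, and the mainframe is consulted only on a miss. This is a minimal illustration of the concept, not Terracotta's actual API; the class and function names here are hypothetical.

```python
# Minimal sketch of read-through offloading. In production this role is
# played by Terracotta; all names here are hypothetical placeholders.
class InMemoryOffload:
    def __init__(self, mainframe_lookup):
        self._store = {}                 # dataset held in commodity RAM
        self._lookup = mainframe_lookup  # expensive, MIPS-consuming call
        self.mainframe_reads = 0         # track how often the mainframe is hit

    def get(self, key):
        if key not in self._store:       # miss: one-time mainframe read
            self.mainframe_reads += 1
            self._store[key] = self._lookup(key)
        return self._store[key]          # hit: served from memory, no MIPS used

def mainframe_query(account_id):
    # Stand-in for a real mainframe transaction (hypothetical data).
    return {"account": account_id, "balance": 100}

cache = InMemoryOffload(mainframe_query)
for _ in range(1000):
    cache.get("ACCT-42")                 # 1,000 application requests...
print(cache.mainframe_reads)             # ...but only 1 mainframe read
```

Every request after the first is answered from memory, which is how moving frequently used datasets off the mainframe cuts MIPS consumption while also providing the fast, open access layer described in the next paragraph.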
By moving frequently used data into memory, our customers not only reduce mainframe license costs, but also establish a high-performance flexibility layer to securely expose agency data.