
How Mainframe Offloading can help modernize legacy systems and save costs


Today, mainframes are still widely used in data-centric industries such as banking, finance, and insurance. 92 of the world’s top 100 banks rely on these legacy technologies, and they are believed to handle around 90% of all global credit card transactions (source: Skillsoft).

This comes at a cost: relying on mainframes generates high operational expenses, which are typically measured in MIPS (millions of instructions per second). A large institution running a 15,200 MIPS mainframe can spend more than $16 million per year (source: Amazon Web Services).

In addition, mainframes come with technical complexities, such as their reliance on the 60-year-old COBOL programming language. For organizations, this means not only reduced data accessibility and infrastructure scalability, but also the difficulty of finding skilled COBOL programmers at a reasonable cost.

Moreover, consumers are now used to sophisticated on-demand digital services (call it the “Netflix effect”): everything must be available immediately and everywhere. Banking services such as trading, home banking, and financial reporting need to keep pace, offering reliability and high performance. To do that, large volumes of data must be quickly accessed and processed by web and mobile applications, and mainframes may not be the answer.

Mainframe Offloading to the rescue

Mainframe Offloading can solve the conundrum. It entails replicating the mainframe data to a parallel database, possibly open source, that can be accessed in a more agile way while saving expensive MIPS. Acting as a sort of “Digital Twin” of the mainframe, the replicated data store can be used for data analysis, applications, cloud services and more.

This form of database replication provides significant advantages in both flexibility and cost reduction. Whenever an application or a service needs to read customers’ data, it can access the parallel database without paying for expensive mainframe MIPS. Moreover, offloading alone paves the way for a progressive migration to the cloud, for example by enabling bidirectional replication of information between the open-source cloud database and the data center.

Offloading data from the mainframe requires middleware tools for migration and integration. Apache Kafka can be leveraged as a reliable solution for event streaming and data storage, thanks to its distributed and replicated log capabilities. It can integrate different data sources into a scalable architecture with loosely coupled components. 
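
As a minimal sketch of this building block (not a prescription), the Java snippet below uses the Kafka AdminClient to create a compacted, replicated topic suitable for holding the latest state of offloaded records. The topic name, partition count, and broker address are illustrative assumptions, and the replication factor of 3 assumes a multi-broker cluster.

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.common.config.TopicConfig;

import java.util.Collections;
import java.util.Map;
import java.util.Properties;

public class OffloadTopicSetup {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Hypothetical broker address; replace with your cluster's bootstrap servers.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // A compacted topic keeps the latest record per key (e.g. per account number),
            // which suits a "digital twin" of mainframe data; replication gives durability.
            NewTopic topic = new NewTopic("mainframe.customers", 6, (short) 3)
                    .configs(Map.of(
                            TopicConfig.CLEANUP_POLICY_CONFIG, TopicConfig.CLEANUP_POLICY_COMPACT,
                            TopicConfig.MIN_IN_SYNC_REPLICAS_CONFIG, "2"));
            admin.createTopics(Collections.singletonList(topic)).all().get();
        }
    }
}
```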

Alongside the event streaming platform, CDC (Change Data Capture) tools should also be considered to push data modifications from the mainframe into the streaming platform. CDC is a software process that automatically identifies and tracks updates in a database. It makes it possible to overcome the limitations of batch data processing in favour of near-real-time transfer. While IBM and Oracle offer proprietary CDC tools, such as InfoSphere Data Replication and Oracle GoldenGate, third-party and open-source solutions are also available, like Qlik Data Integration (formerly known as Attunity) and Debezium.
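
As a rough illustration, assuming a Debezium-style connector that publishes JSON change events (with the default envelope exposing the row state in an "after" field), a downstream Java consumer could read those events and upsert them into the parallel database. The topic name, group id, and broker address below are assumptions, not part of any specific setup.

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class CdcEventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "offload-sink");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        ObjectMapper mapper = new ObjectMapper();

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Hypothetical topic written by the CDC connector for a DB2 table.
            consumer.subscribe(Collections.singletonList("mainframe.db2.customers"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Assumes a Debezium-style JSON envelope with schemas enabled:
                    // the post-change row state lives under payload.after.
                    JsonNode event = mapper.readTree(record.value());
                    JsonNode after = event.path("payload").path("after");
                    // Here the row would be upserted into the parallel (offload) database.
                    System.out.println("Upsert: " + after);
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```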

From Offloading to Replacement

Beyond cost savings, Mainframe Offloading can also be seen as the starting point for mainframe replacement proper, with both applications and mission-critical core banking systems running in the cloud. The expensive monolithic architecture would then give way to modern, future-proof, cloud-native solutions.

Yet replacing a mainframe is neither an easy nor a quick task. In his blog article “Mainframe Integration, Offloading and Replacement with Apache Kafka”, Kai Waehner hypothesizes a gradual 5-year plan. First, Kafka is used for decoupling between the mainframe and the already-existing applications. Then, new cloud-based applications and microservices are built and integrated into the infrastructure. Finally, some or even all mainframe applications and mission-critical functionalities are replaced with modern technology.

It must be said that it is often not possible to switch off mainframes altogether. For larger institutions, such as major banks, the costs and inconveniences of a full migration may simply be too high. Realistically speaking, the most effective scenario is often a hybrid infrastructure in which certain core banking functionalities remain tied to the mainframe while others are migrated to a multi-cloud environment.

How Bitrock can help

Given the complexity of the operation, it is fundamental to work with a specialized partner with thorough expertise in offloading and legacy migration. At Bitrock, we have worked alongside major organizations to help them modernize their infrastructure, save costs and support their cloud-native transition. By way of example, we have carried out a mainframe offloading project for an important consumer credit company, transferring data from a legacy DB2 database to a newer Elastic database. Thanks to the Confluent platform and a CDC system, data are now intercepted and pushed in real time from the core system to the front-end database, enabling advanced use cases.
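
For illustration only, and not a description of the actual project internals: the sketch below shows how a single change record could be indexed into Elasticsearch with the 7.x high-level REST Java client. In practice, a Kafka Connect Elasticsearch sink connector typically handles this step without custom code; the host, index name, and document contents here are made up.

```java
import org.apache.http.HttpHost;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.common.xcontent.XContentType;

public class ElasticUpsertSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical Elasticsearch endpoint.
        try (RestHighLevelClient client = new RestHighLevelClient(
                RestClient.builder(new HttpHost("localhost", 9200, "http")))) {
            String key = "customer-42";                        // record key taken from the CDC event
            String doc = "{\"id\":42,\"name\":\"Jane Doe\"}";  // "after" state extracted upstream
            IndexRequest request = new IndexRequest("customers") // index name is an assumption
                    .id(key)
                    .source(doc, XContentType.JSON);
            // Indexing by key overwrites the previous version, giving an idempotent upsert.
            client.index(request, RequestOptions.DEFAULT);
        }
    }
}
```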

If you want to know more about this success story or how we can help you with your journey from legacy to cloud, please do not hesitate to contact us!
