Legacy Modernization with InRule and AveriSource


Dan Reynolds

10/03/2018

Legacy mainframe systems are the original pattern for ransomware: your business depends on the system, and if you stop shoveling money into it, everything comes to a screeching halt. What other choice do you have? Paying exorbitant maintenance expenses just keeps it running; making improvements to keep up with current business requirements is an entirely separate problem.

The wave of digital transformation is upon us, and when we are tied down to systems that leverage nearly 50-year-old, undocumented code, it’s hard to adapt. The workforce available to bend these systems into a form that lets companies compete with those built on modern technologies is dwindling. With the maturation of the cloud and the self-service revolution, there has never been a better time to get off these systems. The existing source code can be leveraged to enter the modern era: one where business people maintain the logic and the technology is exposed to the broader enterprise.

OK, where do we start? How do you ensure the existing functionality behaves as expected? And how do you position your company so it never ends up in this situation again?

In my experience, mainframe systems always have a very simple top-level description, something like:

“System X takes the orders and generates invoices, which are then sent out to the customers; it also tracks the payments and sends out notices if a customer’s payment is late.”

Seems easy enough; how hard can it be to port that to a new technology? The reality is that once you start peeling back the layers of the mainframe onion, it gets much more complicated. There are countless nuances with far-reaching impacts on the business. For example: if the third letter of the order status field is a “Q”, then the order is subject to delivery fees that differ by zip code, unless, of course, it is a government order, which means we need to calculate shipping in an entirely different manner. You get the idea. Like most systems, System X has been complicated by decades of one-off enhancements to handle an ever-growing number of exceptions to the logic. How can we ensure that we don’t have to rediscover and relearn the same things during the modernization effort?
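
To make that nuance concrete, here is a minimal sketch of the kind of conditional logic that ends up buried in a system like this. The field names, statuses, and fee values are invented for illustration; the real logic would live in COBOL and be far more tangled.

```typescript
// Hypothetical sketch of buried shipping logic; names and values are invented.
interface Order {
  status: string;        // e.g. "ABQ" -- the third character drives the fee logic
  zipCode: string;
  isGovernment: boolean;
  subtotal: number;
}

// Decades of one-off additions tend to accumulate in lookup tables like this.
const DELIVERY_FEE_BY_ZIP: Record<string, number> = {
  "60601": 12.5,
  "10001": 15.0,
};

function calculateDeliveryFee(order: Order): number {
  if (order.status.charAt(2) === "Q") {
    if (order.isGovernment) {
      // Government orders use an entirely different calculation.
      return order.subtotal * 0.02;
    }
    return DELIVERY_FEE_BY_ZIP[order.zipCode] ?? 9.99; // fallback fee
  }
  return 0; // other statuses carry no delivery fee
}
```

Every one of those branches is a business decision that someone, at some point, understood, and none of it is visible from the top-level description of System X.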

Let’s take stock of what we have in our existing system. We have the documentation, which, in my experience, is usually incomplete, inaccurate, or both. We have the code base, and we know what the data looks like before and after the system processes it. It’s important to understand that the ultimate cost of the project will be determined by how well we leverage these existing assets.

Once we know what we have, we can evaluate our migration options. We can simply pick a new platform and start slingin’ code. This is the approach most companies take, and it works, but it’s expensive and takes a long time. One wrong assumption can invalidate months of work. And the unfortunate reality is that we haven’t improved our situation much, because this approach leaves the logic buried in code. Sure, if things are done well, we will have a shiny new system on a modern platform and a bigger talent pool to hire from to maintain it. The system may be easier to scale, and it may be accessible to more people.

However, just like with System X, we still need developers to maintain the logic, which means we are still at the mercy of a software development lifecycle that can be slow and arduous. Not to mention that because these projects take time, it is very common to be making changes to the old system while the new system is being built. This complicates manual recoding efforts, because parallel changes must now be made in the new system throughout the project.

Another approach to this legacy modernization conundrum is to automate a process that harvests the hard-to-reach knowledge accumulated over the years directly from the code. Once that knowledge is harvested, we can swiftly convert it into a form that business people can easily maintain, making it accessible to the whole enterprise.
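
What does “a form business people can maintain” look like? As a rough illustration only (this is not InRule’s actual rule format), the harvested shipping logic from the earlier sketch could be expressed declaratively, as data a business analyst can read and edit rather than procedural code:

```typescript
// Illustrative only: harvested logic expressed as declarative rule data.
// This is a generic sketch, not InRule's rule representation.
interface ShippingRule {
  name: string;
  appliesWhen: { statusThirdChar?: string; isGovernment?: boolean };
  feeType: "byZipTable" | "percentOfSubtotal" | "none";
  rate?: number;
}

const shippingRules: ShippingRule[] = [
  {
    name: "Government Q-status orders",
    appliesWhen: { statusThirdChar: "Q", isGovernment: true },
    feeType: "percentOfSubtotal",
    rate: 0.02,
  },
  {
    name: "Standard Q-status orders",
    appliesWhen: { statusThirdChar: "Q", isGovernment: false },
    feeType: "byZipTable",
  },
  { name: "All other orders", appliesWhen: {}, feeType: "none" },
];
```

The point is that each rule carries a name and a readable condition, so the knowledge is no longer locked behind a compiler.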

It’s worth noting that one of the biggest challenges with these projects is knowing that the logic behaves the same in the old and new systems. An automated process can keep pace with ongoing changes, provide traceability between the systems, and siphon off existing workloads for automated testing to verify identical behavior between the two.
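
That last piece, replaying existing workloads against both implementations, is straightforward to sketch. The runner functions below are hypothetical stand-ins for adapters to the legacy system and the migrated rules; only the comparison pattern matters.

```typescript
// Minimal sketch of parallel testing: replay captured production inputs
// through both implementations and flag any divergence for investigation.
type CapturedInput = Record<string, unknown>; // shape of a captured production record (assumed)

function findMismatches(
  capturedInputs: CapturedInput[],
  runLegacy: (input: CapturedInput) => string,   // serialized output from the legacy system
  runMigrated: (input: CapturedInput) => string, // serialized output from the migrated rules
): CapturedInput[] {
  // Identical behavior is the requirement, so any difference is a defect to chase down.
  return capturedInputs.filter((input) => runLegacy(input) !== runMigrated(input));
}
```

Run this continuously against siphoned-off production traffic and you have an ongoing, automated answer to the question “does the new system really do what the old one did?”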

The teams at InRule and AveriSource have partnered to create a solution that takes this approach, and we’re pretty excited about the outcome. The solution is built on the rich expertise of both organizations, and the result is much more than converting COBOL code to business rules. The transparency created in the process helps the business respond to ever-growing industry demands and stay relevant. Add to that the ability to deploy the solution in a variety of ways on modern tech stacks, and your company will have the flexibility it needs to scale.

Here’s how it works:

Legacy Modernization Process

Once the logic is analyzed by AveriSource and migrated to InRule, you will have a better understanding of it, and it will be in a form the business can maintain quickly and easily. It can also be deployed in a variety of ways (e.g., on-prem or in the cloud; in-process or as a service; in .NET or JavaScript) to give your company the flexibility to scale your business.

For more information on legacy modernization, see https://www.inrule.com/solutions/legacy-modernization/. For deployment options, see https://www.inrule.com/deploy/.

Last but not least, feel free to email me directly at dreynolds@inrule.com if you have questions.