The cost of maintaining legacy systems hit home in the early weeks of the COVID-19 pandemic. Governors in New Jersey, Kansas, and Connecticut sounded the alarm about how their states were struggling to process an unprecedented number of unemployment claims. These states all faced the same problem: the computer programs that had been put in place in the 1970s were choking on the volume of applications, and they couldn’t find enough people with COBOL programming skills to update them.

While a pandemic is an unusual circumstance, IT executives in financial services, manufacturing, airlines, and other mature industries can sympathize with the governors’ concerns about the cost of maintaining legacy systems. Aging software is also a growing liability for businesses that are under siege from a new crop of born-on-the-web competitors. So why do many organizations put off the task of legacy data modernization?

One reason is fear of the unknown. Many legacy applications are poorly documented and were developed by programmers who left the company years ago. No one knows how they work or what interdependencies exist between them and other programs. Changing these systems could create a cascade effect that leaves the organization in a worse position than before.

Another reason is the human inclination to leave well enough alone. If software is doing its job, why spend the money and time to replace it? And when budgets are tight, organizations are reluctant to fund new initiatives to update programs that they don’t believe would add much value to the business. Still, there are several reasons why these updates are critical.

Maintenance burden

The cost of maintaining legacy systems can hinder innovation over time. A 2018 Deloitte survey found that the average enterprise spends 57 percent of its IT budget on supporting business operations and only 16 percent on boosting innovation. In the meantime, finding people with the skills to support software written in outdated languages like COBOL is becoming more difficult and expensive with every passing year.


While other older languages, such as Perl, Natural, and assembly language, may still be propping up aging systems, COBOL, a language that dates back to 1959, is a major hurdle in legacy modernization. A 2017 Reuters report estimated that 220 billion lines of COBOL were still in use, and a good portion of that code is embedded in state agencies across the US. To put the staffing problem into perspective, the average COBOL programmer is between 45 and 55 years old.

In addition, Forrester reported in 2018 that businesses had lost 23 percent of their mainframe workforce over the previous five years, and 63 percent of those jobs would go unfilled for the foreseeable future. The dwindling labor pool even prompted IBM to launch a free online course in COBOL last year.

The cost of maintaining legacy systems takes a toll in other ways as well. For example:

  • It limits the ability of organizations to extend and enhance their customer-facing applications with mobile and self-service interfaces.
  • Legacy systems often can’t easily accommodate changes to the business model. Yet many such changes, like the shift toward subscription pricing that’s occurring across numerous industries, may be necessary to remain profitable in the current business environment.
  • Limitations in hardware capacity, memory, and storage can cause older applications to bog down under growing transaction volumes.

The obvious solution to the cost of maintaining legacy systems is to modernize, but cost concerns can hold CIOs back from tackling big modernization projects. A recent survey by software integration provider Dell Boomi found that the principal reason enterprise resource planning (ERP) modernization projects fail is inaccurate cost estimation.

However, maintaining legacy platforms also exacts costs that grow over time. Finding people who can program in old code gets more difficult and expensive each year, and today’s top developers have little interest in working on antiquated platforms.

Old workflows encoded in software can’t easily be streamlined, which drags down productivity. “One of the most important signs that a business’ technology is doing more to harm than help is if productivity starts to flatline,” said Tirena Dingeldein of the software review site Capterra in an interview with CIO.com.

Beyond cost

Legacy data modernization is about more than dollars and cents alone. Organizations need to evaluate the business risks of maintaining platforms that can’t easily adjust to changes in the business landscape. They must consider the competitive implications of being unable to integrate new technologies like streaming data, distributed ledger, or AI. And if legacy software breaks down under a heavy load, how might the business be affected?

The good news is that there are more modernization options than ever today, and their costs have come down. For example, enterprise complexity analysis tools can identify interdependencies in old code and reduce the risk of collateral damage. Another worthy consideration is refactoring, a rapidly evolving technique for restructuring software internally without changing its external behavior, so that the system keeps running even while it is being improved.
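To make the idea concrete, the Python sketch below shows refactoring at its smallest scale; the invoice functions and pricing rules are hypothetical, invented purely for illustration. The external behavior (the returned total) stays the same, while the internal structure becomes easier to test and change.

# Before: one tangled function mixing price lookup, discount, and tax logic.
def invoice_total_legacy(items, customer_type):
    total = 0.0
    for price, qty in items:
        total += price * qty
    if customer_type == "wholesale":
        total = total * 0.9
    return round(total * 1.07, 2)  # 7% tax hard-coded inline

# After: the same behavior, split into named steps that can be tested
# and modified independently.
TAX_RATE = 0.07
DISCOUNTS = {"wholesale": 0.10}

def subtotal(items):
    return sum(price * qty for price, qty in items)

def apply_discount(amount, customer_type):
    return amount * (1 - DISCOUNTS.get(customer_type, 0.0))

def invoice_total(items, customer_type):
    return round(apply_discount(subtotal(items), customer_type) * (1 + TAX_RATE), 2)

if __name__ == "__main__":
    order = [(19.99, 2), (5.00, 10)]
    # Both versions return the same total, which is the point of refactoring.
    assert invoice_total_legacy(order, "wholesale") == invoice_total(order, "wholesale")
    print(invoice_total(order, "wholesale"))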

Many legacy applications can be wrapped in software containers to gain portability and scalability benefits. There are also many available options for outfitting old programs with application programming interfaces (APIs) that expose services and data for integration into more modern, cloud-based alternatives.
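As a rough illustration of the API-wrapping approach, the Python sketch below assumes a hypothetical legacy command-line job named acctsum that prints an account summary; the names and endpoint are invented, not drawn from any specific product. The wrapper exposes the job’s output over HTTP as JSON so newer systems can consume it without touching the legacy code, and the wrapper plus the legacy binary could then be packaged together in a single container image.

# A minimal sketch of API enablement for a legacy program, assuming a
# hypothetical batch executable called "acctsum" that prints an account
# summary to stdout.
import json
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

LEGACY_COMMAND = ["./acctsum"]  # hypothetical legacy executable

class LegacyWrapper(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/account-summary":
            self.send_error(404, "unknown resource")
            return
        # Run the legacy job and capture whatever it prints.
        result = subprocess.run(
            LEGACY_COMMAND, capture_output=True, text=True, timeout=30
        )
        body = json.dumps({
            "exit_code": result.returncode,
            "summary": result.stdout,
        }).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Serve on port 8080; a container image could package this wrapper
    # together with the legacy binary for portability.
    HTTPServer(("0.0.0.0", 8080), LegacyWrapper).serve_forever()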

Perhaps the biggest question organizations should ask themselves about modernization is whether they can afford not to do it. In markets brimming with cloud-native competition, that answer may be self-evident.