How Upgrading Legacy Systems Helps You Compete in the Cloud Age
The reasons for upgrading legacy systems increase with the launch of every new cloud startup. Many leaders at established businesses look on enviously these days as cloud-native competitors dazzle investors and customers with software that’s fast, scalable, and evolving by the day. How can a business saddled with 20-year-old legacy applications hope to compete?
Fortunately, by taking advantage of a growing portfolio of application modernization tools and tactics for upgrading legacy systems, businesses can usher legacy applications into the cloud age. Even very old applications can now be redeployed on cloud platforms and outfitted with modern innovations like application programming interfaces (APIs) and microservices. Additionally, organizations don’t have to rewrite their code from the ground up to gain the benefits of upgrading legacy systems.
Enterprise application modernization is getting a lot of attention from IT organizations right now. Research indicates that the global market will more than double to $24.8 billion by 2025, growing 17 percent annually in the process. However, upgrading legacy systems isn’t a point-in-time project. It’s best done over a long stretch following a thorough analysis of the organization’s entire application portfolio.
How to determine a legacy modernization strategy
The analysis stage is important because legacy applications often have interdependencies that aren’t evident on the surface. You should never modernize by starting with the oldest programs and working forward. Instead, start with the programs that can stand on their own and won’t inadvertently take down others if they’re changed. An emerging category of tools called enterprise complexity analysis can help by digging into the code and unearthing those dependencies. While the tools are expensive, they are often well worth it for companies with large legacy portfolios.
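The idea of picking safe starting points can be made concrete with a toy dependency map. This is only an illustration, not the output of any real analysis tool; the application names and the dependency structure are invented. An application that nothing else depends on can be modified without inadvertently breaking a downstream consumer:

```python
# Toy dependency map: each application lists the applications it calls.
# All names are invented for illustration.
deps = {
    "billing": ["ledger", "tax"],
    "ledger": [],
    "tax": [],
    "portal": ["billing"],
}

# An app is a safe first candidate for modernization if no other app
# depends on it: changing it can't take down an unexpected consumer.
depended_on = {callee for callees in deps.values() for callee in callees}
safe_first = sorted(app for app in deps if app not in depended_on)
# Here only "portal" has no dependents, so it is the safest place to start.
```

Real enterprise complexity analysis tools do far more (parsing code, tracing data flows, scoring risk), but the underlying question they answer is the same one this sketch poses.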
Upgrading legacy systems also isn’t an either/or proposition. Gartner defines seven types of modernization, and the most basic is to encapsulate data and functions to make them available as services via an API, a process that doesn’t require touching the code at all. More involved options include:
- redeploying the application to cloud infrastructure unchanged (“lifting and shifting”)
- migrating it to a new runtime platform (such as a software container)
- restructuring its existing code to optimize it
- replacing or rebuilding it from scratch
Even the most basic approaches can enable organizations to benefit from greater scalability and access new functions in the cloud.
5 candidates for legacy modernization
There is no one-size-fits-all strategy for upgrading legacy systems, and specific considerations will apply based on the type of software being modernized. Here are five of the most common candidates for modernization and some suggested strategies to support the updates.
1. Antiquated systems
Antiquated systems are typically those that the organization built from scratch back in the days of mainframes and COBOL. These are the most challenging applications to modernize, and doing so may not even be a good idea.
If the system works well and isn’t a liability to the organization, it’s often best to leave it alone. Instead, it may be useful to gather data and functions to be exposed as APIs while you look for more modern packaged alternatives.
2. Applications on commodity servers
Applications running on x86-based hardware under Linux, Windows, or Unix are strong candidates for virtualization, which allows your organization to scale resources as needed and make better use of available hardware. Virtualization has been shown to increase server utilization rates in typical data center settings from less than 20 percent to more than 70 percent. Virtualized applications can now also be shifted to the cloud easily, as all major public cloud providers support the most popular virtualization platforms.
An even better alternative is to encapsulate applications in software containers such as Docker. Containers wrap applications and their dependent components like system libraries and settings together in a single package that can be shared and stored in a repository. They’re highly portable and can be moved easily between on-premises infrastructure and all major clouds.
In both the virtualization and container scenarios, the application functionality isn’t disturbed, but the organization gains much greater flexibility in terms of the environment and allocated resources.
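As a sketch of the container route, a legacy service can often be wrapped with a short Dockerfile and no code changes. The jar name, port, config layout, and base image tag below are illustrative assumptions, not a prescription:

```dockerfile
# Minimal sketch: containerizing a legacy Java service unchanged.
# The jar name, port, and base image tag are illustrative assumptions.
# Match the base image to the runtime version the app was built against.
FROM eclipse-temurin:8-jre
WORKDIR /app
# Bundle the existing binary and its config files as-is.
COPY legacy-billing.jar .
COPY config/ ./config/
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "legacy-billing.jar"]
```

Once an image like this builds, the same artifact can run on a developer laptop, in an on-premises cluster, or on any major cloud’s container service.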
3. Business processes
These aren’t necessarily software, but they’re often encoded in applications that involve workflow automation and document management. This can be a liability if the processes themselves were never optimized, the business has changed, or technology has made it possible to improve them. Business process analysis is a methodology that helps organizations examine existing processes to find new efficiencies. Many tools are available to automate this work, and users can visualize business flows using drag-and-drop tools or receive automated advice on how to improve them.
Robotic process automation (RPA) is part of a rapidly growing category of tools Gartner calls “hyperautomation,” which works to automate routine human tasks, such as keying in data from printed documents. Other forms of hyperautomation include:
- low-code application development
- event brokers
- document capture software
- process discovery
- predictive decision modeling
All are useful in streamlining and automating processes.
4. API integration
This integration exposes application services and data in ways that can be consumed by other software. For example, the credit card approval process or resume scanning function within an existing application can be encapsulated and exposed as an API. Many tools are available to retrofit aging applications with APIs, enabling them to be modernized gradually over time. This is a relatively easy way to extend the life of old software while you write or install something more modern.
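A minimal sketch of the pattern, using only the Python standard library: a thin HTTP facade exposes a legacy routine as a JSON service without modifying the routine itself. The function name, endpoint, and approval rule are all hypothetical stand-ins:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def approve_credit(score: int) -> bool:
    """Stand-in for a legacy approval routine; in practice this might
    call out to an old client-server or mainframe system."""
    return score >= 650

class ApprovalAPI(BaseHTTPRequestHandler):
    """Thin HTTP facade exposing the legacy function as a JSON service."""
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps(
            {"approved": approve_credit(int(payload.get("score", 0)))}
        ).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example quiet

# To serve for real:
# HTTPServer(("127.0.0.1", 8080), ApprovalAPI).serve_forever()
```

The legacy logic stays untouched behind the facade; newer systems only ever see the JSON contract, which is exactly what lets the old implementation be replaced later without breaking its consumers.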
5. Enterprise service bus (ESB)
An ESB orchestrates the exchange of services between applications. It allows applications to subscribe to messages based on simple rules and provides a standard way for services to be distributed. ESBs have been around for a long time, but the concept has been rejuvenated as cloud computing has changed the way software is built.
Modern cloud applications consist of loosely coupled software functions that are assembled on the fly. An ESB enables developers to enhance and extend functionality by plugging in new services. For example, they can build a new user interface using cloud services and connect to a legacy back-end system for processing via APIs.
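The publish/subscribe routing at the heart of an ESB can be illustrated with a toy bus. This is not any particular ESB product; topic names and message fields are made up, and real buses add queuing, durability, and richer rules:

```python
from collections import defaultdict
from typing import Callable

class MiniBus:
    """Toy message bus: services subscribe to topics by simple rules
    (here, exact topic names) and receive matching messages."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]):
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict):
        for handler in self._subscribers[topic]:
            handler(message)

# A modern front end publishes an event; a legacy back end, wrapped
# in an adapter, consumes it without either side knowing the other.
bus = MiniBus()
received = []
bus.subscribe("orders.created", received.append)
bus.publish("orders.created", {"order_id": 42, "total": 99.95})
# "received" now holds the one routed message.
```

The point of the indirection is that the publisher and subscriber are coupled only to the topic, so either side can be modernized or replaced independently.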
Packaged for the future
Client-server software was all the rage in the 1990s, but it has been surpassed by software-as-a-service (SaaS) delivered through the cloud. This is a problem for companies that have invested significant resources into building applications based on these legacy platforms, particularly since many, such as Lotus Notes, used a proprietary data format and scripting language.
The good news is that the developers of many of these applications have done the hard work of modernizing them for the cloud while maintaining backward compatibility. In markets where there has been considerable consolidation, such as enterprise resource planning and customer relationship management, most acquiring companies have worked hard to provide customers with a migration path for these older applications.
Before shifting away from client-server software, it’s good to investigate whether better functionality can be found in a comparable cloud-native application. If the legacy data is in an SQL-compliant database, then data migration shouldn’t be a problem. If the client-server application was heavily customized, or if proprietary development tools were used, it’s best to seek the help of a specialty integrator or take the hit and rebuild those functions from scratch.
Heavily modified packaged applications are a stickier problem. Extensive customizations can block companies from upgrading to new versions of the same software, which can freeze them in time. Since most major applications are now available in the cloud, it may be possible to address this problem by recoding previous modifications using APIs to separate them from the core application. By doing so, businesses can benefit from enhancements to the base package and maintain their customizations.
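The separation being described can be sketched in a few lines. Here a hypothetical vendor function stands in for the core package’s public API; the point is that the customization calls the API rather than patching the vendor’s code, so upgrading the core leaves it intact:

```python
def core_price(item: str) -> float:
    """Hypothetical vendor 'core' function, reachable via its public API.
    In a real system this would be a call to the SaaS package's API."""
    prices = {"widget": 10.0}
    return prices[item]

def custom_price(item: str, loyalty_discount: float = 0.1) -> float:
    """Company-specific customization kept OUTSIDE the core: it layers a
    discount on top of the API result instead of modifying vendor code,
    so base-package upgrades don't break it."""
    return core_price(item) * (1 - loyalty_discount)
```

The same structure applies at any scale: customizations live in their own layer (or service) and depend only on the package’s published interface.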
DevOps, a set of practices that grew out of Agile development, is the overwhelming favorite approach for building applications in the cloud. That doesn’t mean legacy methodologies like the waterfall model don’t have value, but top-down structure and rigid processes don’t lend themselves to rapid functional evolution. A modernization initiative is a good opportunity to adopt agile techniques to complement or replace traditional methodologies. DevOps encourages close collaboration between developers and business users, rapid iteration with daily code releases, continuous feedback, and extensive use of services. Developers provision their own infrastructure and frequently deliver software pre-wrapped in containers for portability.
Innovation shouldn’t be shackled by legacy chains. Enterprise application modernization options are proliferating, and costs are coming down. There has never been a better time to revisit your organization’s software portfolio with the goal of bringing it into the cloud computing age.