November 24, 2023
By Balakrishnan Sreenivasan and Abhay K Patra
9 min read

Application modernization is the process of updating legacy applications with modern technologies to enhance performance and make them adaptable to evolving business needs by infusing cloud-native principles such as DevOps and infrastructure as code.

Treatment of a legacy application can range from a complete rewrite to a simple rehost, depending on its value, criticality and objectives. It is well understood that rewrites deliver the greatest benefit, because they offer the opportunity to reach a truly cloud-native model with a high degree of agility and speed. Yet many CIOs and CTOs hesitate to invest, given the cost and timelines involved in realizing value, and struggle to balance high-investment rewrite initiatives against low-value rehost approaches. Service providers and tooling vendors are addressing this space by building accelerators that can be customized for enterprise consumption and speed up specific areas of modernization; examples include Evolvware, IBM Consulting Cloud Accelerators and cloud service provider-specific tools.

As organizations attempt to accelerate modernization and optimize its cost, generative AI is becoming a critical enabler of change in how modernization programs are run. This article focuses on the possibilities generative AI opens up across the application modernization process.

Application modernization overview

Application modernization starts with an assessment of the current legacy applications, data and infrastructure, followed by applying the right modernization strategy (rehost, replatform, refactor or rebuild) to achieve the desired result. Rebuild delivers the maximum benefit but requires a high degree of investment, whereas rehost moves applications and data to the cloud as-is, without optimization; it requires less investment but yields less value. Modernized applications are deployed, monitored and maintained, with ongoing iterations to keep pace with technology and business advancements. Typical benefits include increased agility, cost-effectiveness and competitiveness, while challenges include complexity and resource demands. Many enterprises are realizing that moving to the cloud is not giving them the desired value, agility or speed beyond basic platform-level automation. The real problem lies in how IT is organized, which is reflected in how their current applications and services are built and managed (see Conway's law). This, in turn, leads to the following challenges:

  • Duplicative or overlapping capabilities offered by multiple IT systems and components create sticky dependencies and proliferation, which impact productivity and speed to market.
  • Duplicative capabilities across applications and channels give rise to duplicative IT resources (such as skills and infrastructure).
  • Duplicative capabilities (including data), which result in duplicated business rules, give rise to inconsistent customer experiences.
  • A lack of alignment between IT capabilities and business capabilities impacts time to market and business-IT alignment. In addition, enterprises end up building numerous band-aids and architectural layers to support new business initiatives and innovations.
  • Legacy technologies and monolithic architectures impact speed and agility, as well as security and compliance posture.

Hence, application modernization initiatives need to focus more on value to the business, and this involves a significant degree of transforming applications into components and services aligned to business capabilities. The biggest challenge is the amount of investment needed, and many CIOs and CTOs hesitate to invest given the cost and timelines involved in realizing value. Many are addressing this by building accelerators that can be customized for enterprise consumption to speed up specific areas of modernization; one such example from IBM is IBM Consulting Cloud Accelerators. As organizations attempt to accelerate modernization and optimize its cost, generative AI is becoming a critical enabler of change in how modernization programs are run. We will explore key areas of acceleration, with examples, in this article.

A simplified lifecycle of application modernization programs (not meant to be exhaustive) comprises discovery, planning, blueprint/design, build and test, and deployment. Discovery focuses on understanding the legacy applications, infrastructure, data, the interactions between applications, services and data, and other aspects such as security. Planning breaks the complex portfolio of applications down into iterations to establish an iterative modernization roadmap, along with an execution plan to implement that roadmap.

Blueprint/design phase activities vary with the modernization strategy, ranging from decomposing the application using domain-driven design to establishing a target architecture based on new technology and building executable designs. The subsequent phases are build and test, and deployment to production. Let us explore the generative AI possibilities across these lifecycle areas.

Discovery and design

The ability to understand legacy applications with minimal SME (subject matter expert) involvement is a critical acceleration point, because SMEs are generally busy keeping existing systems running and their knowledge may be limited by how long they have supported those systems. Collectively, discovery and design are where significant time is spent during modernization, whereas development becomes much easier once the team has decoded the legacy application's functionality, integration aspects, logic and data complexity.

Modernization teams perform code analysis and go through several documents (mostly dated); this is where their reliance on code analysis tools becomes important. Further, for rewrite initiatives, one needs to map functional capabilities to the legacy application context in order to perform effective domain-driven design and decomposition exercises. Generative AI becomes very handy here through its ability to correlate domain and functional capabilities to code and data, establishing a business capabilities view connected to application code and data; of course, the models need to be tuned and contextualized for a given enterprise's domain model or functional capability map. The generative AI-assisted API mapping described later in this article is a mini exemplar of this. While the above supports application decomposition and design, event storming needs process maps, and this is where generative AI assists in contextualizing and mapping extracts from process mining tools. Generative AI also helps generate use cases based on code insights and functional mapping. Overall, generative AI helps de-risk modernization programs by ensuring adequate visibility into legacy applications and their dependencies.
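To make this concrete, here is a minimal sketch of how such code-to-capability correlation might be orchestrated. The llm_complete() helper, the capability map and the legacy_src folder are hypothetical placeholders for illustration only, not references to any specific product or to the accelerators described in this article.

```python
# Sketch: ask an LLM which business capabilities a legacy module implements.
# llm_complete() is a hypothetical placeholder for an enterprise-approved model call.
from pathlib import Path

def llm_complete(prompt: str) -> str:
    raise NotImplementedError("Wire this to your approved LLM endpoint")

# Hypothetical, heavily simplified enterprise capability map.
CAPABILITY_MAP = ["Customer Onboarding", "Payments Execution",
                  "Account Servicing", "Fraud Detection"]

def map_module_to_capabilities(source_file: Path) -> str:
    """Summarize a legacy module against the enterprise capability map."""
    code = source_file.read_text(errors="ignore")[:8000]  # stay within context limits
    prompt = (
        "You are assisting an application modernization assessment.\n"
        f"Business capability map: {', '.join(CAPABILITY_MAP)}\n"
        "For the legacy source below, list the capabilities it implements, the key "
        "business rules you can identify, and its external data and API dependencies.\n\n"
        + code
    )
    return llm_complete(prompt)

if __name__ == "__main__":
    for path in Path("legacy_src").rglob("*.cbl"):  # for example, COBOL modules
        print(path, "->", map_module_to_capabilities(path))
```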

Generative AI also helps generate target designs for a specific cloud service provider's framework by tuning the models on a set of standardized patterns (ingress/egress, application services, data services, composite patterns and so on). Likewise, there are several other generative AI use cases, including generating target technology framework-specific code patterns for security controls. Generative AI also helps generate detailed design specifications, for example, user stories, user experience wireframes, API specifications (such as Swagger files), component relationship diagrams and component interaction diagrams.

Planning

One of the most difficult tasks in a modernization program is establishing a macro roadmap while balancing parallel efforts against sequential dependencies and identifying the coexistence scenarios to be addressed. While this is normally done as a one-time task, continuous realignment through Program Increment (PI) planning exercises that incorporate execution-level inputs is far more difficult. Generative AI comes in handy here: it can generate roadmaps by applying historical data (application-to-domain-area maps, effort and complexity factors, dependency patterns and more) to the applications in scope of a modernization program for a given industry or domain.

The only way to address this is to make such planning consumable through a suite of assets and accelerators that can handle enterprise complexity. This is where generative AI plays a significant role in correlating application portfolio details with the discovered dependencies.

Build and test

Generating code is one of the most widely known generative AI use cases, but it is important to be able to generate a whole set of related code artifacts: infrastructure as code (Terraform or CloudFormation templates), pipeline code and configurations, embedded security design points (encryption, IAM integrations and so on), application code generated from Swagger files or other code insights (from the legacy estate), and firewall configurations (for example, as resource files based on the services instantiated). Generative AI helps generate each of the above through an orchestrated approach based on predefined application reference architectures built from patterns, while combining outputs from design tools, as the sketch below illustrates.
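A minimal sketch of what such an orchestrated generation loop might look like, assuming a hypothetical llm_complete() helper and a hypothetical pattern catalog; real accelerators combine this with design-tool outputs and curated reference architectures:

```python
# Sketch: orchestrate generation of related artifacts (IaC, pipeline, firewall config)
# from a predefined pattern catalog entry, so the artifacts stay mutually consistent.
# llm_complete() is a hypothetical placeholder for an enterprise-approved model call.
def llm_complete(prompt: str) -> str:
    raise NotImplementedError("Wire this to your approved LLM endpoint")

# Hypothetical pattern catalog entry for one application service.
PATTERN = {
    "name": "java-api-service",
    "cloud": "aws",
    "artifacts": {
        "terraform": "Terraform for a container service behind a load balancer, with "
                     "KMS encryption and least-privilege IAM roles",
        "pipeline": "CI/CD workflow that builds, scans and deploys the container image",
        "firewall": "Security group resource file exposing only port 443",
    },
}

def generate_artifacts(pattern: dict, service_name: str) -> dict:
    """Generate each related artifact from the same pattern definition."""
    outputs = {}
    for artifact, description in pattern["artifacts"].items():
        prompt = (
            f"Target cloud: {pattern['cloud']}. Service: {service_name}.\n"
            f"Generate the {artifact} content: {description}.\n"
            "Follow the enterprise reference architecture and output only the file content."
        )
        outputs[artifact] = llm_complete(prompt)
    return outputs

# Example usage (the returned strings would be written out as main.tf, pipeline.yml, etc.):
# artifacts = generate_artifacts(PATTERN, "payments-orchestrator")
```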

Testing is another key area: generative AI can generate the right set of test cases and test code, along with test data, to optimize the set of test cases being executed; a sketch of spec-driven test generation follows.
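As an illustration, the sketch below derives test cases per endpoint from a Swagger/OpenAPI file; it assumes the same kind of hypothetical llm_complete() helper, and test-data constraints and test execution are omitted.

```python
# Sketch: generate pytest cases per endpoint from an OpenAPI (Swagger) specification.
# llm_complete() is a hypothetical placeholder for an enterprise-approved model call.
# pip install pyyaml
import yaml

def llm_complete(prompt: str) -> str:
    raise NotImplementedError("Wire this to your approved LLM endpoint")

def generate_tests(openapi_path: str) -> dict:
    with open(openapi_path) as f:
        spec = yaml.safe_load(f)
    tests = {}
    for path, methods in spec.get("paths", {}).items():
        for method, details in methods.items():
            prompt = (
                f"Write pytest test cases (happy path, validation errors, auth failure) "
                f"for {method.upper()} {path}.\n"
                f"Operation summary: {details.get('summary', '')}\n"
                f"Expected responses: {list(details.get('responses', {}).keys())}\n"
                "Include realistic test data and use the requests library."
            )
            tests[f"{method.upper()} {path}"] = llm_complete(prompt)
    return tests

# Example usage:
# test_code = generate_tests("payments-api.yaml")
```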

Deploy

Numerous critical "last mile" activities in modernization programs typically consume days to weeks, depending on the complexity of the enterprise. An essential generative AI use case is the ability to derive insights for security validation by analyzing application and platform logs, design points, infrastructure as code and more, which greatly expedites the security review and approval processes. Furthermore, generative AI helps generate inputs for configuration management (CMDB) and change management, drawing from release notes generated from the agile tool work items completed in each release.
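As an illustration of the security-validation use case above, a sketch of how log- and IaC-driven insight generation might be wired up, again assuming a hypothetical llm_complete() helper and hypothetical file names:

```python
# Sketch: summarize security-relevant findings from platform logs and IaC for review.
# llm_complete() is a hypothetical placeholder for an enterprise-approved model call.
from pathlib import Path

def llm_complete(prompt: str) -> str:
    raise NotImplementedError("Wire this to your approved LLM endpoint")

def security_review_summary(log_file: str, terraform_dir: str) -> str:
    logs = Path(log_file).read_text(errors="ignore")[-10000:]  # most recent entries
    iac = "\n".join(p.read_text() for p in Path(terraform_dir).glob("*.tf"))
    prompt = (
        "Act as a security reviewer for a cloud deployment.\n"
        "From the platform logs and Terraform below, list potential control gaps "
        "(encryption, IAM scope, network exposure, logging) with supporting evidence, "
        "formatted as input for the security review board.\n\n"
        f"--- LOGS ---\n{logs}\n\n--- TERRAFORM ---\n{iac}"
    )
    return llm_complete(prompt)

# Example usage:
# print(security_review_summary("platform.log", "./infra"))
```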

While the aforementioned use cases hold immense promise throughout the modernization journey, it's crucial to acknowledge that enterprise complexities demand a contextually orchestrated approach to leverage many of these generative AI accelerators effectively. The development of enterprise-specific contextual patterns is an ongoing effort to accelerate modernization programs. We have observed substantial benefits from investing time and effort upfront and continuously customizing these generative AI accelerators to align with specific patterns that exhibit repeatability within the enterprise.

Let us now examine two proven examples:

Example 1: Re-imagining API Discovery with BIAN and AI for visibility of domain mapping and identification of duplicative API services

The problem: A large global bank has more than 30,000 APIs (both internal and external) developed over time across various domains (for example, retail banking, wholesale banking, open banking and corporate banking). There is a huge potential for duplicate APIs across the domains, leading to a higher total cost of ownership for maintaining the large API portfolio and operational challenges in dealing with API duplication and overlap. A lack of visibility and discoverability of the APIs leads API development teams to build the same or similar APIs rather than find relevant APIs for reuse. The inability to visualize the API portfolio from a banking industry model perspective prevents the business and IT teams from understanding which capabilities are already available and which new capabilities are needed by the bank.

Generative AI-based solution approach: The solution leverages the BERT large language model, a sentence transformer, a multiple negatives ranking loss function and domain rules, fine-tuned with BIAN Service Landscape knowledge, to learn the bank's API portfolio and provide the ability to discover APIs with automatic mapping to BIAN. It maps each API endpoint method to level 4 of the BIAN Service Landscape hierarchy, that is, to BIAN service operations.

The core functions of the solution (illustrated in the sketch after the following list) are the ability to:

  • Ingest Swagger specifications and other API documentation to understand the APIs, endpoints, operations and associated descriptions.
  • Ingest BIAN details and understand the BIAN Service Landscape.
  • Fine-tune the model with matched and unmatched mappings between API endpoint methods and the BIAN Service Landscape.
  • Provide a visual representation of the mappings and matching scores, with BIAN hierarchical navigation and filters for BIAN levels, API category and matching score.
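A minimal sketch of the matching core, assuming the open-source sentence-transformers library; the model name, training pairs and BIAN service operation labels below are illustrative stand-ins rather than the bank's actual model, fine-tuning data or domain rules:

```python
# Sketch: fine-tune a sentence transformer with Multiple Negatives Ranking Loss
# and map API endpoint descriptions to BIAN service operations by cosine similarity.
# pip install sentence-transformers
from sentence_transformers import SentenceTransformer, InputExample, losses, util
from torch.utils.data import DataLoader

model = SentenceTransformer("all-MiniLM-L6-v2")  # stand-in for the BERT-based model

# Hypothetical positive pairs: (API endpoint description, BIAN service operation).
train_pairs = [
    InputExample(texts=["POST /payments initiates a credit transfer",
                        "Payment Order - Initiate"]),
    InputExample(texts=["GET /accounts/{id}/balance returns current balance",
                        "Current Account - Retrieve Balance"]),
]
loader = DataLoader(train_pairs, shuffle=True, batch_size=2)
loss = losses.MultipleNegativesRankingLoss(model)
model.fit(train_objectives=[(loader, loss)], epochs=1, warmup_steps=0)

# Map new endpoints to the closest BIAN service operation.
bian_ops = ["Payment Order - Initiate", "Current Account - Retrieve Balance",
            "Customer Offer - Evaluate"]
bian_emb = model.encode(bian_ops, convert_to_tensor=True)

def map_endpoint(description: str):
    emb = model.encode(description, convert_to_tensor=True)
    scores = util.cos_sim(emb, bian_emb)[0]
    best = scores.argmax().item()
    return bian_ops[best], float(scores[best])

print(map_endpoint("POST /transfers sends money to another account"))
```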

Figure: User interface for API discovery with the industry model.

Key benefits: The solution helped developers easily find reusable APIs based on BIAN business domains, with multiple filter and search options to locate APIs. In addition, teams were able to identify the key API categories for building the right operational resilience. The next revision of the search will be based on natural language and will support a conversational use case.

The ability to identify duplicative APIs based on BIAN service domains helped establish a modernization strategy that addresses duplicative capabilities while rationalizing them.

This use case was realized within six to eight weeks, whereas the bank would have taken about a year to achieve the same result otherwise (as there were several thousand APIs to be discovered).

Example 2: Automated modernization of MuleSoft API to Java Spring Boot API

The problem: While the teams were already on a journey to modernize MuleSoft APIs to Java Spring Boot, the sheer volume of APIs, the lack of documentation and the complexity involved were impacting their speed.

Generative AI-based solution approach: The MuleSoft API to Java Spring Boot modernization was significantly automated via a generative AI-based accelerator we built. We began by establishing a deep understanding of the APIs, components and API logic, and then finalized response structures and code. We then built prompts using IBM's version of Sidekick AI to generate Spring Boot code that satisfies the MuleSoft API specifications, along with unit test cases, a design document and a user interface.

Mule API components were fed into the tool one by one using prompts, and the corresponding Spring Boot equivalents were generated and subsequently wired together, with any errors that surfaced being addressed along the way. The accelerator also generated a UI for the desired channel that could be integrated with the APIs, unit test cases and test data, and design documentation. The generated design documentation consists of sequence and class diagrams, request and response structures, endpoint details, error codes and architecture considerations. The sketch below illustrates the general prompt-assembly approach.
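Sidekick AI is an IBM tool whose interfaces are not reproduced here; the following is only a generic sketch of the prompt-assembly pattern, assuming a hypothetical llm_complete() helper and a simplified Mule flow XML file.

```python
# Sketch: parse a (simplified) Mule flow definition and assemble a prompt asking an
# LLM to produce the equivalent Spring Boot controller plus JUnit tests.
# llm_complete() is a hypothetical placeholder for an enterprise-approved model call.
import xml.etree.ElementTree as ET
from pathlib import Path

def llm_complete(prompt: str) -> str:
    raise NotImplementedError("Wire this to your approved LLM endpoint")

def mule_flow_to_prompt(flow_xml_path: str) -> str:
    xml_text = Path(flow_xml_path).read_text()
    root = ET.fromstring(xml_text)
    # Collect flow names and HTTP listener paths. Real Mule XML is namespaced,
    # so match on local tag names for simplicity.
    summary = []
    for el in root.iter():
        tag = el.tag.split("}")[-1]
        if tag == "flow":
            summary.append(f"flow: {el.get('name', 'unnamed')}")
        elif tag == "listener":
            summary.append(f"HTTP path: {el.get('path', '/')}")
    return (
        "Convert the following MuleSoft flow into a Java Spring Boot REST controller.\n"
        "Preserve endpoint paths, request/response structures and error handling, and "
        "generate matching JUnit 5 unit tests.\n"
        f"Flow summary: {summary}\n\n"
        f"Full flow XML:\n{xml_text}"
    )

# Example usage:
# spring_boot_code = llm_complete(mule_flow_to_prompt("payments-flow.xml"))
```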

Key benefits: Sidekick AI augments application consultants' daily work by pairing a multi-model generative AI technical strategy with deep domain and technology knowledge. The key benefits are as follows:

  • Generates most of the Spring Boot code and test cases, which are optimized, clean and adhere to best practices; repeatability is key.
  • Eases integration of the APIs with channel front-end layers.
  • Makes the generated code easy for developers to understand, with enough insight to debug it.

The accelerator PoC covered four different scenarios of code migration, along with unit test cases, design documentation and UI generation, and was completed in three sprints over six weeks.

Conclusion

Many CIOs and CTOs have expressed reservations when contemplating modernization initiatives, citing the challenges outlined at the outset. These include concerns about the extensive SME involvement required, potential disruptions to the business due to change, and the necessity for alterations in the operating model across various organizational functions, including security and change management. While it's important to acknowledge that generative AI is not a one-size-fits-all solution for these complex challenges, it undeniably contributes to the success of modernization programs. It achieves this by accelerating the process, reducing the overall cost of modernization and, most importantly, mitigating risk by ensuring that no critical functionality is overlooked.

However, it's essential to recognize that introducing large language models (LLMs) and related libraries into the enterprise environment entails a significant commitment of time and effort, including rigorous security and compliance reviews and scanning procedures. Moreover, improving the quality of the data used for fine-tuning these models is a focused effort that shouldn't be underestimated. While cohesive generative AI-driven modernization accelerators have not yet become ubiquitous, it's expected that, with time, integrated toolkits will emerge to facilitate the acceleration of specific modernization patterns, if not an array of them.

Explore what you can do with generative AI with IBM Consulting.
