Extracting UML from Legacy Applications

Modeling existing business processes to extend their value

The operations of many large organizations rest on large applications characterized as "legacy." To increase flexibility or reduce costs, businesses are looking to modernize these applications, for instance via renovation, introduction of an SOA architecture, or even re-implementation in a new environment. No matter which approach is taken, it's important to salvage as much knowledge and logic as possible from the legacy application. Unless the application's function is obsolete, recovering functional knowledge (what does the application do?) and structural knowledge (how does it do it?) can accelerate the modernization effort.

A parallel can be drawn with renovating a building, since modernization can involve gradual changes to the building's internal structure, say, larger doors, or complete demolition and reconstruction. In both cases blueprints of the building are required. These blueprints form a shared basis of knowledge between the architect and the developer that's necessary for planning and execution. Just as blueprints are necessary to determine how a building can be adapted to suit a new need, they are also necessary to determine how to adjust an application. However, since legacy applications have, by their nature, been developed over long periods of time and, in most cases, by many people, such blueprints don't exist. They have to be recovered before the modernization process can start.

Many legacy analysis tools have emerged to address this situation. They attempt to reveal the internal, implicit architecture and business function of a legacy application in a fashion people can understand. Discovering and then expressing the business function at an abstract level that discards incidental technical details is a difficult task. However, this is precisely what is necessary, since modernization projects are interested in re-implementing business functions, not particular technical solutions.

The tension between expressing business functions in a comprehensible fashion and expressing them with enough detail and precision has to be addressed. This requires a language in which business knowledge can be practicably expressed.

UML as a Solution
UML has emerged as the most successful non-proprietary object modeling and specification language. UML includes a standardized graphical notation that can be used to create an abstract model of a system, that is, the UML model. UML can precisely and understandably describe an application at a high level of abstraction, hiding the implementation details to reveal the actual business functions. Furthermore, as a formal language, UML has a well-defined syntax that makes it suitable for both forward and reverse engineering. In forward engineering, a UML model can be used to generate actual code (e.g., Java code). Most commercially available UML tools are capable of forward engineering (to various degrees), and most possess some reverse engineering features. The most common reverse engineering activity is extracting class diagrams from Java code. So far, however, reverse engineering is limited to modern programming languages; comparable capabilities are lacking for older technologies like COBOL.
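To illustrate the kind of raw material reverse engineering works from, the sketch below uses Java reflection to list a class's attributes and operations, exactly the information a tool would render as a UML class diagram box. The `Account` class and the `describe` helper are illustrative inventions, not part of any particular tool.

```java
import java.lang.reflect.Field;
import java.lang.reflect.Method;

// A small domain class standing in for application code to be reverse engineered.
class Account {
    private String number;
    private double balance;

    public void deposit(double amount) { balance += amount; }
    public double getBalance() { return balance; }
}

public class ClassDiagramSketch {
    // Collect the attributes and operations of a class -- the raw
    // material a UML tool would draw as a class diagram box.
    public static String describe(Class<?> c) {
        StringBuilder sb = new StringBuilder("class " + c.getSimpleName() + "\n");
        for (Field f : c.getDeclaredFields())
            sb.append("  attribute: ").append(f.getType().getSimpleName())
              .append(" ").append(f.getName()).append("\n");
        for (Method m : c.getDeclaredMethods())
            sb.append("  operation: ").append(m.getName()).append("()\n");
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.print(describe(Account.class));
    }
}
```

A real tool goes much further (associations, multiplicities, inheritance), but the principle is the same: the structure is already in the code and only needs to be surfaced at the right level of abstraction.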

The Benefits of UML vis-à-vis Legacy
The ability to reverse engineer legacy applications to UML would offer substantial benefits:

  • Preservation of Knowledge at an Abstract Level
    In many instances it may be necessary to re-implement an application in a new and modern technical environment. However, because the application represents years of organizational learning it contains functionality that must be preserved. The information must be described at a high enough level of abstraction to avoid unnecessary technical details. UML offers the right level of abstraction so that the author can retain as much detail as necessary for future implementation.
  • Communication Mechanism
    Distance, language, and cultural issues can impair the communication of information about business functions between the legacy maintenance team and the re-implementation team. UML offers developers a common language that promotes precision and comprehensibility.
  • Vendor Independence
    The universal acceptance and adoption of UML also offers the advantage of choosing from a large number of UML tool vendors. One team can select Rational Rose to create the models while an outsourced re-implementation team can use Borland to view and reuse the same models. Users can even switch between tools and still preserve the knowledge encapsulated in the models.
  • Forward Engineering
    While the challenge to be discussed is reverse engineering legacy applications into UML, we should recall that many UML tools offer forward engineering capabilities.

UML can thus serve as the intermediate stage through which an application rewrite passes: reverse engineer the legacy application into a UML model, then forward engineer the model into the new implementation.

"People Reuse"
The standard method of acquiring business process knowledge through user interviews involves a major commitment of time and new resources. The results may be incomplete and error-prone. Alternatively, extracting knowledge directly from the current application circumvents these concerns while continuing to rely on the resources currently involved in application maintenance.

Limitations
There's no silver bullet for reverse engineering legacy to UML. In fact, one may notice that some concepts simply don't match. Furthermore, some important information about the use of the legacy application isn't captured in the code itself, making automatic extraction impossible. For instance:

Actors
In UML, an actor is a user of the system; "user" can mean a human user, a machine, or even another system that interacts with the system from outside its boundaries. Because of this definition, information about actors can't, in most cases, be found in the code.

Requirements
While the current system may implement business requirements, the requirements themselves rarely appear explicitly in the code; they exist only implicitly, as behavior that fulfills them.

Navigation & Sequence
Certain sequences of operations may not be explicitly specified in the application. So, while a user knows that to open a new account, he or she must perform activities A, B, and C in this precise order, the application may allow other paths that aren't meaningful from a business perspective.

We can therefore recognize that any UML description of a legacy application can't be achieved through completely automatic reverse engineering. While a legacy analysis tool may expose the artifacts of the application, only a human can assemble them into meaningful UML diagrams.

A Balanced Approach
We have shown that a totally automated approach isn't feasible. At the other extreme, a completely manual approach has two primary disadvantages:

Economics
Over time applications tend to be modified to such a degree that neither the initial plans, nor the current documentation reflects the reality of the application. Knowledge must be acquired from the code itself, but to manually review a multimillion-line application would be far too burdensome financially to be a realistic option.

Completeness
As a legacy application is modified and enhanced over the years, users often lose a complete understanding of how the application functions. For example, in a pension system the rules for computing the pension can be spread through numerous government and corporate policy documents. This knowledge is already in the code, which is more complete, precise, and concise than what would come from user interviews. Moreover, the application stakeholders are likely to insist that nothing be lost from the current functionality.

The best balance between fully manual and fully automatic can be called "tool assisted." In this approach, a software tool may be able to:

  1. Parse legacy code and show its information in a convenient manner. Convenience is key since the selected tool would have to filter out a great deal of unnecessary detail while assembling the application information that is actually needed.
  2. Allow the user to select relevant legacy artifacts and quickly derive UML entities. For example, the tool may show a list of COBOL structures that, when clicked on, create a class with the data members derived from the fields of the structure.
  3. Let the user quickly assemble UML diagrams based on derived entities. Further to point 2, once two classes are created the user should be able to indicate a relationship that would appear in a class diagram.
  4. Export data in standard XMI notation. Doing so ensures the user ends up not just with attractive diagrams but with useful models that can be refined in UML tools and used for the forward generation of code.

These features may involve various degrees of automation. Automation applies wherever the extraction or derivation is clear and algorithmic; the remaining steps require human intervention to give meaning to the resulting models.
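Step 2 above can be made concrete with a sketch. The COBOL structure in the comment and the derived class are hypothetical, invented for illustration; a real tool would map each elementary field of the structure to a data member, roughly as follows:

```java
// Hypothetical COBOL structure a legacy analysis tool might present:
//
//   01 CUSTOMER-RECORD.
//      05 CUST-ID       PIC 9(6).
//      05 CUST-NAME     PIC X(30).
//      05 CUST-BALANCE  PIC 9(7)V99.
//
// A class the tool could derive, mapping each field to a data member.
// (For real monetary amounts, BigDecimal would be safer than double.)
public class CustomerRecord {
    private int custId;         // from CUST-ID, PIC 9(6)
    private String custName;    // from CUST-NAME, PIC X(30)
    private double custBalance; // from CUST-BALANCE, PIC 9(7)V99

    public CustomerRecord(int custId, String custName, double custBalance) {
        this.custId = custId;
        this.custName = custName;
        this.custBalance = custBalance;
    }

    public int getCustId() { return custId; }
    public String getCustName() { return custName; }
    public double getCustBalance() { return custBalance; }
}
```

The derivation itself (field-to-member mapping, picture-clause-to-type mapping) is clear and algorithmic, so a tool can automate it; deciding which structures are business-relevant, and how the resulting classes relate, remains the human's job.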

What UML Diagrams Can Be Extracted?
We have now identified the approaches that yield the maximum benefit, along with their drawbacks. So let's look at the specific information that can be extracted from a legacy application. These possibilities should be thought of as a starting point, since more automation will likely arise as UML extraction tools grow in sophistication.


More Stories By Richard Soley

As chairman and CEO of OMG and executive director of the SOA Consortium, Dr. Richard Soley is responsible for the vision and direction of the world's largest consortium of its type. He joined the nascent OMG as technical director in 1989, leading the development of OMG's world-leading standardization process and the original CORBA specification. In 1996, he led the effort to move into vertical market standards and modeling, leading first to the Unified Modeling Language and later the Model-Driven Architecture. Previously, Dr. Soley was a cofounder and former Chairman/CEO of A. I. Architects, Inc., maker of the 386 HummingBoard and other PC and workstation hardware and software. He holds a BS, MS, and PhD in computer science and engineering from MIT.

More Stories By Mike Oara

Mike Oara is CTO of Relativity Technologies.


