by: Mike Waas

Application Modernization: The Good, the Bad, and the Ugly


Enterprise IT leaders face enormous challenges in modernizing their database and data warehouse applications as enterprises move to the cloud and on-premises databases are replaced with cloud-native systems. Database application modernization is a logistical nightmare because it means modifying potentially thousands of existing applications, each posing its own specific challenges.

The benefits of cloud databases in terms of competitive advantage, better economics, and ease of use are widely recognized by now; however, modernizing database applications to support new cloud databases too often yields only limited benefits and leaves the applications compromised in the process.

As entire industries ready themselves to move to the cloud, database application modernization is becoming a hot-button issue every IT leader will have to face in the near future, bearing in mind that conventional techniques are time-consuming, costly, and extremely risky.

This article discusses the following topics:

  • Understanding the Difficulty of Application Modernization
  • Application Modernization is Inherently Counter-Productive
  • Adaptive Data Virtualization: An Effective Alternative Approach

Understanding the Difficulty of Application Modernization

Application modernization, the process of making an application work with a database different from the one it was originally written or configured for, is really a misnomer: instead of modernizing and improving functionality, application modernization is primarily about changing applications to work around functional discrepancies between the old and new databases.

New and emerging technology is often limited in functionality at first when compared with the more mature, older technology it seeks to replace, a pattern well documented in the tech industry and described in detail by Harvard Business School professor Clayton Christensen in his book The Innovator’s Dilemma. Cloud databases are no exception: as disrupters of a decades-old industry, they offer new dimensions of benefits and advantages, such as scalability and ease of use, but have yet to reach full functional parity with long-established, on-premises systems.


Consequently, database modernization projects actually burden applications with additional workarounds that inadvertently increase complexity and leave substantial room for error. To illustrate how risky and harmful conventional migrations can be, this article examines scenarios of varying degrees of difficulty in modernizing SQL queries and categorizes them into three groups:

  • The good
  • The bad
  • The ugly

Each group is discussed in detail in the following subsections.

The Good in Database Application Modernization

A large group of syntactic modifications falls into this category, and because they are easy to spot, IT practitioners are quick to offer rewrites. Most prominently, this category includes discrepancies in keywords, such as abbreviations, and workarounds for admittedly elegant shortcuts the new database does not, or does not yet, offer.

Workarounds for this type of issue are typically straightforward and pose limited risk. Even though these discrepancies might appear at first sight to be purely textual differences, few can be handled with text-manipulation tools alone, such as regular expressions, the programmer’s proverbial Swiss army knife.

However, what all queries in this category have in common is that the adjustments needed are local in nature; that is, they can be made without a holistic understanding of the query.
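To make the category concrete, the following sketch patches one such keyword discrepancy textually. Teradata, for instance, accepts SEL as an abbreviation for SELECT; the rule table and function name here are hypothetical, and the sketch deliberately ignores string literals and comments, which are exactly where purely textual tools break down.

```python
import re

# Hypothetical rewrite rules: expand Teradata-style keyword abbreviations
# for a target dialect that does not accept them. Word boundaries keep
# identifiers such as "seller" intact.
ABBREVIATIONS = [
    (r"\bSEL\b", "SELECT"),
    (r"\bDEL\b", "DELETE"),
]

def expand_abbreviations(sql: str) -> str:
    """Apply each rule as a plain textual substitution."""
    for pattern, replacement in ABBREVIATIONS:
        sql = re.sub(pattern, replacement, sql, flags=re.IGNORECASE)
    return sql

print(expand_abbreviations("SEL name FROM customers"))
# Caveat: a literal such as WHERE note = 'SEL' would be rewritten too,
# which is why few cases can be handled by text manipulation alone.
```

Note how the change is entirely local: no understanding of the rest of the query is required.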

The Bad in Database Application Modernization

Queries in this category use powerful, often non-standard constructs that require elaborate rewrites and true subject matter expertise when adjusting them for cloud databases. Examples in this class include proprietary idioms like Teradata’s QUALIFY, or extensions that pre-date standards and therefore deviate from the language definition of virtually every other SQL dialect. One subtlety that has proven particularly error-prone is the system-specific interpretation of standard clauses, such as the ordering of NULL values relative to other data; the same query may return different results when executed on the on-premises database and on its cloud counterpart.

In contrast to the application queries in the good category, these queries require a full and detailed semantic understanding of all their components, and, unlike in the previous category, the changes are no longer locally contained but usually require a complete restructuring of the original query.
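As an illustration, consider Teradata’s QUALIFY, which filters on a window function directly. A portable rewrite hoists the window function into a derived table and filters in an outer query, restructuring the whole statement. The sketch below runs the rewritten form against SQLite, which merely stands in for a target system without QUALIFY; the table and data are invented for the example.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sales (region TEXT, amount INT);
    INSERT INTO sales VALUES ('EMEA', 100), ('EMEA', 200), ('APAC', 50);
""")

# Teradata original (rejected by most other dialects):
#   SELECT region, amount
#   FROM sales
#   QUALIFY ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) = 1;
#
# Portable rewrite: the window function moves into a derived table and
# the QUALIFY predicate becomes an outer WHERE clause.
rows = con.execute("""
    SELECT region, amount
    FROM (
        SELECT region, amount,
               ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
        FROM sales
    ) AS ranked
    WHERE rn = 1
""").fetchall()
print(sorted(rows))  # [('APAC', 50), ('EMEA', 200)]
```

Even in this small case, the predicate and the window function end up in different query blocks, which is what makes such rewrites non-local.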

The Ugly in Database Application Modernization

Finally, an even more complex class of database application queries comprises those that contain advanced, and often proprietary, control-flow features such as recursive queries or stored procedures. If the cloud database does not provide primitives of similar expressivity, a manual rewrite requires more than an adjustment of the SQL text: the application code in which the SQL is embedded must be modified to compensate for the lack of expressiveness.

In this case, even the surrounding application code is exposed to the risk of introducing software defects. The problem is usually compounded by the fact that the original application developers may no longer be with the company and code that has not been touched in years needs to be modified.
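A sketch of what such compensation can look like: suppose the original system answered “all reports under a manager” with a recursive query, and the (hypothetical) target lacks recursive CTEs. The recursion then has to be re-implemented in application code, level by level. The schema and function below are invented for illustration and assume the hierarchy is acyclic.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE org (emp TEXT, mgr TEXT);
    INSERT INTO org VALUES ('bob', 'alice'), ('carol', 'alice'), ('dave', 'bob');
""")

# The original database answered this with a single recursive statement
# (e.g. WITH RECURSIVE). Without that primitive, the application loops:
def all_reports(con, root):
    """Breadth-first expansion of the management hierarchy in app code."""
    frontier, result = [root], []
    while frontier:
        placeholders = ",".join("?" * len(frontier))
        rows = con.execute(
            f"SELECT emp FROM org WHERE mgr IN ({placeholders})", frontier
        ).fetchall()
        frontier = [emp for (emp,) in rows]
        result.extend(frontier)
    return result

print(sorted(all_reports(con, "alice")))  # ['bob', 'carol', 'dave']
```

Logic that used to live in one SQL statement is now spread across application code, precisely the kind of change that exposes surrounding code to new defects.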

Application Modernization is Inherently Counter-Productive

The problems described in the previous sections reflect the current state of the art in an industry that is bracing for a tsunami of enterprise data management re-platforming initiatives over the next decade, a consequence of the global movement to the cloud. At the core of the problem is the notion of modifying database applications to make them work with a new database, executing the modifications as quickly as possible and in a way that causes the least disturbance to the business. In this light, the term modernization seems like a euphemism: applications are mangled, and IT professionals are forced to cut corners given the vast number of applications that must be transformed in each large re-platforming initiative.

This approach can be referred to as static, as it attempts to change applications in a stand-alone manner, independent of application behavior at run time. Besides being inherently insufficient and negatively impacting the business, static application modernization has the following significant shortcomings:

  • Static modernization adapts existing business processes to the current version of the target database and ignores the fact that cloud databases are undergoing drastic and very rapid development. Changes introduced to deal with any current limitations of the database will need to be revisited at a later point, effectively making another modernization necessary.
  • Defects, inevitably introduced in any such large-scale project, proliferate insidiously in static modernizations as templates and idioms are copied widely across all affected applications. The risk is further exacerbated by the large number of personnel involved, which requires heavyweight processes and often hampers the enforcement of best practices and quality-control mechanisms.
  • Enterprises effectively trade the vendor lock-in they have been seeking to escape, in some cases for decades, for yet another vendor lock-in. This is done in the hope that the new target platform will prove to be the right choice and no further migration will be needed, which, given the rapid market dynamics in the cloud, can hardly be considered a sound risk-mitigation strategy.

The risk, time, and cost incurred by static modernization make the overall approach unattractive. Moreover, the manual nature of the approach appears out of touch with the ever-increasing level of automation in all other areas of IT.

An effective alternative to these challenges is Adaptive Data Virtualization, a category-defining virtualization technology that promises to eliminate the need for conventional migration procedures.

Adaptive Data Virtualization: An Effective Alternative Approach

Adaptive Data Virtualization (ADV) reverses the underlying principle: instead of modifying a database or data warehouse application, which is bound to be a losing proposition sooner or later, ADV intercepts and translates the communication between application and database in real time. Applications written or configured for on-premises databases can run instantly on any cloud database. For the enterprise, this means database applications remain intact, years and decades of investment in application development are preserved, and IT leaders can implement a cloud strategy rapidly and reap the benefits of a specific cloud in short order.
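The principle can be sketched in a few lines: a shim sits between application and database and rewrites each statement as it is executed, so the application text never changes. The class, the rule format, and the single rewrite rule below are all hypothetical, and a real ADV layer operates at the wire-protocol level rather than inside the application process.

```python
import re
import sqlite3

class RewritingConnection:
    """Intercept statements and translate them before the database sees them."""

    def __init__(self, con, rules):
        self._con = con
        self._rules = rules  # list of (pattern, replacement) pairs

    def execute(self, sql, params=()):
        for pattern, replacement in self._rules:
            sql = re.sub(pattern, replacement, sql, flags=re.IGNORECASE)
        return self._con.execute(sql, params)

# The unmodified application issues a Teradata-style statement; the shim
# expands the SEL abbreviation on the fly for the SQLite stand-in target.
con = RewritingConnection(
    sqlite3.connect(":memory:"),
    rules=[(r"\bSEL\b", "SELECT")],
)
print(con.execute("SEL 1 + 1").fetchone())  # (2,)
```

The application code stays untouched; only the statement in flight is adapted to the target system.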

With ADV, IT leaders have extremely powerful technology on their hands to mitigate and reduce risk to both the business and their own careers.

Learn more about adopting cloud databases without rewriting applications in the white paper Re-Platforming Data Warehouses – Without Costly Migration of Applications.

About Mike Waas, CEO

Mike Waas founded Datometry with the vision of redefining enterprise data management. In the past, Mike held key engineering positions at Microsoft, Amazon, Greenplum, EMC, and Pivotal. He earned an M.S. in Computer Science from the University of Passau, Germany, and a Ph.D. in Computer Science from the University of Amsterdam, The Netherlands. Mike has co-authored over 35 peer-reviewed publications and has 20+ patents on data management to his name.