3 Reasons why data virtualization might not solve your data problem


12.03.20

Data virtualization is a curious concept that comes up now and then in customer conversations, often in the form of “Is [insert name of data virtualization company] a competitor of yours?” Clearly, there’s a lot of confusion out there. First off, it’s safe to say that whatever the company name is, it’s not a competitor of Datometry. But more importantly, Datometry is not a data virtualization system.

But how did we get here? As an IT leader, one of the most challenging problems you’re tasked with in 2021 is moving critical data assets to the public cloud. In particular, you need to find a new home for the data warehouse. Getting the data warehouse to the public cloud is synonymous with innovation. Staying on your existing system means falling behind your competition.

Unfortunately, there’s a general misconception that data virtualization might somehow help in this migration. That belief speaks more to a desperate need and to the complexity of the problem than to any actual solution. The conventional approach of database migration delivers such a terrible ROI that almost anything else looks like a better option.

In this article, we look at some of the misconceptions in this space and the top 3 reasons why data virtualization will not solve your problem of moving to the cloud.

#1: Virtualizing the data is not enough — Virtualize the system

In order to move to the public cloud, abstracting the systems you’re currently using is the first step. After all, your plan is to get away from the current technology. And virtualization is the epitome of abstraction.

To begin with, the concept of data virtualization is a bit of a misnomer. After all, your data is the real thing. Even if it is a digital asset, you are not trying to replace or somehow simulate the data. With regulations like GDPR, the authenticity of data and the chain of custody are more important than ever. Remember, what you really want to replace is the system that manages your data.

Data virtualization software is trying to be something very different from a system abstraction, though. It aims at being a universal conduit for data access. It promises to connect and integrate all your existing systems. To do so, data virtualization systems implement their very own database architecture. It’s an overlay database on top of your existing databases. Think of it as an uber-database.
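To make the overlay idea concrete, here is a minimal sketch of the kind of federated query such an uber-database accepts. It is illustrative only: the catalog and table names are invented, and the query is written in the overlay’s own dialect against its own catalog, not in the dialect of any one underlying system.

```python
# Hedged illustration of the "uber-database" concept: a data virtualization
# layer exposes catalogs over your existing systems, so one query can join
# them, e.g. a Teradata table with a PostgreSQL table. All names are made up.
federated_sql = """
SELECT o.customer_id, c.segment, SUM(o.order_total) AS revenue
FROM teradata_prod.sales.orders AS o
JOIN postgres_crm.public.customers AS c
  ON o.customer_id = c.customer_id
GROUP BY o.customer_id, c.segment;
"""
```

Note that this query runs in the overlay’s own engine, and every application must phrase its statements this way for the overlay to add value.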

In contrast, Datometry virtualizes the database system. We provide your applications with a virtual database system implemented by new cloud data warehouse technology. For instance, Datometry lets you use a true cloud data warehouse as if it were a Teradata appliance. That is, you still get the same functionality while moving to your preferred cloud data warehouse, without any Teradata in the picture anymore.
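To see the difference from the application’s side, consider the following minimal sketch in Python. Everything the application does stays as it was: the standard teradatasql driver, the Teradata-dialect SQL, the code itself. Only the connection endpoint changes. The hostname, credentials, and table name below are hypothetical.

```python
# Minimal sketch of system virtualization from the application's viewpoint.
# The application keeps its existing Teradata driver and SQL; only the host
# changes from the appliance to a (hypothetical) virtualization endpoint.
import teradatasql  # the standard Teradata Python driver, unchanged

with teradatasql.connect(
    host="virtualization.example.com",  # layer in front of the cloud warehouse
    user="app_user",                    # placeholder credentials
    password="app_password",
) as con:
    with con.cursor() as cur:
        # Teradata-dialect SQL, executed as-is; the layer translates it
        # for the cloud data warehouse behind the scenes.
        cur.execute("SELECT TOP 10 * FROM sales.orders ORDER BY order_date DESC")
        for row in cur.fetchall():
            print(row)
```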

#2: Enforcing a new SQL dialect does not eliminate vendor lock-in

Ask any database customer what they consider the single most restricting aspect of their current database, and you typically get one answer: vendor lock-in. Existing applications are “locked-in” to the database for which they were originally developed. These business-critical applications represent years of investment.

For data virtualization to unlock any value, all applications must first speak the same language. They must be migrated to the same system. This means you are looking at a massive migration project. Data virtualization forces the rewriting of all applications to make them work on the new platform.
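To gauge the size of such a rewrite, here is a hedged sketch of what converting a single statement can look like. The first query uses Teradata-specific features (the SEL abbreviation and the QUALIFY clause); the second is a hand-rewritten, common-dialect equivalent of the kind a virtualization platform would demand. Table and column names are made up for the example.

```python
# Original Teradata-dialect statement: SEL and QUALIFY are Teradata-specific.
teradata_sql = """
SEL customer_id, order_total
FROM sales.orders
QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id
                           ORDER BY order_total DESC) = 1;
"""

# Hand-rewritten to a lowest-common-denominator dialect: SEL becomes SELECT,
# and QUALIFY becomes a wrapping subquery with an explicit row-number filter.
common_dialect_sql = """
SELECT customer_id, order_total
FROM (
    SELECT customer_id, order_total,
           ROW_NUMBER() OVER (PARTITION BY customer_id
                              ORDER BY order_total DESC) AS rn
    FROM sales.orders
) AS t
WHERE rn = 1;
"""
```

Now multiply that effort by every statement in every report, ETL job, and application in your estate.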

But it doesn’t end here. All new projects must go through the data virtualization platform, too. This may be viable when an entire architecture is created from scratch, but in practice it is rarely an option for large IT organizations.

In contrast, Datometry makes existing applications work as-is with new cloud data warehouse technology. The very idea behind Datometry is to spare you the drama and chaos of a database migration. You break the lock-in of your legacy vendor with little effort. And you don’t trade it for a new one.

Better still, Datometry does not keep you from tapping the true value of the cloud platform. New application development can happen outside of Datometry. This gives you the best of both worlds. Bring your existing applications while avoiding disruptions to your business. All the while, position yourself for a new era of application development. Your line-of-business users will appreciate the former and get excited about the latter.

#3: Standardizing on a common denominator kills competitive advantage

Finally, there’s a natural limitation to data virtualization. In order to be universally plug-and-play, data virtualization systems must ensure you’re using functionality that’s common to all your existing platforms. Let that sink in. Data virtualization is only as good as the most limiting system you are using today.

Since database systems differ in functionality, this has the potential to stifle your innovation severely. You cannot use a database’s powerful features unless they are also implemented in your data virtualization system. That is effectively impossible, not only because of the lag in adjusting the software accordingly but also because of inherent incompatibilities between platforms.

In contrast, Datometry lets you use a cloud data warehouse in both ways: first, as if it were Teradata, and second, natively as the very cloud data warehouse it is. All commonly used Teradata functionality is available to you instantly, even on a cloud data warehouse that lacks, say, macros or stored procedures. In short, you get to use the full array of 40+ years of Teradata functionality.
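As a concrete illustration of that gap, consider Teradata macros. The hedged sketch below, with the SQL held in Python strings and all object names invented, shows a typical macro and its invocation. A warehouse without a macro construct cannot run either statement natively; this is exactly the kind of functionality a system virtualization layer has to supply on behalf of the target.

```python
# A typical Teradata macro: parameterized, reusable, and invoked with EXEC.
# Several cloud data warehouses have no native equivalent. Names are invented.
macro_definition = """
CREATE MACRO sales.monthly_report (report_month DATE) AS (
    SELECT region, SUM(order_total) AS revenue
    FROM sales.orders
    WHERE order_date BETWEEN :report_month
                         AND ADD_MONTHS(:report_month, 1) - 1
    GROUP BY region;
);
"""

# The application keeps invoking it with ordinary Teradata syntax:
macro_call = "EXEC sales.monthly_report (DATE '2020-11-01');"
```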

At the same time, you can build new apps on the cloud data warehouse directly and use highly specialized features for your new apps. No need to worry about breaking any abstractions or anybody holding you back.

Datometry — a new kind of virtualization

IT leaders are looking to abstract their systems in preparation for going to public cloud. Datometry does just that: it virtualizes the existing data warehouse. Sure, this new kind of virtualization is technically challenging. A trail of research papers and patents is a testament to the effort that went into its development.

However, nothing is as gratifying as a customer’s incredulous reaction when they see it in action. Customers like to give us their most challenging workloads in POCs. The confidence and trust they develop when they see it work are priceless.

Are you looking to liberate yourself from an appliance, or do you want more from your cloud than just a database-in-the-cloud? Datometry and its capability of virtualizing the system, not the data, can unlock a whole range of opportunities.

Contact your cloud provider to see if you qualify for an assessment. Get an in-depth analysis of literally every single statement you’re running on your Teradata system today. See how it maps to your cloud data warehouse, and start your journey into the cloud.


About Mike Waas

CEO

Mike Waas founded Datometry with the vision of redefining enterprise data management. In the past, Mike held key engineering positions at Microsoft, Amazon, Greenplum, EMC, and Pivotal. He earned an M.S. in Computer Science from the University of Passau, Germany, and a Ph.D. in Computer Science from the University of Amsterdam, The Netherlands. Mike has co-authored over 35 peer-reviewed publications and has 20+ patents on data management to his name.
