The few unicorns on its fringes, such as Cloudera, are more about augmenting the market than taking actual market share away from the incumbents. My acquaintance was also quick to offer his explanation: “It’s simply too hard.” And while he’s not wrong, there’s much more to it than that.
Building a database is hard…
Yes, building a database is difficult, especially if the aspiration is to build a database as good as any given commercially successful incumbent in the market, such as Microsoft SQL Server or Oracle. What makes it so hard are the large feature surface and a couple of intrinsically difficult problems, like query optimization or elements of distributed systems. Hence, challengers have typically gunned for a specific niche, exploiting what I’d call Stonebraker’s Theorem: there is no one-size-fits-all database, and thus any incumbent can be beaten with a special-purpose database if the use case is chosen carefully enough. All newcomers in the database market are doing just that—and usually quite successfully so, as Stonebraker would be quick to point out.
…but making your customers adopt your technology is much harder
Then why aren’t the niche players carving up the database market and eating sizeable chunks? In fact, even the most successful database startups remain confined to less than 0.1% of the total addressable market. The reason for this “limited” success is not the technical difficulty, but rather the unwillingness of enterprises to embrace new database technology. This resistance is based on the very real risk inherent in adopting new database technology, and that risk is concentrated in migrating mission-critical applications to a new platform. Just putting a new database into the corner of the data center isn’t a big deal, but committing your business’s core logic to it? Now that is a big deal.
The lack of virtualization technology
So, what’s needed to break this stalemate? The past decade of IT transformation offers an interesting lesson. Underlying almost all major disruptions in enterprise IT in recent years has been a single concept: virtualization. Tight coupling between components makes for a sheltered space in which vendor lock-in can flourish, be it through proprietary protocols, programming languages or instruction sets. Virtualization has proven to be an effective antidote, the trustbuster that cracked open the entire IT stack. Well, almost the entire stack: databases and the applications that use them remain powerful strongholds of vendor lock-in and have very effectively defied disruption. And as long as there is no convincing virtualization concept that separates applications from databases, this market will resist its disrupters, kill off innovation, and remain the business of a few large players. This ain’t no space for unicorns—just yet.
And now this: Adaptive Data Virtualization
Enter Datometry. Datometry is at the forefront of pioneering exactly this missing piece of virtualization technology. We call it Adaptive Data Virtualization—the idea of bringing to the database arena the same level of mobility, flexibility and simplicity that other parts of the IT stack already enjoy. The result is risk-free adoption of new technology—at a fraction of the cost of conventional database migrations. And while we can’t guarantee that this will make database unicorns spring up all over the place, we can assure you that customers have long been waiting for this new-found independence!
Let us know what you think about the need to decouple applications and databases, and which systems you’d like to see liberated most!