Building Database Technology is Hard
It is true that building a database is difficult, especially if the aspiration is to build a database as good as any commercially successful incumbent in the market, such as Microsoft SQL Server or Oracle. What makes the development of a new database so hard is the combination of a large feature surface and a few intrinsically difficult problems, such as query optimization and elements of distributed systems. Hence, new database companies have typically headed for a specific niche, exploiting what can be called Stonebraker’s Theorem: there is no one-size-fits-all database, and thus any incumbent can be beaten with a special-purpose database if the use case is chosen carefully enough. All newcomers in the database market are doing just that—and usually quite successfully so, as Stonebraker would be quick to point out.
Adopting New Data Warehouse Technology is Extremely Hard
If the argument in the previous paragraph is true, then the following question arises: why aren’t the niche players carving up the database market and eating sizable chunks of it? In fact, even the most successful database startups remain confined to less than 0.1% of the total addressable market. The reason for this limited success is not technical difficulty, but rather the unwillingness of enterprises to embrace new database technology. This resistance is rooted in the very real risk inherent in taking on new data management technology, and that risk is concentrated in migrating mission-critical applications to a new platform. Putting a new database into a corner of the data center isn’t a big deal, but committing the business’s core logic to a new database is.
Role of Virtualization Technology in IT Transformation
The past decade of IT transformation offers an interesting lesson. Underlying almost all major disruptions in enterprise IT has been a single concept: virtualization. Tight coupling between components makes for a sheltered space in which vendor lock-in can flourish, whether through proprietary protocols, programming languages, or instruction sets. Virtualization has proven to be an effective antidote, the trustbuster that cracked open almost the entire IT stack. The only part of the IT stack that remains a powerful stronghold of vendor lock-in is the database and the applications that use it.
As long as there is no convincing virtualization concept that separates applications from databases, the market will resist its disrupters, kill off innovation, and remain the business of a few large players. This is no space for unicorns—just yet.
Adaptive Data Virtualization Technology
Datometry is at the forefront of pioneering the missing piece of database virtualization, called Adaptive Data Warehouse Virtualization, with the goal of bringing to enterprise data management the same mobility, flexibility, and simplicity that other parts of the IT stack already enjoy.
The Datometry platform allows existing applications to run natively on alternate data warehouses, without rewriting or reconfiguring the applications, in weeks rather than years. The result is risk-free adoption of new enterprise data management technology—at a fraction of the cost of conventional database migrations. This has enormous impact on the adoption of cloud database technology and fundamentally alters the dynamics of the database market. For enterprises, Datometry is the simplest and fastest path to the cloud; for cloud service providers and database vendors, Datometry is the ultimate accelerator of adoption of their technology.