Datometry Hyper-Q

Adaptive Data Virtualization™ technology enables enterprises to run their existing applications on modern cloud data warehouses, without rewriting or reconfiguring them.

Why Hyper-Q

Datometry Hyper-Q lets enterprises adopt new cloud databases rapidly, control ongoing operating expenses, and build out analytic capabilities for faster digital transformation.

Simplified Approach

The conventional database migration approach of rewriting applications is not required

Decreased Risk

Eliminate the risks associated with rewriting and testing millions of application queries

Preserve Business Investment

Protect long-standing investments in the development of mission-critical business logic

Accelerated Time to Value

Realize BI savings swiftly by avoiding the expense of rewriting and testing applications

The Hyper-Q Solution

Datometry Hyper-Q virtualization software allows any existing applications to run on any cloud database, making applications and databases interoperable. Enterprises can now adopt the cloud database of choice, without having to rip, rewrite and replace applications.

Enables runtime application compatibility with Transformation and Emulation of legacy data warehouse functions.

Connects to any cloud data warehouse of choice – Azure SQL DW, Azure SQL DB, AWS Redshift, PostgreSQL, Google BigQuery, and Snowflake.

Deploys transparently on AWS, Azure, and hybrid cloud. Applications can use existing JDBC, ODBC and Native connectors without changes.

The Datometry Advantage

Datometry Hyper-Q lets enterprises run and manage existing applications on cloud databases, whether on a single database, across multiple databases, or between different cloud platforms, in weeks rather than years.

Frequently Asked Questions

Datometry® Hyper-Q™ enables enterprises to re-platform existing database applications to cloud data warehouses instantly and without costly migrations.

Datometry Hyper-Q is a SaaS offering that enables applications originally written for a specific database to run natively on a cloud database. Hyper-Q is the company’s core product that enables complete replatforming without the tedious, costly, and risk-laden rewriting of applications.
Hyper-Q intercepts the application's communication on the wire, speaking the database's native network protocol, and returns bit-identical results to database applications. The solution translates query statements and results in real time. The application can continue using the original driver and original SQL syntax. In this regard, the original application is perfectly preserved and operates natively with the new database just as it did with the original database.
The overhead of Hyper-Q is negligible because it only translates statements and delegates execution to the new database. In a benchmark run, the Hyper-Q overhead was found to be less than 2%. Because Hyper-Q does not execute queries itself the way a database does, its overhead during query execution stays very low. In some cases, Hyper-Q even improves efficiency by batching multiple singleton inserts into bulk operations, making the application better suited to the cloud data warehouse.
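As a rough sketch of the batching idea (illustrative only, not Datometry's implementation, which operates at the protocol level), consecutive single-row INSERTs into the same table can be folded into one multi-row INSERT:

```python
import re

def batch_inserts(stmts):
    """Fold single-row INSERTs into one multi-row INSERT per table.
    Simplistic string matching for illustration; a real implementation
    parses statements properly and respects ordering constraints."""
    pat = re.compile(r"INSERT INTO (\w+)\s+VALUES\s*\((.+)\)\s*;?\s*$",
                     re.IGNORECASE)
    groups = {}  # table name -> list of "(v1, v2, ...)" value tuples
    for stmt in stmts:
        m = pat.match(stmt)
        if not m:
            raise ValueError(f"unrecognized statement: {stmt}")
        groups.setdefault(m.group(1), []).append(f"({m.group(2)})")
    return [f"INSERT INTO {t} VALUES {', '.join(vs)}"
            for t, vs in groups.items()]

print(batch_inserts([
    "INSERT INTO sales VALUES (1, 9.99)",
    "INSERT INTO sales VALUES (2, 4.50)",
]))
# -> ['INSERT INTO sales VALUES (1, 9.99), (2, 4.50)']
```

One bulk statement means one round trip and one load operation instead of many, which is exactly what cloud warehouses are optimized for.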
The execution time depends primarily on the new data warehouse. This gives customers the flexibility to control performance and execution directly through the scaling primitives of the new data warehouse.
The only change needed is in the configuration to use the IP Address/DNS name of Hyper-Q instead of connecting directly to the database. Applications will be pointed to Hyper-Q as the database host and the applications will run as they do today. No changes are required for the application logic or SQL code.
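The redirection amounts to a one-line configuration change. A minimal sketch, using hypothetical hostnames and a generic ODBC-style connection string (the exact keywords depend on your driver):

```python
# Hypothetical endpoints; only the server field changes when pointing the
# application at Hyper-Q. Credentials, driver, and all SQL stay the same.
DIRECT_HOST = "legacy-dw.example.com"   # original warehouse host (assumed)
HYPERQ_HOST = "hyperq.example.com"      # Hyper-Q endpoint (assumed)

conn_str = f"DRIVER={{Teradata}};DBCNAME={DIRECT_HOST};UID=app;PWD=secret"

# Repoint the application: swap the host, leave everything else untouched.
conn_str = conn_str.replace(DIRECT_HOST, HYPERQ_HOST)
print(conn_str)
```

Because only the endpoint changes, the same edit works whether the application connects via ODBC, JDBC, or a native driver.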
The solution is application agnostic: it supports workloads emitted by BI, ETL, and reporting tools, custom tools and utilities, home-grown scripts, ad-hoc queries, and so forth. Hyper-Q operates at the connector level and supports ODBC, JDBC, and native legacy connections.
Hyper-Q integrates with standard IT security infrastructure including Active Directory.
Hyper-Q translation is simple keyword replacement from legacy/source SQL to the new/target SQL; for example, the SEL keyword used by some legacy data warehouses can be replaced directly with SELECT to achieve the same result and keep the query fully compatible with the target data warehouse.
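A minimal sketch of that kind of keyword translation (illustrative only; Hyper-Q's actual translator works on a full parse of the statement rather than raw text):

```python
import re

# Legacy Teradata-style abbreviations and their ANSI equivalents
# (a tiny illustrative subset).
KEYWORD_MAP = {"SEL": "SELECT", "INS": "INSERT", "DEL": "DELETE"}

def translate(sql):
    """Replace whole-word legacy keywords with their ANSI spellings.
    Word boundaries keep identifiers like 'selector' untouched, but a
    purely textual pass like this cannot distinguish a keyword from a
    column that happens to share its name."""
    for legacy, ansi in KEYWORD_MAP.items():
        sql = re.sub(rf"\b{legacy}\b", ansi, sql, flags=re.IGNORECASE)
    return sql

print(translate("SEL name FROM customers"))  # SELECT name FROM customers
```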
Some SQL features, such as QUALIFY, RANK, and MERGE, are not fully supported on every cloud data warehouse today. Hyper-Q transforms these keywords in legacy/source SQL statements, using the SQL features and keywords available on the new/target cloud data warehouse to achieve the same results. This transformation is a core function of Hyper-Q that enables replatforming without rewriting applications.
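As an illustration of the kind of transformation involved (a hypothetical sketch, not Hyper-Q's rewriter), a Teradata-style QUALIFY filter can be expressed on a warehouse without QUALIFY by nesting the window function in a subquery and filtering in an outer WHERE clause:

```python
import re

def rewrite_qualify(sql):
    """Rewrite `SELECT cols FROM t QUALIFY <window_expr> = 1` into an
    equivalent subquery with a WHERE filter. Handles only this one shape;
    a real transformer works on a full parse tree."""
    m = re.match(r"SELECT\s+(.*?)\s+FROM\s+(\S+)\s+QUALIFY\s+(.*?)\s*=\s*1\s*$",
                 sql, flags=re.IGNORECASE | re.DOTALL)
    if not m:
        return sql  # pass through anything we do not recognize
    cols, table, window_expr = m.groups()
    inner = f"SELECT {cols}, {window_expr} AS _rn FROM {table}"
    return f"SELECT {cols} FROM ({inner}) q WHERE _rn = 1"

src = ("SELECT id, ts FROM events "
       "QUALIFY ROW_NUMBER() OVER (PARTITION BY id ORDER BY ts DESC) = 1")
print(rewrite_qualify(src))
```

The rewritten query returns the same rows as the original QUALIFY form while using only widely supported ANSI constructs.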
Some features—stored procedures, macros, updatable views and recursive queries—do not exist on the cloud data warehouses today. Hyper-Q provides full emulation of such features by using the available constructs on the cloud data warehouse. Stored procedures are typically not compute intensive and reside on the database to keep the data local to the database for efficient processing. Hyper-Q will break out the SQL statements within the stored procedure and execute the statements on the cloud data warehouse using temp tables. Customers who have experienced an application rewrite know that emulation is a fairly complex process. Since Hyper-Q handles emulation for the complex features, application migration is not a challenge and the business logic remains intact.  
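To make the break-out idea concrete, here is a heavily simplified sketch in which the statements extracted from a procedure body run one by one against the target, with a temp table carrying intermediate state (sqlite3 stands in for the cloud warehouse; control flow, parameters, and error handling are omitted):

```python
import sqlite3

# Statements that might be broken out of a legacy stored-procedure body.
statements = [
    "CREATE TEMP TABLE staging AS SELECT 1 AS id, 100 AS amount",
    "UPDATE staging SET amount = amount * 2",
    "SELECT amount FROM staging",
]

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
result = None
for stmt in statements:  # execute the broken-out statements in order
    cur.execute(stmt)
    if stmt.lstrip().upper().startswith("SELECT"):
        result = cur.fetchall()
print(result)  # [(200,)]
```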
Many complex features, for example macros, stored procedures, and updatable views, require emulation, which is why a one-time conversion or translation of queries is not a viable option for application migration.
The very value proposition underlying Hyper-Q means POCs can be devised and executed with next to no effort for the prospect. Yet, the results are comprehensive as large workloads can be analyzed and executed rapidly.
Datometry strives to ensure its customers are successful. Therefore, Datometry continuously extends its already extensive coverage of supported source systems. The design of Hyper-Q's powerful framework makes adding features based on new customer requirements a fast process.
Datometry provides a comprehensive error logging and handling mechanism. In the rare case of encountering an unsupported command or statement, the logs can be used directly to initiate a support ticket.
Yes, Datometry partners with a number of database vendors, including Microsoft, Amazon (Redshift), and Pivotal, and is also certified for their technologies. The Datometry team is interested in working with other database vendors or business intelligence systems: email the Datometry team at info@datometry.com to learn more.

All query traffic passing through Datometry is automatically logged and traced to allow for auditing and managing workloads. The Hyper-Q logs can be exported to an external log aggregation system. This is a powerful value-add out-of-the-box for customers.
Datometry uses the RBAC mechanism of the target database to control which objects are accessible to users. This avoids having to maintain an external layer of permissions and configuration.
Yes, Datometry currently encrypts the messages exchanged during the authentication handshake between the application and Hyper-Q. This can be extended to allow full encryption of the communication between the application and the database.
Datometry Adaptive Data Virtualization™ (ADV) is a layer that abstracts the database from the application without requiring an application rewrite. With ADV, the time to value is immediate and the move to the cloud is relatively risk-free, since the existing business logic is preserved.

The Logical Data Warehouse (LDW) is mainly used to aggregate data from different database systems, and it requires an application rewrite to gain unified data access across those systems. The LDW does not simplify the database infrastructure; it extends the life of older databases while requiring an application rewrite to use the LDW interface/API. Customers are averse to changing databases when replatforming requires a change or rewrite of business logic.

The LDW does not solve the fundamental challenge of the application rewrite, nor does it provide a straight path to legacy warehouse replacement, since complex features must be re-architected into the application during a rewrite. To a large extent, the LDW can be viewed as complementary to ADV.

Contact the Datometry Team at info@datometry.com for a demo and learn more about BI modernization.

Supported Systems



Amazon Redshift


Azure SQL Data Warehouse


Pivotal Greenplum

On Premises

Microsoft SQL Server

Pivotal Greenplum