How database virtualization works
Delphix connects nondisruptively to databases -- the ubiquitous repositories for enterprise data -- and loads a compressed copy of the data into the Delphix Engine, typically reducing the data to about one-third of its original size. Inside the engine, the Delphix file system (DxFS) compresses data blocks within database files and filters out empty or temporary blocks, minimizing the data footprint.
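Conceptually, that ingestion step can be sketched as "skip the empty blocks, compress the rest." The function below is an illustrative sketch only -- the block size and the all-zero test for empty blocks are assumptions for the example, not how DxFS is actually implemented:

```python
import zlib

BLOCK_SIZE = 8192  # a common database block size (assumption for illustration)

def ingest_blocks(blocks):
    """Store each non-empty block compressed; record empty blocks as holes.

    An all-zero block is treated as "empty" here and stored as None,
    so it consumes no data at all -- a stand-in for DxFS filtering out
    empty or temporary blocks during ingestion.
    """
    stored = []
    for block in blocks:
        if block == b"\x00" * len(block):
            stored.append(None)               # empty block: keep no data
        else:
            stored.append(zlib.compress(block))  # compress everything else
    return stored
```

Because database files contain many repeated and zeroed regions, filtering plus block-level compression is what yields the large reduction in footprint before any copies are even shared.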
After the initial data seeding, Delphix maintains synchronization by collecting changes and tracking all versions for as long as required (weeks or months). From any point in time, Delphix can open one or more virtual databases (VDBs) that can be used for development and other lifecycle environments.
For an average application, businesses maintain more than seven lifecycle environments for development, testing, QA, integration, training, pilots, operational reporting, production support, user acceptance, system validation, and sandboxes -- not to mention redundant systems for backup, DR, and archiving logs. Instead of making and moving data copies over and over again, DxFS provides a virtualized view of databases by sharing the underlying data blocks across all environments and storing changes as new, unique blocks.
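The block-sharing idea above amounts to copy-on-write over a shared pool: every environment holds only a private map of block references, and a write lands in a new, unique block without disturbing anyone else's view. The sketch below is hypothetical -- the class names (`BlockPool`, `VirtualDB`) and the content-addressed layout are illustrative assumptions, not Delphix's implementation:

```python
import hashlib

class BlockPool:
    """Shared, content-addressed pool of data blocks (illustrative only)."""
    def __init__(self):
        self.blocks = {}

    def put(self, data):
        digest = hashlib.sha256(data).hexdigest()
        self.blocks[digest] = data   # identical blocks are stored only once
        return digest

class VirtualDB:
    """A VDB modeled as a private block map over the shared pool."""
    def __init__(self, pool, block_map=None):
        self.pool = pool
        self.block_map = dict(block_map or {})

    def clone(self):
        # Provisioning a new environment copies only metadata (the map),
        # never the underlying data blocks.
        return VirtualDB(self.pool, self.block_map)

    def write(self, block_no, data):
        # Copy-on-write: a change becomes a new, unique block in the pool.
        self.block_map[block_no] = self.pool.put(data)

    def read(self, block_no):
        return self.pool.blocks[self.block_map[block_no]]
```

This is why seven or more environments can share one physical footprint: cloning is a metadata operation, and storage grows only with the unique changes each environment makes.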
VDBs look and perform like normal, physical copies (users can add/drop tables, make schema changes, and run reports against the data), but include powerful features designed to accelerate application projects, such as virtual branches and fast data rollback or refresh.
Virtualizing databases fundamentally changes application testing and quality. In order to minimize cost and complexity, many organizations test their applications using stale data that may be days or months old, or use data subsets (nonrepresentative datasets) that can fail to exercise the full range of potential errors. With fast, automatic refresh and virtual databases that provide full, representative datasets, Delphix can dramatically improve the fidelity of QA and test environments.
Data version control
Developers have long used source code version control to track changes and work in parallel streams. Application projects that run on databases must pair release versions with corresponding datasets -- databases with the correct schemas and tables to enable application software to function properly. Databases, however, have traditionally been complex, slow, and hard to set up and maintain, which often forces application teams to settle for stale, partial, or shared data environments.
The Delphix Engine includes a second key technology component: the DataVisor, which provides efficient data synchronization (even across the WAN), full transactional consistency, integrated log shipping, and continuous versioning. With the DataVisor, a Delphix Engine can maintain synchronization with multiple source applications in near real time and automatically record and version all changes.
With a simple time slider, Delphix can quickly deliver a virtual version of a database at any point in time, down to the second or a specific transaction boundary. Instead of teams waiting weeks for data delivery before a test can even run in QA, Delphix can reduce overall cycle times from weeks to hours, enabling faster testing and error detection.
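One common way to support that kind of time-slider lookup is to keep periodic snapshots plus a change log, then provision any second by starting from the nearest earlier snapshot and replaying logged changes up to the requested time. The sketch below is an assumption about the general technique, not the DataVisor's actual mechanism:

```python
import bisect

class Timeline:
    """Hypothetical point-in-time provisioning: snapshots + change log."""
    def __init__(self):
        self.snapshots = []   # (timestamp, state) kept in timestamp order
        self.log = []         # (timestamp, key, value) change records

    def snapshot(self, ts, state):
        self.snapshots.append((ts, dict(state)))

    def record_change(self, ts, key, value):
        self.log.append((ts, key, value))

    def provision(self, ts):
        # Find the latest snapshot at or before the requested time...
        times = [s for s, _ in self.snapshots]
        i = bisect.bisect_right(times, ts) - 1
        base_ts, base = self.snapshots[i]
        state = dict(base)
        # ...then replay logged changes up to (and including) that second.
        for cts, key, value in self.log:
            if base_ts < cts <= ts:
                state[key] = value
        return state
```

The same lookup-and-replay structure also explains the later use cases: rolling back to a prior stage, pinning versions to software releases, or recovering a dropped object are all just provisions at different points on the timeline.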
Continuous versioning solves additional key challenges for enterprise applications. Many applications, such as SAP, require data federation across multiple databases for data consistency; with the DataVisor, Delphix can deliver multiple databases synchronized to the same point in time with a few clicks and within minutes.
Fine-grained version control allows developers to reset a database to perform multiple comparative tests (A/B tests), create a library of retained versions that coincide with software releases, and quickly roll back to a previous stage during complex data conversion and mapping cycles.
Data changes constantly in databases, making it impractical to know ahead of time when a specific version will be required. With the DataVisor, Delphix automatically records all versions. If a DBA accidentally drops a table, the table can be recovered in minutes from the most recent version available, minimizing downtime, data loss, and productivity or revenue impact.