"If your data does not fit in main memory now, wait a year or two and it probably will," said Michael Stonebraker, a pioneer in database development who is chief technology officer at VoltDB, the company overseeing his latest database of the same name. "An increasing fraction of the general database market will be main-memory deployable over time."
Stonebraker's opinion is echoed by others in the business.
"If you have a bit of a performance pain, and you have the budget to pay for a market leading, general purpose, relational database management system anyway, then it makes sense to manage the data in RAM," said Monash Research head analyst Curt Monash.
Stonebraker concedes the approach won't work for every database. But with today's memory prices, keeping a 100GB or even a 1TB database entirely in main memory is not prohibitively expensive, depending on the level of responsiveness required and the amount of money an organization is willing to spend.
"It is still too early to call it commodity, but in-memory is becoming commodity in a way," said Nicola Morini Bianzino, managing director for the SAP Platform Solutions Group at Accenture.
Bianzino said he has seen a shift in the questions he gets from clients and potential clients about in-memory over the past six months. "The questions are shifting from 'What is it?' to 'How do I do it?'" Bianzino said.
"The message has gone to the market and has been assimilated by clients," Bianzino said. "This doesn't mean they will move everything tomorrow, but they are taking it for granted that they will have to move in that direction."
With SQL Server 2014, Microsoft's approach to in-memory is to bundle it into its standard relational database platform. "We're not talking about an expensive add-on, or buying new hardware or a high-end appliance," Kelly said.
The SQL Server in-memory component can be used for online transaction processing, business intelligence and data warehousing.
The interesting thing about in-memory is not only that it can expedite current database operations, but that it can create entirely new lines of business, Kelly said.
As an example, Kelly pointed to online auto parts reseller Edgenet.
Using a beta version of SQL Server 2014, "Edgenet has been able to transform its business to respond much faster to competitive threats, by enabling dynamic pricing on their website," Kelly said. The company can change prices of goods for customers in a given market, based on what the latest prices are from regional competitors, which may run spot sales on certain items.
Although dynamic pricing can be done with a standard relational database, the practice can lead to contention issues, in which updating prices slows the response to end users, who may not see the latest price immediately, Kelly said.
"In the way SQL Server does memory, it eliminates latching. So the user does not see any delays as they access the system," Kelly said. (Currently, Edgenet is going through a Chapter 11 bankruptcy protection, so perhaps the use of dynamic pricing will help the company regain its footing with the regional competitors).
Pivotal's newly launched GemFire HD extends in-memory databases to big data through its integration with the Apache Hadoop data-processing platform.
"Gemfire is essentially a SQL database in-memory that can pull data from the Hadoop File System [HDFS], or persist data down into HDFS," said Michael Cucchi, Pivotal senior director of product marketing.