When big data gets too big, this machine-learning algorithm may be the answer


Results of an analysis that would be too complex for conventional techniques but can be handled easily by the new quantum approach.

Credit: Courtesy of the researchers/MIT

Combining topology and quantum computing, it promises to put huge analyses within closer reach


Big data may hold a world of untapped potential, but what happens when your data set is bigger than your processing power can handle? A new algorithm that taps quantum computing may be able to help.

That's according to researchers from MIT, the University of Waterloo and the University of Southern California who published a paper Monday describing a new approach to handling massively complex problems. By combining quantum computing and topology -- a branch of geometry -- the new machine-learning algorithm can streamline highly complex problems and put solutions within closer reach.

Topology focuses on properties that stay the same even when something is bent and stretched, and it's particularly useful for analyzing the connections in complex networks such as the U.S. power grid or the global interconnections of the Internet. It can also help zero in on the most important features of a massive set of data.

The downside of topological analysis is that it's computationally very expensive, but that's where the researchers say quantum mechanics can help.

Here's an example: Say you have a data set with 300 points. A traditional approach to analyzing all the topological features in that system would require "a computer the size of the universe," noted Seth Lloyd, the paper's lead author and the Nam P. Suh Professor of Mechanical Engineering at MIT.

In other words, analyzing every combination of features among 300 data points would mean tracking on the order of 2^300 quantities -- far more than the roughly 10^80 particles estimated to exist in the observable universe -- effectively making the problem impossible to solve on classical hardware.
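As a back-of-the-envelope check of that scale (illustrative only; the 10^80 figure is a commonly cited rough estimate, not from the paper):

```python
# Tracking every combination of 300 data points means on the order of
# 2**300 quantities -- compare that to a rough estimate of the number
# of particles in the observable universe.
combinations = 2 ** 300            # subsets of a 300-point data set
particles_in_universe = 10 ** 80   # commonly cited rough estimate

print(f"2^300 is about {combinations:.2e}")
print(f"that's ~{combinations / particles_in_universe:.2e} x more than the particle count")
```

Even granting one processing unit per particle, the classical approach falls short by about ten orders of magnitude.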

Tackling the same problem with the new algorithm and a quantum computer, on the other hand, would be far more manageable. In quantum computing, information is represented by quantum bits, or qubits -- similar to the binary bits used in digital computing, but able to occupy not just a "0" or "1" state but both at once. Rather than a digital computer the size of the entire universe, the 300-point data set would require a quantum computer with just 300 qubits, and devices of that size could be available in the next few years, according to Lloyd.
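The reason so few qubits suffice comes down to how quantum states scale: describing an n-qubit state classically takes 2^n complex numbers, so each added qubit doubles the state space. A minimal sketch of that scaling (this is just an illustration of qubit state space, not the researchers' algorithm):

```python
import numpy as np

def uniform_superposition(n):
    """Return the n-qubit state with equal amplitude on all 2**n basis states."""
    dim = 2 ** n
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

# Each added qubit doubles the number of amplitudes needed to
# describe the state classically.
for n in (1, 2, 10):
    state = uniform_superposition(n)
    print(f"{n} qubit(s) -> {len(state)} amplitudes")
```

At n = 300, that exponential blow-up is exactly what makes the classical computation intractable -- and what a 300-qubit machine would represent natively.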

"Our algorithm shows that you don't need a big quantum computer to kick some serious topological butt," he said.

The same approach could be used for analyzing the world's economy, social networks or "almost any system that involves long-range transport of goods or information," Lloyd said. Proof-of-concept experiments are already under way.
