To be, and not to be—is that the answer?

If you believe much of the current reporting on quantum computing, it's just around the corner. Here's how quantum computing works, and how realistic it is that it will be available any time soon.


So, when you have a 64-qubit quantum computer, measuring the computation result gives you a pretty much random 64-bit digital number, unless your quantum algorithm ensures that the probability of reading a wrong answer is zero. The steps of the algorithm must drive the combined state of all those qubits to one where every qubit that should be 0 in the correct answer is 0 with 100 percent certainty, and every qubit that should read as 1 in the correct answer is 1 with 100 percent certainty. Unless your quantum algorithm makes sure all the wrong answers have zero probability of being the read result, measuring the state of your quantum computer (reading the answer) will likely give you some wrong, random result. This is the main reason why the intuitive but false explanation above, with its number 18,446,744,073,709,551,616, is wrong. Your quantum algorithm, made up of a succession of quantum operations on those qubits, needs to make sure that the chance of reading (measuring) a wrong answer is (almost) zero.
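As a minimal sketch of that readout rule (plain Python on a toy 2-qubit register with made-up amplitude values; a real quantum state assigns a complex amplitude to every basis state, and the chance of reading a state is the squared magnitude of its amplitude):

```python
import random

# A 2-qubit register has 4 basis states: 00, 01, 10, 11.
# Each carries an amplitude; the probability of reading a given
# state is the squared magnitude of that amplitude (the Born rule).

def measure(amplitudes):
    """Sample one basis state according to the Born rule."""
    probs = [abs(a) ** 2 for a in amplitudes]
    return random.choices(range(len(amplitudes)), weights=probs)[0]

# Uniform superposition: every 2-bit answer is equally likely,
# so a readout is just a random number.
uniform = [0.5, 0.5, 0.5, 0.5]

# What a good algorithm must produce: all "wrong" amplitudes
# driven to zero, so the right answer (here, state 2 = binary 10)
# is read with 100 percent certainty.
certain = [0.0, 0.0, 1.0, 0.0]

print(measure(uniform))   # any of 0..3
print(measure(certain))   # always 2
```

With 64 qubits the list would have 18,446,744,073,709,551,616 entries instead of 4, which is exactly why "it tries all answers at once" is not the same as "you can read all answers at once."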

So, what you need is an algorithm that does precisely this: perform quantum operations on your qubits such that, because all the wrong states have canceled out during operation, only a right answer can be measured (almost) deterministically at the end. This is possible because the waves that define a qubit may interfere with each other, based on the operations performed on the qubit. These interferences may be destructive (canceling out) or constructive (boosting), just as any wave can interfere with another wave. That way, operations on a qubit can drive it to either a 0 percent or a 100 percent probability of being measured in a certain state. A qubit interferes with itself and with the operations performed on it. An operation on a qubit, by the way, is something like shooting a laser or a radio wave at it.
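A toy illustration of that cancellation, using the standard single-qubit Hadamard operation (this is a classical simulation of the arithmetic, not real hardware): applying it once creates a 50/50 superposition, and applying it again makes the two contributions to the 1-state cancel exactly, so the qubit reads 0 with certainty.

```python
import math

# One qubit is a pair of amplitudes: (amplitude of 0, amplitude of 1).
# The Hadamard operation mixes the two amplitudes.

def hadamard(state):
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

state = (1.0, 0.0)        # start as a definite 0
state = hadamard(state)   # now (0.707, 0.707): a 50/50 superposition
state = hadamard(state)   # the 1-contributions cancel destructively
print(state)              # back to (1.0, 0.0): reads 0 with certainty
```

That destructive cancellation of the unwanted amplitude is, in miniature, what a useful quantum algorithm must arrange for every wrong answer across the whole register.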

And this need for algorithms that cancel out wrong answers really limits what you can do with quantum computers, because there are only a few specific problems for which such algorithms exist. One such algorithm is famous: Shor's Algorithm, a quantum algorithm that can factor integers in reasonable time. Given that much of encryption, and thus security, assumes that it takes an impossible amount of time to factor very large integers (say, the 1,300 digits of modern-day encryption values), this quantum algorithm has of course sparked a lot of interest. But there are not many others, and it is unclear if there ever will be many.
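To give an idea of how Shor's Algorithm is put together: it reduces factoring N to finding the period of a^x mod N, and only that period-finding step runs on the quantum computer. The sketch below shows the classical skeleton only, with the quantum step replaced by brute force, so it demonstrates the structure but none of the speedup:

```python
from math import gcd

# Shor's algorithm: factoring N reduces to finding the period r of
# f(x) = a^x mod N. Only the period finding is quantum; everything
# else is classical post-processing. Here the period is found by
# brute force, purely to show the idea.

def find_period(a, N):
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 21, 2                      # a must share no factor with N
r = find_period(a, N)             # period of 2^x mod 21 is 6
f1 = gcd(a ** (r // 2) - 1, N)    # gcd(7, 21) = 7
f2 = gcd(a ** (r // 2) + 1, N)    # gcd(9, 21) = 3
print(r, f1, f2)                  # 6 7 3, and indeed 7 * 3 = 21
```

The brute-force loop takes time exponential in the number of digits of N; the quantum period finding is what brings that down to reasonable time for the 1,300-digit case.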

To drive the message home: a quantum computer is not some sort of superfast general computer. It is a computer for a very limited and specific set of problems, and there are only a very few algorithms that may do something useful. What problems? Which algorithms? Much of that is still up in the air. The most likely use is probably emulation (calculation) of quantum physics itself, as those calculations often stymie digital computers.

Where we are today

I’m ignoring the D-Wave “quantum annealing” system in this story, mainly because it is unclear whether such a system actually delivers any speedup from true quantum mechanical effects; researchers so far have come up empty. It is questionable whether it is really a quantum computer, even if it might use some quantum mechanical effects. It might be better seen as dedicated hardware (potentially quantum mechanical in nature) to solve a useful but very limited type of calculation. To me, though I’m no expert, its optimum-seeking behavior evokes more the image of an analog computer.

So, where do we stand in terms of actual, real quantum computing? Well, in 2011 Shor’s Algorithm was able to factor 21 into 3*7 using 2 qubits. Wow, Bob, wow. With a mechanism that is better than Shor’s Algorithm, a four-qubit quantum computer has been able to factor the number 143 (11*13). Later, researchers showed that that algorithm could actually factor larger numbers, as long as the factors differed in only two bits (what researchers then called “a special class of numbers”). As of 2014, the largest number factored was 56,153, though with a different quantum computing technology than the one used for Shor’s. We’re still really far from factoring, in reasonable time, the 1,300-digit numbers at the root of today’s encryption.

IBM recently published research showing that there is a class of problems for which you do not need as much depth (number of steps) on a quantum computer as you do for the same problem on a digital computer. This has widely been reported on the net as “IBM proves a quantum computing advantage over digital computing,” and you may walk away with the idea that quantum computing is around the corner. But in fact, what IBM has found is, again, a small set of problems that may not require as much depth on quantum computing as on digital computing. Yes, fewer steps are an advantage. But a small set (of what already is a small set) is a definite disadvantage. And is there actually a problem in that set that has a practical application? We don’t know. Maybe. If we are lucky.

Most quantum algorithms still run on simulations of actual quantum computing hardware. These simulations run on digital computers, which do not have the capability of truly simulating the underlying hardware: digital computers are limited both with respect to quantum’s superpositions (so the simulations are slow) and with respect to the actual real math that is required to model the underlying physics.
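Why these simulations are slow is simple arithmetic: an n-qubit state is 2^n complex amplitudes, so memory alone explodes exponentially. A back-of-the-envelope sketch, assuming 16 bytes per amplitude (two double-precision floats):

```python
# Storing the full state of n qubits on a digital computer takes
# 2**n complex amplitudes at roughly 16 bytes each.

for n in (10, 30, 50, 64):
    bytes_needed = (2 ** n) * 16
    print(n, "qubits:", bytes_needed / 2 ** 30, "GiB")
```

Around 30 qubits you need 16 GiB (a beefy desktop); at 50 qubits you need about 16 million GiB; and a full 64-qubit state is far beyond all storage on Earth. That is why simulators cap out in the mid-forties of qubits, and why the simulation step itself is no substitute for the hardware.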

And what about actual quantum computing hardware? Well, Google is testing a 72-qubit quantum processor with a 1 percent readout error rate, and it hopes to be able to keep it coherent long enough to get over 40 steps (depth) out of it. That may give Google a quantum computer that, for a very specific type of problem, would outperform a digital computer for the first time. Sounds great, doesn’t it?

Not so fast. Have a look at the following image from Google's AI blog:

[Image: number of qubits versus error rate, from Google’s AI blog]

The green area is where actual, useful, error-corrected quantum computing lives. We need about 100,000 times as many qubits per processor, held in a coherent state for an extended time period, for algorithms to have some depth. You also need one to two orders of magnitude of improvement in the error rate. The blue area might be in reach, however, so some very specific problems may see quantum computing outperform digital computing sooner. Again, the graph is what Google intends, not what it has done. A bit like still not having solved that “labeling dark people as gorillas” problem in three years.

So, not only is what we see now mostly intentions; there are several other hurdles and important notes.

  • We may be up against fundamental physical limitations, comparable with the hard limit of light speed. That is, can we overcome decoherence at larger scales? In the words of a 2012 paper: “Is controlling large-scale quantum systems merely really, really hard, or is it ridiculously hard?” The paper follows up with: “In the former case, we might succeed in building large-scale quantum computers after a few decades of very hard work. In the latter case, we might not succeed for centuries, if ever.”
  • What kind of problems are actually feasible to calculate using quantum algorithms (and better than digital)? The set is interesting, but also limited. A quantum computer will not be a general-purpose computer; it will have specific advantages for specific mathematical problems. From that same article on one such class of problems: “It seems that ... speedups are possible only for problems with special structure well matched to the power of a quantum computer.” In fact, what quantum computers may be best at is doing calculations about quantum physics. A bit like the analog computers that were electronic analogs for mechanical problems (such as target calculation for large guns), and definitely not what is in the mind of hype-peddlers.
  • There are problems where you cannot check the correct operation of a quantum computer at all, other than by running it on another quantum computer or performing a real-world experiment. That kind of sounds like the hidden-proxy problem of my previous article. Yes, the answer is 42, but we have no idea what it means.
  • The necessity of error correction is one of those areas where we quickly descend into the edge of knowledge or even unknown physics. To create states that both show quantum coherence and stability, theorists dive into worlds of two-dimensional quasiparticles or even worlds with four spatial dimensions. Um.
  • And even without that, assumptions that have been made based on approximating mathematical models (physicists almost always have to throw out “negligible” terms or they can’t calculate anything at all) may not hold in a reality where these “negligible” terms interact and are not independent. This may have detrimental effects on the possibility of real quantum error correction. Things may be much harder in reality than approximate calculations have shown so far (from a January 2019 article by some very smart Finns and Germans). Another example of how using approximations (just like the ones digital computers use to approximate real numbers) can be fraught with danger.

As it turns out, some of the open questions require physics that we don’t have yet, and trying to build quantum computing may in a way be seen as an experiment to get at new underlying fundamental physics regarding the reality we are part of, only formulated in a way that large companies such as Microsoft, Google, and IBM are willing to put a lot of money into. You have to admire the physicists for their ability to get money for physics. Or maybe it is just that, in the end, every physical subject (including computing machinery) becomes a physics subject.

So, when you encounter statements like “before 2028, quantum computers will transform IT and logistics,” it’s best not to put your hopes too high for that foreseeable time period. I’m not a big fan of trying to predict the future (you’re lying, even if you’re telling the truth), but this is probably going to take decades of hard work, and success is far from certain. That statement is pure hype. Really.

Copyright © 2019 IDG Communications, Inc.
