Raise Your Hand and Ask: What’s a Qubit?


Note: Most people don’t want to be the uncool one to raise their hand and ask a question, but in many cases we really should. These occasional “Raise Your Hand and Ask” posts highlight cool “buzzwords” you may have heard. My aim isn’t just to explain what they mean (that you can look up), but also why they matter.

You may have heard of quantum computing, and the related term qubit, but what is a qubit?

Wikipedia describes a qubit as the quantum analogue of the classical bit. In other words, where a normal computer stores information in bits, each a 0 or a 1, a quantum computer stores information in qubits, each of which, per quantum mechanics, can be a 0, a 1, or both at the same time.

Superposition and entanglement

The “magic” in a quantum computer is that it makes direct use of quantum-mechanical phenomena, notably superposition and entanglement. This means two things for a qubit:

  • a qubit is not a 0 or a 1 until it’s observed; until then, it can be both (superposition)
  • multiple qubits can be entangled with each other, so that measuring one is instantly correlated with the others, no matter how far apart they are (entanglement)

There is nothing in our everyday lives that prepares us for these two magical properties to be real. Therefore, I know of no intuitive explanation for why these phenomena exist.
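Although nothing in everyday life makes superposition intuitive, the underlying math is concrete: a qubit’s state is a two-component complex vector, and the probabilities of reading 0 or 1 are the squared magnitudes of its amplitudes. A minimal sketch in Python with NumPy (the variable names are just for illustration):

```python
import numpy as np

# A qubit's state is a 2-component complex vector of amplitudes for |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)          # the definite state "0"

# The Hadamard gate turns |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ ket0

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5] -- a 50/50 chance of observing 0 or 1
```

Until the qubit is measured, both amplitudes are simultaneously “there”; measurement yields a single classical bit with these probabilities.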


A big challenge with quantum computing is fragility. Simply observing the state of a qubit changes it, and a qubit is also extremely difficult to isolate from outside noise that would also change its state. Solving these problems requires exotic technology and additional breakthroughs.

What can a quantum computer do?

Hype might suggest that quantum computers will replace all computers because they are so much faster. A more realistic assessment seems to be that quantum computers will be extraordinarily faster at some specific tasks, including some problems in NP (nondeterministic polynomial time, not “non-polynomial”) that are believed to lie outside P, but not extraordinarily faster for most problems. There is no known way to solve an NP-complete problem efficiently, even with a quantum computer.

We already know that there are efficient quantum solutions for discrete logarithms and factoring. Quantum simulation will also be a compelling use for quantum computers. D-Wave Systems is focused on combinatorial optimization, and they often mention the traveling salesman problem as a motivational example that is NP-hard (not NP-complete).
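To see why factoring is the famous example, here is a purely classical sketch of the number theory that Shor’s quantum factoring algorithm exploits. The quantum speedup comes entirely from finding the period r quickly; this toy code brute-forces it instead, which only works for tiny numbers:

```python
from math import gcd

def factor_via_period(N, a):
    """Classical sketch of the math behind Shor's algorithm: find the period r
    of a^x mod N, then derive factors of N from gcd(a^(r/2) +/- 1, N).
    A quantum computer finds r exponentially faster; we brute-force it."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g            # lucky: a already shares a factor with N
    # Brute-force the period: smallest r > 0 with a^r = 1 (mod N)
    r, val = 1, a % N
    while val != 1:
        val = (val * a) % N
        r += 1
    if r % 2 != 0:
        return None                 # odd period: retry with a different a
    x = pow(a, r // 2, N)
    p, q = gcd(x - 1, N), gcd(x + 1, N)
    if p > 1 and q > 1 and p * q == N:
        return p, q
    return None

print(factor_via_period(15, 7))    # (3, 5)
```

Everything here except the period search is cheap classically, which is why an efficient quantum period finder breaks factoring-based cryptography.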

How many qubits does it take to be interesting?

This seems to depend on whether you’re a purist pursuing a universal quantum computer, or you’re interested in application-specific quantum computing. Jeremy Hilton, VP of processor development at D-Wave Systems, was quoted in EE Times as saying “We believe that starting with an application-specific quantum processor is the right way to go—as a stepping stone to the Holy Grail—a universal quantum computer, and that's what D-Wave does—we just [do] optimization problems using qubits.”

This is important because D-Wave Systems has a product claiming more than 1,000 qubits; IBM and Intel recently announced chips with 20 and 17 qubits, respectively; and IBM (again) and Google are talking about 50 and 49 qubits, respectively. It seems generally accepted that D-Wave qubits are not the same as the qubits of a universal quantum computer.

At Intel’s recent HPC Developers Conference, Jim Held, Intel Fellow & Director of Emerging Technologies Research at Intel Labs, gave the following sketch for what different sizes of quantum computers are likely to bring:

  • about 50 two-state qubits: proof of concept
    • computational power exceeds supercomputers on some problems
    • learning test bed for quantum systems
  • starting around 1,000 two-state qubits: small problems
    • has only limited error correction
    • most useful for chemistry, materials design, optimization
  • starting around 1,000,000 two-state qubits: commercial scale
    • will offer fault-tolerant operation
    • this is when cryptography by quantum computers is interesting (some cryptographic algorithms we rely on today become relatively easy to break, but not all)
    • useful for machine learning and more

A key reason for needing so many qubits is dealing with their fragility. At 1,000 qubits, there is only limited error correction, but at 1,000,000 the system has fault tolerance, which is a key to why it can become fairly general purpose.
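A classical repetition code gives a feel for why error correction multiplies the qubit count. Real quantum codes (such as the surface code) are more involved, since a qubit can’t simply be copied, but the overhead principle is similar: many noisy physical units per reliable logical unit. A small simulation, with an assumed 5% flip rate:

```python
import random

def encode(bit):
    """Protect one logical bit as three physical copies (repetition code)."""
    return [bit] * 3

def decode(bits):
    """Recover the logical bit by majority vote."""
    return 1 if sum(bits) >= 2 else 0

rng = random.Random(0)
p = 0.05              # assumed per-bit flip probability in the noisy channel
trials = 100_000

raw_errors = coded_errors = 0
for _ in range(trials):
    # Unprotected bit: an error whenever the single copy flips.
    raw_errors += rng.random() < p
    # Protected bit: each of the three copies flips independently;
    # the logical bit is wrong only if two or more copies flip.
    received = [b ^ (rng.random() < p) for b in encode(0)]
    coded_errors += decode(received) != 0

print(raw_errors / trials)      # about p = 0.05
print(coded_errors / trials)    # about 3p^2, roughly 0.007
```

Triple the hardware cuts the error rate from p to roughly 3p²; quantum codes pay a far larger multiple, which is one reason the jump from thousands to millions of physical qubits matters.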

If tens of qubits are hard, how do we get to millions of qubits?

There are many approaches being pursued to tackle this problem. One approach is to reject the notion that we are limited to qubits (which have two states plus superposition) and adopt a unit with more possible states. A recent publication focused on supporting “qudits,” each of which can assume 10 or more states (think of it as a 10-state qubit). Other approaches look to a higher density of qubits, possibly relying on the spin of a single electron. With multiple approaches being researched, how long will it be until we see millions of qubits?
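The appeal of multi-state qudits is dimensional: n ten-level qudits span a 10^n-dimensional state space, versus 2^n for the same number of qubits. A quick sketch (the numbers here are illustrative, not from any specific device):

```python
import numpy as np

d, n = 10, 5                       # assume 10-level qudits, five of them
rng = np.random.default_rng(1)

# A single qudit's state is a d-component complex vector, normalized so the
# measurement probabilities (squared magnitudes) sum to 1.
amps = rng.normal(size=d) + 1j * rng.normal(size=d)
state = amps / np.linalg.norm(amps)
total = (np.abs(state) ** 2).sum()

print(total)            # 1.0 (up to floating-point rounding)
print(d ** n, 2 ** n)   # 100000 vs. 32 dimensions for five units
```

So each extra level per unit buys state space without adding more fragile physical devices.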

Qubit: a rating for how-powerful-is-your-quantum-computer?

Quantum computing is defined by the common pursuit to harness quantum effects to build a new—and potentially superior—type of computer. I started this article with the concept that a qubit is to quantum computing as a bit is to regular computing. I’ll finish by suggesting that the real significance of a qubit is as a measure of the capabilities of a quantum computer, and quantum computers will change our lives.

Watch for some amazing results along the journey; but for the complete disruption of computing as we know it, we’ll need the equivalent of millions of two-state qubits in a computer that is truly harnessing quantum effects. With so many experts pursuing large-scale quantum computing, and breakthroughs in both density and error correction needed, it’s impossible to say if this will happen in a few years or a few decades. It does seem reasonable to posit that it’s coming, although not everyone agrees. Time will tell.




Copyright © 2017 IDG Communications, Inc.