By Hamish Johnston
…and how do we build one?
The two physicists — based at the University of Sheffield — have proposed an updated version of David DiVincenzo’s checklist for what makes a physical system suitable for quantum computing.
According to DiVincenzo, such a system must:
1. Be a scalable physical system with well-defined qubits
2. Be initializable to a simple fiducial state such as |000…⟩
3. Have decoherence times much longer than gate operation times
4. Have a universal set of quantum gates
5. Permit high quantum efficiency, qubit-specific measurements
6. Have the ability to interconvert stationary and flying qubits
7. Have the ability to faithfully transmit flying qubits between specific locations
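As a toy numerical sketch (my own illustration, not from DiVincenzo's papers), criteria 2 and 4 can be seen in miniature: initialize a two-qubit register to the fiducial state |00⟩, then apply gates drawn from a universal set (here a Hadamard and a CNOT, which on their own already generate entanglement):

```python
import numpy as np

# Single-qubit gates
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard
I = np.eye(2)

# Two-qubit CNOT (control = first qubit)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Criterion 2: initialize the memory to the fiducial state |00>
state = np.array([1.0, 0.0, 0.0, 0.0])

# Criterion 4: controlled evolution under gates from a universal set
# (H and CNOT shown; adding a T gate would make the set universal)
state = CNOT @ np.kron(H, I) @ state

print(state)  # ~[0.707, 0, 0, 0.707]: the entangled Bell state
```

The output is the Bell state (|00⟩ + |11⟩)/√2, the simplest demonstration that the gate set can build entanglement across the quantum memory.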
The first five criteria were proposed in 1996; the final two were added in 2000 to capture the distinction between stationary and “flying” qubits — the latter being photons or other particles that can carry quantum information from place to place.
In their paper, Perez-Delgado and Kok argue that the above criteria are not general enough to evaluate the various “paradigms” for quantum computing that have emerged since 2000.
Instead, they propose the following four criteria for a “scalable and fault-tolerant quantum computer”:
1. Any quantum computer must have a quantum memory.
2. Any quantum computer must facilitate a controlled quantum evolution of the quantum memory.
3. Any quantum computer must include a method for cooling the quantum memory.
4. Any quantum computer must provide a readout mechanism for (non-empty) subsets of the quantum memory.
Criteria 1, 2 and 4 seem reasonable enough, but what do the authors mean by “cooling”?
By cooling they mean the removal of entropy (randomness, in the information-theoretic sense) from the quantum memory.
Entropy leaks into a quantum memory as it interacts in unwanted and uncontrollable ways with its surroundings. Entropy is also generated when a quantum memory is “erased” so that the next computation can begin.
Although this cooling could be split into “error correction” (for the leakage) and “initialization” (for the erasure), the authors argue that there is a certain “fuzziness” between the two processes. I believe this is because initialization is often a multi-step procedure that itself involves error correction.
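The entropy in question can be made concrete with von Neumann entropy, S(ρ) = −Tr(ρ log₂ ρ). A minimal sketch (my own, assuming a simple depolarizing-noise model that is not in the article): a freshly initialized qubit has zero entropy, noise from the surroundings raises it, and “cooling” (resetting) dumps that entropy back into the environment:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), in bits, for a density matrix rho."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]  # drop (numerically) zero eigenvalues
    return float(-np.sum(eigvals * np.log2(eigvals)))

# A freshly initialized qubit |0><0|: a "cold", zero-entropy memory
rho_pure = np.array([[1.0, 0.0], [0.0, 0.0]])

# Depolarizing noise mixes in the maximally mixed state I/2,
# modelling unwanted interaction with the surroundings
p = 0.5
rho_noisy = (1 - p) * rho_pure + p * np.eye(2) / 2

print(von_neumann_entropy(rho_pure))   # 0.0
print(von_neumann_entropy(rho_noisy))  # ~0.81 bits have leaked in

# "Cooling" the memory (reset/initialization) restores |0><0|,
# pushing the accumulated entropy out into the environment
rho_reset = np.array([[1.0, 0.0], [0.0, 0.0]])
print(von_neumann_entropy(rho_reset))  # 0.0 again
```

The point of the authors' criterion 3 is that any real machine needs some physical mechanism that performs this last step continuously, whether we call it error correction or initialization.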
I’m not a quantum-computing expert, but I’m guessing that criterion 3 will be the most difficult to satisfy…