Quantum computers, once they become a reality, are expected to provide the capability to perform parallel computations on a massive scale. They are the subject of intense research in two directions. The first is experimental physics, where scientists build hardware systems in which qubits, the data storage units such as electrons or atoms, can assume two quantum states simultaneously and perform computations. The second is theoretical, where researchers develop the algorithms that will form the basis of software for quantum computers. The latter approach was the subject of the EU-funded QCS (Quantum Computer Science) project, supported under the FET scheme and completed in August 2013. The project focused on the mathematical aspects of quantum computing. Andris Ambainis, a theoretical computer scientist at the University of Latvia in Riga and the principal coordinator of the project, talks to youris.com about the future of quantum computing.

What are the most important results of this collaboration?

One achievement is the development of a new method for devising algorithms for quantum computers. We have used these algorithms to solve some long-standing open problems in traditional computer science, some of which date back to the late 1980s. Our main tool is mathematical reasoning about quantum computers: we first devise quantum algorithms, and then use mathematical proofs to show that they work correctly.

Are you sure that these algorithms will really work on future quantum computers?

They certainly will. I hear about the progress happening in laboratories: people are now capable of performing quantum operations with a [hardware] precision of 99.9% and more. That is a significant advance from 15 or 20 years ago, when precision was 90% or 95%.
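To give a loose sense of why crossing a precision threshold matters, the sketch below simulates the simplest classical analogue of an error correction code: a three-way repetition code with majority-vote decoding. This toy model, its function names, and the rough 3p² logical-error figure are illustrative assumptions for this article, not part of the QCS project's work; real quantum codes protect quantum states and are considerably more involved.

```python
# Classical analogy for error correction codes: store each bit three
# times and decode by majority vote. A stored bit is lost only if at
# least two of its three copies flip, so a per-copy error rate p
# becomes a logical error rate of roughly 3 * p**2.

import random

def encode(bit):
    """Triplicate one bit."""
    return [bit, bit, bit]

def noisy_channel(bits, flip_prob):
    """Flip each bit independently with probability flip_prob."""
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits):
    """Majority vote over the three copies."""
    return 1 if sum(bits) >= 2 else 0

random.seed(0)
trials = 100_000
p = 0.01  # i.e. 99% per-operation precision
errors = sum(decode(noisy_channel(encode(0), p)) != 0 for _ in range(trials))
print(errors / trials)  # roughly 3 * p**2, well below the raw rate p
```

The point of the analogy: encoding buys a lower effective error rate only when the raw rate is already small, which mirrors the threshold behaviour described for quantum error correction.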
Now, we can start doing bigger quantum computations on quantum computers with larger numbers of qubits. Reaching 99.9% accuracy gets us to a level where we can use so-called error correction codes, which encode quantum information in such a way that it is protected against errors, to improve the precision further. Below a precision of about 99%, error correction codes will not work.

Factoring, that is, finding the two prime numbers whose product makes up the public encryption keys used by banks in secured communications, is currently virtually impossible because of the size of the encryption keys. Quantum computers would do the job easily, but how many qubits would you need?

Probably a few thousand. The typical length of an encryption key is about 1,000 bits. To factor such a number, a quantum computer would need at least twice that many qubits to allow for storing the intermediate results. So 2,000 qubits would be enough, or perhaps a few thousand more.

Are quantum computers going to revolutionise encrypted communications in the banking world?

Yes, they will change everything. They will require switching encryption to one of several varieties of systems that are not based on prime numbers. There are encryption systems based on error correction codes, or on the difficulty of solving polynomial equations, that is, equations whose variables carry only non-negative integer exponents and are combined using addition, subtraction, and multiplication. People are working in this field, which is called post-quantum cryptography. One such system is McEliece encryption. It is based on error correction codes and has been around since the 1980s. Now this system is really starting to become practical.

by Alexander Hellemans
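The interview's factoring arithmetic can be sketched as follows. The trial-division routine and the 101 × 103 toy semiprime are illustrative assumptions, not the methods banks or codebreakers actually use; even the best classical factoring algorithms become infeasible as keys grow, which is the point of the contrast.

```python
# Toy illustration of the classical side of the problem: trial division
# factors small semiprimes instantly, but its running time grows with
# the square root of the smallest prime factor, which is astronomical
# for the ~1,000-bit keys mentioned in the interview.

def trial_division(n):
    """Return the smallest nontrivial factor of n (or n itself if prime)."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n

print(trial_division(101 * 103))  # -> 101, found instantly

# The qubit estimate from the interview: roughly the key length itself,
# plus the same again for storing intermediate results.
key_bits = 1000
print(2 * key_bits)  # -> 2000
```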