What will the new quantum computer programming logic look like?


There is a lot of talk about quantum computers with high performance and processing power. Unlike classical bits, the qubits of a quantum computer work with superposition: they can assume the value 0, the value 1, or both simultaneously.

I know this is a broad question, but I would just like a basic explanation of the subject. I have read something about it, but since it is a complex topic, it was not clear to me.

What will be the new programming logic for these computers?

Will classic codes be reused?

asked by anonymous 20.10.2015 / 18:40

2 answers


The difference will not appear (at least for a considerable amount of time) in the code written by us mortal programmers.

Since we live in a physical, non-quantum world, our applications are most often written to solve physical problems. We currently use bit-based computing, where each bit can hold only one of two physical states (0 | 1), called classical states. We have learned to combine these bits into elements with human meaning, such as numbers.

Quantum computing radically changes this concept: since a qubit exists in a quantum state, and there are infinitely many quantum states, there is no way to represent these states in the classical model without performing a measurement. Anyone who has read or studied quantum mechanics knows that measuring changes the state to the measured value, with each possible outcome carrying a certain probability. It is on these probabilities that quantum computing operates, and it is there that the algorithms will do their work.
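To make the idea of measurement concrete, here is a minimal sketch in plain Python (no quantum libraries, everything here is an illustrative toy): a single qubit is represented by two complex amplitudes, measuring returns 0 or 1 with probability equal to the squared magnitude of each amplitude, and the state collapses to the observed value.

```python
import random

def measure(amp0, amp1):
    """Measure a qubit in the state amp0*|0> + amp1*|1>.

    Returns the observed bit and the collapsed state; after the
    measurement the superposition is gone.
    """
    p0 = abs(amp0) ** 2                 # probability of reading 0
    outcome = 0 if random.random() < p0 else 1
    collapsed = (1, 0) if outcome == 0 else (0, 1)
    return outcome, collapsed

# Equal superposition: each outcome has a 50% chance, and repeating
# the measurement on the collapsed state always repeats the result.
amp = 1 / 2 ** 0.5
outcome, state = measure(amp, amp)
```

Running `measure(amp, amp)` many times gives roughly half zeros and half ones, which is exactly the probability margin the answer describes.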

With this in mind, we know that computers working with classical states will remain efficient for operations such as comparison and addition.

Now, for probabilistic operations, quantum computing will be key. A practical example is weather forecasting. Current algorithms (called mathematical models) do probabilistic calculations using neural-network concepts, dividing the problem into steps and computing the probability of each step, always choosing the most likely one. To see other model outcomes, you need to change the initial values to indicate which path to follow. In the quantum model this would happen simultaneously, i.e., all the probabilities (assuming there are enough qubits) would be calculated at the same time.

With this in mind, I believe we will not radically change the way we program, but rather how we obtain the results, unless our mathematics changes (evolves) to the point where we need new ways of thinking about code.

For those reading this without knowing anything about the subject: there is already a commercial quantum computer, called D-Wave, and just as with computers at the beginning of the computer age, it is being used more for scientific than commercial purposes, but that may change soon.

To learn more about quantum computing, I recommend this reading.

20.10.2015 / 20:25

The logic of a quantum computer is completely different!

First, every function used in a quantum computer must be reversible, i.e., from the output it must be possible to regenerate the input. Most conventional computer algorithms break this rule. For example, addition is not reversible: if you add 2 + 7 you get 9, but it is impossible to write a function that takes 9 and returns 2 and 7, because some of the information was lost in the process.

So the conventional addition, subtraction, multiplication, and division operations are not possible on a quantum computer. This restriction also implies that any quantum function must have the same number of input bits and output bits. This is not a limitation, since reversible versions exist for all computable functions, but it does require a different way of thinking about solving a problem.
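A small sketch of what a reversible version looks like (classical Python, just to illustrate the idea): if addition also keeps one of its operands, nothing is lost, the input and output have the same number of values, and the inverse function can reconstruct the original arguments.

```python
def add_rev(a, b):
    """Reversible addition: carry one operand along so no information is lost."""
    return a, a + b

def add_rev_inverse(a, s):
    """Undo add_rev: recover the original operands from its output."""
    return a, s - a

# Plain addition of 2 + 7 gives only 9 and the inputs are gone;
# the reversible version gives (2, 9), from which (2, 7) is recoverable.
out = add_rev(2, 7)            # (2, 9)
back = add_rev_inverse(*out)   # (2, 7)
```

This is the same trick reversible logic uses: pad the function with enough extra outputs that it becomes a bijection, at the cost of carrying those values through the computation.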

Another very different property is that in the middle of a quantum program no bit can be modified, copied, or deleted; such an operation would "ruin" the qubits! Moreover, a quantum function can have no loops, no ifs, nor any kind of conventional flow control: one operation runs after another, sequentially. And quantum operations change all the bits at once, in a way that would require many normal operations to reproduce, forming a new set of basic instructions that is much more efficient for some types of problems.
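The "change all bits at once" point can be sketched with ordinary matrices (a toy simulation in plain Python, not how real hardware works): a quantum operation is a matrix applied to the whole state vector in a single step. Two qubits already mean four amplitudes, and applying the X gate (quantum NOT) to the first qubit permutes all of them simultaneously.

```python
def apply_gate(matrix, state):
    """Apply a gate matrix to the full state vector: every amplitude
    is transformed in one step."""
    n = len(state)
    return [sum(matrix[i][j] * state[j] for j in range(n)) for i in range(n)]

# X on the first qubit of a 2-qubit register: |ab> -> |(1-a)b>,
# i.e. it swaps the amplitudes of |00>/|10> and of |01>/|11>.
X_ON_FIRST = [
    [0, 0, 1, 0],
    [0, 0, 0, 1],
    [1, 0, 0, 0],
    [0, 1, 0, 0],
]

state = [1, 0, 0, 0]                    # the register starts in |00>
state = apply_gate(X_ON_FIRST, state)   # now entirely in |10>
```

Note that this matrix is its own inverse (applying it twice returns the original state), which is exactly the reversibility requirement from above: quantum gates are unitary, so every one of them can be undone.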

If the quantum computer becomes viable, the most accepted idea is that programming will continue much as it is, but with quantum functions! A conventional computer will write the input bits into the quantum computer's memory and signal it to start the function. Once the result is calculated, it just reads the output values. I imagine there will be libraries of quantum functions to use in our code, and this should be transparent to the programmer.

The quantum computer is extremely fast at the discrete Fourier transform, solving it with complexity O(n^2). On a conventional computer this operation has complexity O(n * 2^n), much greater. Building on this, in 1994 the mathematician Peter Shor devised a quantum algorithm with complexity O(n^3) to factor a number using the quantum Fourier transform. After that, interest in quantum computers grew enormously, because factoring a number at that speed would break many of today's cryptographic systems.

I tried to cover the most important points; it is very difficult to explain this while keeping it as basic as possible! A more detailed discussion easily slides into deep existential philosophy and the strangeness of the universe, which would not help much in answering the question :)

03.05.2016 / 14:47