As we march towards an age of true quantum supremacy, in which quantum computers could complete highly complex tasks in a fraction of the time classical machines require, scientists are trying their best to get a grasp of that future.

As is the case with most scientific and engineering subjects, simulations can come in extremely handy.

A team of physicists from the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland and Columbia University in the US worked on a way to model the potential of near-term quantum devices – by simulating the quantum mechanics they depend upon on more conventional hardware.

The study relied on a neural network created by EPFL's Giuseppe Carleo and colleague Matthias Troyer five years ago, which used machine learning to build an approximation of a quantum system tasked with running a particular process.

The algorithm is known as the Quantum Approximate Optimization Algorithm (QAOA), and it serves as a means of identifying, among many possible energy states, the optimal solution to a problem – the solution that should produce the fewest errors when used.
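To make the kind of problem QAOA targets concrete, here is a minimal classical sketch: a toy cost function (hypothetical, not taken from the study) assigns an "energy" to each bitstring, and we search for the lowest-energy one by brute force – the same answer a QAOA run would try to approximate far more efficiently on larger instances.

```python
from itertools import product

def energy(bits):
    # Toy Ising-like cost: penalize each pair of neighbouring
    # bits that disagree. QAOA-style optimization seeks the
    # bitstring that minimizes such an energy.
    return sum(1 for a, b in zip(bits, bits[1:]) if a != b)

# Brute force over all 2**3 bitstrings (only feasible for tiny problems).
best = min(product([0, 1], repeat=3), key=energy)
print(best, energy(best))  # → (0, 0, 0) 0
```

The brute-force search scales exponentially with the number of bits, which is precisely why approximate quantum approaches like QAOA are of interest.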

“There is a lot of interest in understanding what problems can be solved efficiently by a quantum computer, and QAOA is one of the more prominent candidates,” said Carleo.

The result was only an approximation of how the algorithm would behave on a real quantum computer, but it did an impressive job of mimicking the real deal.

Quantum processors rely on units of calculation known as qubits. A qubit behaves like a wave of probabilities, meaning it doesn't have a single defined state. On its own, though, a qubit can still be modelled with a fairly straightforward equation.

If many qubits are linked together through a process called entanglement, the equation describing the system becomes significantly more complex.

As the number of linked qubits increases, the number of possible states grows exponentially compared with regular bits of binary code.
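The scaling above is easy to see in numbers. This short sketch (an illustration, not the researchers' method) counts the complex amplitudes a classical simulator must store for an n-qubit state – 2**n of them – and the memory that implies at 16 bytes per double-precision complex number:

```python
# A fully general n-qubit state is described by 2**n complex
# amplitudes, so classical memory needs double with every qubit added.
BYTES_PER_AMPLITUDE = 16  # one complex number in double precision

for n in [1, 2, 10, 30]:
    amplitudes = 2 ** n
    print(f"{n:>2} qubits: {amplitudes} amplitudes, "
          f"{amplitudes * BYTES_PER_AMPLITUDE} bytes")
```

At 30 qubits the state already occupies about 16 GB, which is why approximate neural-network representations like Carleo and Troyer's are attractive for modelling near-term devices.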

“But the barrier of ‘quantum speedup’ is all but rigid and it is being continuously reshaped by new research, also thanks to the progress in the development of more efficient classical algorithms,” added Carleo.