Researchers from the University of Sydney have recently released an algorithm that aims to improve the implementation of large-scale quantum computing. The algorithm is designed to characterise what is broadly referred to in the field as noise, allowing accurate quantum states to be distinguished from the surrounding useless data and enabling quantum computing applications to be scaled up.
Dr Robin Harper of the research team has said that noise is the central obstacle to building large-scale quantum computers, and that once noise is accounted for, or at least mitigated, the efficiency of quantum computing will improve substantially.
Taming noise has proven difficult as quantum computing scales up. On small devices it is comparatively simple to account for the interference of noise, but extrapolating to large-scale applications requires a more nuanced understanding of how noise affects these systems.
The research paper outlines an algorithmic protocol that efficiently and accurately estimates the error rates of quantum noise in a large-scale quantum computing system. It produces an estimate of the effective noise and enables the discovery of long-range two-qubit correlations that had remained undetectable before the application of this noise-identifying algorithm.
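To give a flavour of what "discovering two-qubit correlations" means, here is a minimal toy sketch (not the authors' protocol) that simulates a register where two distant qubits share a hidden correlated error source, then recovers that correlation from measured error statistics. The model parameters (`n`, `p`, `p_corr`, the choice of qubits 0 and 5) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy assumption: n qubits, each flipping independently with probability p,
# plus a shared error source that flips qubits 0 and 5 together with
# probability p_corr -- mimicking a long-range two-qubit correlation.
n, p, p_corr, shots = 8, 0.05, 0.03, 20000

independent = rng.random((shots, n)) < p
shared = (rng.random(shots) < p_corr)[:, None] & np.isin(np.arange(n), [0, 5])
errors = independent ^ shared  # XOR: flip sources combine

# Marginal (per-qubit) error rate, analogous to a local noise estimate
rates = errors.mean(axis=0)

# Covariance between error indicators reveals which qubit pairs err together
cov = np.cov(errors.T.astype(float))

print(rates)          # qubits 0 and 5 show an elevated rate
print(cov[0, 5])      # clearly positive: a correlated pair
print(cov[0, 1])      # near zero: uncorrelated pair
```

The point of the sketch is that purely local error rates (`rates`) cannot distinguish independent noise from correlated noise, whereas pairwise statistics (`cov`) expose the hidden correlation between qubits 0 and 5; the paper's contribution is doing this kind of characterisation rigorously and at scale on real quantum hardware.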
“The results are the first implementation of provably rigorous and scalable diagnostic algorithms capable of being run on current quantum devices and beyond,” Dr Harper said, adding that this will likely be the foundation on which the next generation of quantum devices is built.
The importance of finding correlations in data cannot be overstated, a fact the developers at Tower know very well. It is fascinating to see how finding correlations in order to sift out bad data, or noise, scales down to even the quantum level.
Read the full paper here: https://arxiv.org/abs/1907.13022