A study led by the University of Oxford has used the power of machine learning to overcome a key challenge affecting quantum devices. For the first time, the findings reveal a way to close the ‘reality gap’: the difference between the predicted and observed behavior of quantum devices. The results have been published in Physical Review X.
Quantum computing could supercharge a wealth of applications, from climate modeling and financial forecasting to drug discovery and artificial intelligence. But this will require effective ways to scale and combine individual quantum devices (also called qubits). A major barrier to this is inherent variability, where even apparently identical units exhibit different behaviors.
The cause of variability in quantum devices