Publication
APS March Meeting 2024
Talk

Demonstration of a Quantum Machine Learning Algorithm beyond the Coherence Time

Abstract

The practical implementation of many known quantum algorithms is believed to be limited by the coherence time of the executing quantum system and by quantum sampling noise; running such algorithms beyond the coherence time of the constituent physical qubits is expected to require Quantum Error Correction. Here we introduce a machine learning algorithm, NISQRC, that can be trained to reliably produce inference on signals of arbitrary length, not limited by the depth of the circuit. NISQRC achieves this by employing a spatio-temporal encoding for classical data of arbitrary length and by striking a balance between mid-circuit measurements and temporal encoding, endowing the quantum system with a fading memory of appropriate length to capture the time-domain correlations in the streaming data. Because NISQRC is trained through a fast, convex classical optimization scheme, the methodology is not subject to difficulties arising from barren plateaus. The practicability of our approach is demonstrated by experimentally implementing an online machine learning task on a time-dependent signal produced by a room-temperature signal generator and input over a period longer than the average T_1 time of the qubits composing the circuit, limited only by the classical memory buffer of the signal acquisition and processing system. We provide a performance analysis comparing our approach to the state of the art. This research was developed with funding from DARPA contract HR00112190072, AFOSR award FA9550-20-1-0177, and AFOSR MURI award FA9550-22-1-0203. The views, opinions, and findings expressed are solely those of the authors and do not represent those of the U.S. government. The authors acknowledge the use of IBM Quantum services for this work.
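
The sketch below illustrates the kind of convex readout training the abstract refers to, under the assumption (standard in reservoir computing, which the "RC" in NISQRC suggests) that only a linear readout is trained, via ridge regression, on features derived from mid-circuit measurements. The quantum reservoir itself is abstracted into a placeholder function; all names (quantum_reservoir_features, WINDOW, lam, the toy signal and target) are illustrative and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES, WINDOW = 16, 8
# Fixed random nonlinear map standing in for the (fixed) quantum reservoir dynamics.
_PROJ = rng.standard_normal((N_FEATURES, WINDOW))

def quantum_reservoir_features(signal_window):
    """Placeholder for mid-circuit-measurement statistics of one signal window.

    In the actual experiment these would be expectation values estimated from
    mid-circuit measurements while the signal is temporally encoded into the
    circuit; here a fixed random map makes the sketch runnable end to end.
    """
    return np.tanh(_PROJ @ signal_window)

# Toy streaming signal and target: sliding windows feed the reservoir features.
T = 500
signal = np.sin(0.3 * np.arange(T)) + 0.1 * rng.standard_normal(T)
target = (signal > 0).astype(float)  # toy online classification target

X = np.stack([quantum_reservoir_features(signal[t - WINDOW:t]) for t in range(WINDOW, T)])
y = target[WINDOW:T]

# Convex readout training: closed-form ridge regression, so no barren plateaus arise.
lam = 1e-2
W = np.linalg.solve(X.T @ X + lam * np.eye(N_FEATURES), X.T @ y)

# Streaming inference: apply the fixed linear readout to each new feature vector.
pred = (X @ W > 0.5).astype(float)
print("training accuracy:", (pred == y).mean())
```

Because the readout weights have a closed-form solution, training cost is independent of circuit depth or signal length; only the feature dimension matters, which is consistent with the abstract's claim that inference is not limited by the depth of the circuit.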