Attention-based quantum tomography

Peter Cha, Paul Ginsparg, Felix Wu, Juan Carrasquilla, Peter L. McMahon, Eun-Ah Kim

Research output: Contribution to journal › Article › peer-review

9 Scopus citations

Abstract

With rapid progress across platforms for quantum systems, reconstructing many-body quantum states from noisy measurement data has become an important challenge. There has been growing interest in approaching quantum state reconstruction with generative neural network models. Here we propose ‘attention-based quantum tomography’ (AQT), a quantum state reconstruction scheme that uses an attention-based generative network to learn the mixed-state density matrix of a noisy quantum state. AQT is based on the model proposed in ‘Attention is all you need’ by Vaswani et al (2017 NIPS), which was designed to learn long-range correlations in natural-language sentences and thereby outperform previous natural language processing (NLP) models. We demonstrate not only that AQT outperforms earlier neural-network-based quantum state reconstruction on identical tasks, but also that AQT can accurately reconstruct the density matrix associated with a noisy quantum state experimentally realized on an IBMQ quantum computer. We speculate that the success of AQT stems from its ability to model quantum entanglement across the entire quantum system, much as the attention model for NLP captures correlations among words in a sentence.
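To make the abstract's analogy concrete, the sketch below implements the scaled dot-product self-attention of Vaswani et al, in which every position in a sequence attends to every other position; this all-to-all structure is what the abstract credits with capturing correlations across the whole system. The toy 4-site input, random projection matrices, and all variable names are illustrative assumptions, not the authors' AQT implementation.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V  (Vaswani et al., 2017)
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (n, n): every position scored against every other
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V, weights

# Hypothetical toy input: embeddings for a 4-site measurement string, d_model = 8.
rng = np.random.default_rng(0)
n, d = 4, 8
x = rng.normal(size=(n, d))

# One self-attention head; random projections stand in for learned weights.
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out, attn = scaled_dot_product_attention(x @ Wq, x @ Wk, x @ Wv)

print(out.shape)   # (4, 8): updated embedding per position
print(attn.shape)  # (4, 4): each position attends to all others
```

The (n, n) attention matrix is dense, so no pair of sites is a priori decoupled; this is the mechanism the abstract likens to modeling entanglement across the entire quantum system.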

Original language: English
Article number: 01LT01
Journal: Machine Learning: Science and Technology
Volume: 3
Issue number: 1
DOIs
State: Published - Mar 2022

Keywords

  • Deep learning
  • IBMQ
  • Quantum state tomography
  • Transformer
