The use of machine learning to understand and drive discoveries in quantum physics, and to control quantum technologies, has expanded tremendously in recent years. Groundbreaking advances in both the development and deployment of novel algorithms are accelerating discoveries not only in the theoretical understanding of quantum physical phenomena, but also in our ability to harness their exotic properties in concrete experiments. For example, machine learning models have been successfully implemented to achieve new and alternative techniques for quantum state representation that are ushering in new methods to investigate the complexity of quantum systems. Machine learning is also becoming increasingly prevalent as a tool to harness the potential of quantum technologies and to optimize experimental protocols that perform better measurements with fewer resources, improve the quality of low-resolution images, or recognize patterns in noisy data. The power of machine learning algorithms – and in particular neural networks – as universal approximators is also being heavily exploited for the classification of ordered and topological phases of matter.

In the so-called “noisy intermediate-scale quantum” era, quantum computers are affected by various detrimental effects, ranging from decoherence to spurious couplings, that limit the potential of such quantum technologies. Machine learning is allowing us to design control strategies that cope with these limitations by implementing quantum gates with higher fidelity and shorter run times. At the same time, the rapid development of quantum processors and quantum computing is enabling the design and testing of machine learning algorithms that use quantum devices as their fundamental building blocks. These quantum machine learning architectures are poised to harness the power of quantum mechanics to achieve even higher performance.
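The phase-classification idea mentioned above can be made concrete with a minimal sketch. Everything below is an illustrative assumption rather than a method described in this track: synthetic “ordered” and “disordered” samples are summarized by a single Ising-like magnetization feature, and a logistic regression (the simplest neural-network-style classifier) is trained by plain gradient descent to separate them.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data (assumed, for illustration): mean |magnetization| of samples.
# Ordered phase (label 1) clusters near |m| ~ 0.9,
# disordered phase (label 0) near |m| ~ 0.1.
m_ordered = rng.normal(0.9, 0.05, size=200)
m_disordered = rng.normal(0.1, 0.05, size=200)
X = np.concatenate([m_ordered, m_disordered])
y = np.concatenate([np.ones(200), np.zeros(200)])

# Logistic-regression classifier trained with plain gradient descent.
w, bias = 0.0, 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(w * X + bias)))   # sigmoid prediction
    w -= 0.5 * np.mean((p - y) * X)             # gradient of cross-entropy
    bias -= 0.5 * np.mean(p - y)

pred = (1.0 / (1.0 + np.exp(-(w * X + bias))) > 0.5).astype(float)
accuracy = np.mean(pred == y)
print(accuracy)  # near-perfect on this well-separated toy set
```

Real phase-classification studies replace the single hand-picked feature with raw spin configurations and a deep network, but the training loop has the same shape.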
Additionally, there is an ongoing and increasing interdisciplinary cross-fertilization between physicists and machine learning scientists, in both academic and industrial contexts.
This track will delve into the connections between quantum physics and machine learning. In particular, the track will focus on four sub-sessions, covering:
- Detection and classification of phases and phase transitions.
- Quantum data representation, reconstruction and generation.
- Protocol optimization.
- Denoising and error correction.
These topics address key questions of how machine learning can facilitate the discovery and analysis of new states of matter, the compression of quantum information, the optimization of experiments, and the generation of higher-quality quantum data.
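As a concrete illustration of the “quantum data representation” topic, the sketch below implements a restricted-Boltzmann-machine (RBM) wavefunction ansatz, a widely used neural-network quantum state. The system size, hidden-layer width, random weights, and brute-force enumeration are all illustrative assumptions for a toy system; in practice the weights would be optimized variationally and the configurations sampled by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(0)

# RBM ansatz for an N-spin wavefunction:
#   psi(s) ∝ exp(sum_i a_i s_i) * prod_j 2 cosh(b_j + sum_i W_ji s_i)
N, M = 4, 8                                 # visible spins, hidden units (toy sizes)
a = rng.normal(scale=0.1, size=N)           # visible biases
b = rng.normal(scale=0.1, size=M)           # hidden biases
W = rng.normal(scale=0.1, size=(M, N))      # couplings

def rbm_amplitude(s):
    """Unnormalized amplitude for a spin configuration s in {-1, +1}^N."""
    theta = b + W @ s
    return np.exp(a @ s) * np.prod(2 * np.cosh(theta))

# Enumerate all 2^N configurations (feasible only for small N; this is
# exactly the exponential Hilbert-space growth the ansatz compresses).
configs = np.array([[1 if (k >> i) & 1 else -1 for i in range(N)]
                    for k in range(2 ** N)])
amps = np.array([rbm_amplitude(s) for s in configs])
probs = amps ** 2 / np.sum(amps ** 2)       # normalized Born probabilities
print(probs.sum())                          # → 1.0 (up to floating point)
```

The point of the representation is that the N·M + N + M parameters can capture correlations of a state whose explicit vector has 2^N entries.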
We are currently experiencing a transformative period in which machine learning and quantum physics are developing a symbiotic relationship. On the one hand, the computational power granted by machine learning is helping physicists tackle previously intractable problems by optimizing the design, control, and information extraction of experiments and by speeding up computationally intensive tasks. On the other hand, theoretical tools from physics are inspiring new approaches to constructing and interpreting machine learning models and their learning dynamics, while the development of near-term quantum devices is the breeding ground for new concepts of machine learning that operate with the laws of quantum mechanics.
Therefore, the objective of this track is to explore how we can learn more about quantum physics by using machine learning as an optimization tool, but also how to construct better-performing algorithms inspired by the laws of quantum and statistical physics. Some of the concrete questions that we would like to address are: How can we construct efficient algorithms for the rapid and reliable identification of physical properties of quantum systems, e.g., from experimental measurements? How can we employ machine learning to effectively represent, store, and manipulate the data contained in the exponentially large Hilbert space of quantum systems? How can we inform the design and control of protocols for the extraction and manipulation of quantum data with machine learning? How can we apply machine learning to reliably discern spurious from relevant information and so enhance the signal-to-noise ratio of quantum data?