Abstract:
The essence of scientific research is creation. Generative artificial intelligence (AI) has opened up a vast space of imagination for more creative scientific research. As the core of generative AI, generative models learn the underlying probability distribution of data and then sample from it to generate new instances. Generative models and statistical physics are essentially two sides of the same coin. This article introduces modern generative models, including diffusion models, autoregressive models, flow models, and variational autoencoders, from a physics perspective. Generative models show tremendous potential for the generation and design of materials at the atomic scale. Moreover, owing to their inherent connection with statistical physics, they hold a unique advantage in optimizing Nature's cost function, the variational free energy, thereby offering new possibilities for solving hard problems in statistical physics and quantum many-body systems. At the same time, physical insights are also driving the development and innovation of generative models: by drawing inspiration from physical principles and methods, more efficient and unified generative models can be designed to address challenges in the field of AI.