We introduce a machine learning model based on flow matching to overcome the limitations of Monte Carlo (MC) sampling methods. We demonstrate its capability on the 2D XY model, where a single network, trained on configurations from a small (32×32) lattice at only a sparse set of temperatures, generates high-fidelity samples for both much larger systems (>128×128) and a continuous temperature range without retraining. The generated configurations agree well with key thermodynamic observables and exhibit the signatures of the Berezinskii-Kosterlitz-Thouless (BKT) transition. This dual generalization is possible because the flow matching framework learns a continuous, temperature-conditioned mapping, while the inductive biases of our U-Net architecture ensure that the learned local physical rules are scale-invariant. By pairing these components through operator fusion, our approach achieves superior sampling efficiency and computational speed on large lattices compared to highly optimized, GPU-accelerated MCMC algorithms. Our approach thus establishes a robust method for studying critical phenomena in the thermodynamic limit and can be readily applied to other classical and quantum many-body systems.
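To make the flow matching idea concrete, the sketch below shows the standard training objective and ODE-based sampling loop in a deliberately minimal 1D toy. Everything here is an illustrative assumption, not the talk's actual method: a linear interpolation path x_t = (1 - t)·x0 + t·x1 with target velocity u = x1 - x0, a Gaussian toy "data" distribution whose width plays the role of temperature, and a one-parameter linear velocity model standing in for the temperature-conditioned U-Net.

```python
import math
import random

random.seed(0)

# Toy flow matching in 1D (a sketch, not the talk's implementation).
# Path: x_t = (1 - t)*x0 + t*x1; regression target: u = x1 - x0.

def training_pairs(n, temperature):
    """Hypothetical stand-in: 'data' samples are Gaussian with
    std sqrt(temperature), mimicking a temperature-conditioned target."""
    pairs = []
    for _ in range(n):
        x1 = random.gauss(0.0, math.sqrt(temperature))  # toy data sample
        x0 = random.gauss(0.0, 1.0)                     # base noise sample
        t = random.random()                             # random flow time
        xt = (1 - t) * x0 + t * x1                      # point on the path
        u = x1 - x0                                     # target velocity
        pairs.append((xt, u))
    return pairs

# "Train": the velocity model is v(x) = w*x, so the least-squares fit
# of u on x_t has the closed form w = <x_t u> / <x_t^2>.
pairs = training_pairs(20000, temperature=0.5)
num = sum(x * u for x, u in pairs)
den = sum(x * x for x, u in pairs)
w = num / den

# "Sample": integrate dx/dt = v(x) with forward Euler, starting from
# base noise, to transport noise samples toward the data distribution.
def generate(n, steps=50):
    out = []
    for _ in range(n):
        x = random.gauss(0.0, 1.0)
        for _ in range(steps):
            x += (w * x) / steps
        out.append(x)
    return out

samples = generate(2000)
std = math.sqrt(sum(x * x for x in samples) / len(samples))
print(round(std, 2))
```

In the talk's setting, the same two ingredients appear at scale: the regression target u is predicted by a U-Net conditioned on temperature, and sampling integrates the learned velocity field over the lattice, which is what allows one network to cover a continuous temperature range and larger system sizes.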