Score-based generative models (SGMs), which include diffusion models and flow matching, have had a transformative impact on the field of generative modeling. In a nutshell, the key idea is that by taking the time-reversal of a forward ergodic diffusion process initiated at the data distribution, one can "generate data from noise." In practice, SGMs learn an approximation of the score function of the forward process and use it in an Euler discretization of the time-reversed dynamics.
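As a concrete reference point (for the standard Ornstein-Uhlenbeck noising dynamics, not the more general setting of the talk), the forward process, its time reversal, and the resulting Euler scheme can be sketched as follows, where $s_\theta$ denotes the learned score:

```latex
% Forward OU process, started at the data distribution:
%   dX_t = -X_t \, dt + \sqrt{2} \, dB_t, \qquad X_0 \sim p_{\mathrm{data}},
% with p_t the law of X_t. By Anderson's time-reversal formula, the reversed
% process Y_t (run from Y_0 \sim p_T \approx \mathcal{N}(0, I)) satisfies
%   dY_t = \bigl( Y_t + 2 \nabla \log p_{T-t}(Y_t) \bigr) dt + \sqrt{2} \, dW_t.
% Replacing the unknown score \nabla \log p_t by its approximation s_\theta
% and discretizing with step size h gives the Euler scheme
%   Y_{k+1} = Y_k + h \bigl( Y_k + 2\, s_\theta(T - t_k, Y_k) \bigr)
%             + \sqrt{2h} \, \xi_k, \qquad \xi_k \sim \mathcal{N}(0, I).
```

The error bounds discussed in the talk control the discrepancy between the law of the scheme's output and $p_{\mathrm{data}}$, accounting for the score approximation, the discretization, and the initialization at Gaussian noise.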
In this talk, I will present the main ideas of a general strategy that combines insights from stochastic control and entropic optimal transport to bound the error in SGMs, that is, the distance between the algorithm's output and the target distribution. A nice feature of this approach is its robustness: it can be used to analyse SGMs built upon noising dynamics that differ from the Ornstein-Uhlenbeck process. As an example, I will illustrate how to obtain error bounds for SGMs on the hypercube.