Conditional Divergence Matching for Improving Flow Matching Generative Modeling
Mathematics of Data & Decisions
Speaker: | Bao Wang |
Location: | 1025 PDSB |
Start time: | Tue, May 6 2025, 3:10PM |
Conditional flow matching (CFM) stands out as an efficient, simulation-free approach for training flow-based generative models, achieving remarkable performance in data generation. However, the CFM objective alone does not guarantee that the probability path is learned accurately. In this work, we introduce a new partial differential equation characterization of the error between the learned and exact probability paths, along with its solution. We show that the total variation between the probability paths is bounded above by a combination of the CFM loss and an associated divergence loss. This theoretical insight leads to a new objective function that simultaneously matches the flow and its divergence. Our approach improves the performance of flow-based generative models by a noticeable margin without sacrificing generation efficiency.
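The combined objective described above can be sketched in a few lines. The following is a minimal toy illustration, not the speaker's implementation: it assumes the standard linear conditional path x_t = (1 - t) x0 + t x1 with target velocity u_t = x1 - x0 (whose divergence in x is zero), a hypothetical linear velocity model v(x, t) = A x + t b + c so the model's divergence is simply trace(A), and a weight `lam` on the divergence-matching term.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 2, 256

# Toy data: x0 is source noise, x1 plays the role of target samples.
x0 = rng.standard_normal((n, d))
x1 = rng.standard_normal((n, d)) + 3.0

# Hypothetical linear velocity model v(x, t) = x @ A.T + t * b + c.
A = 0.1 * rng.standard_normal((d, d))
b = 0.1 * rng.standard_normal(d)
c = 0.1 * rng.standard_normal(d)

def combined_loss(A, b, c, lam=1.0):
    """CFM loss plus a divergence-matching penalty (sketch).

    Conditional path: x_t = (1 - t) * x0 + t * x1, with target
    conditional velocity u_t(x | x0, x1) = x1 - x0. Since u_t does
    not depend on x, its divergence in x is zero; for the linear
    model, div_x v = trace(A).
    """
    t = rng.uniform(size=(n, 1))
    xt = (1 - t) * x0 + t * x1
    u = x1 - x0                         # target conditional velocity
    v = xt @ A.T + t * b + c            # model velocity
    cfm = np.mean(np.sum((v - u) ** 2, axis=1))
    div_loss = (np.trace(A) - 0.0) ** 2  # match divergence of the target
    return cfm + lam * div_loss

print(combined_loss(A, b, c))
```

For a general neural velocity field, trace(A) would be replaced by an autodiff divergence (or a Hutchinson trace estimator), and the target divergence need not vanish; the zero-divergence target here is specific to this toy linear path.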