Presentation Information

[2N6-GS-2x-01]Generative Modeling of Graph-Structured Data via Latent Flow Matching

〇Sho Kuno1, Kengo Nakajima2,3 (1. Graduate School of Information Science and Technology, The University of Tokyo, 2. Information Technology Center, The University of Tokyo, 3. RIKEN R-CCS)
[[online]]

Keywords:

Non-Euclidean Generative Modeling, Flow Matching

Graph foundation models require generators that are permutation-consistent, scalable, and useful beyond unconditional sampling. Latent Graph Diffusion (LGD) (Zhou et al., NeurIPS'24) maps graphs to a Euclidean latent space and unifies generation and supervised prediction via conditional completion, but its iterative diffusion sampler is slow. We propose Latent Graph Flow Matching (LGFM), replacing LGD's latent diffusion sampler with continuous-time flow matching and enabling sampling via a single ODE solve in latent graph space. To our knowledge, LGFM is the first approach to bring LGD's unified conditional-generation interface to a flow-matching formulation. On QM9 unconditional molecule generation, LGFM achieves ~94% validity with competitive distributional metrics while improving throughput by 7.8× (35 graphs/s vs. 4.5 for LGD-small). We further demonstrate the same conditional-generation interface on molecular property regression by generating masked graph-level labels.
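The core mechanism — training a time-conditioned vector field with the flow-matching objective and sampling by a single ODE solve from noise to graph latents — can be sketched as follows. This is a minimal illustration under assumed components, not the paper's implementation: the graph encoder/decoder and the actual vector-field architecture are omitted, `VectorField` is a stand-in MLP, and the latent dimension is a toy value.

```python
import torch
import torch.nn as nn

class VectorField(nn.Module):
    """Hypothetical stand-in for the latent vector field (the paper's
    architecture is not specified here); conditions on time t by
    concatenating it to the latent."""
    def __init__(self, dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, z: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([z, t], dim=-1))

def fm_loss(model: VectorField, z1: torch.Tensor) -> torch.Tensor:
    """Conditional flow-matching loss with the linear interpolation path
    z_t = (1 - t) z0 + t z1, whose target velocity is z1 - z0.
    Here z1 are encoded graph latents and z0 is Gaussian noise."""
    z0 = torch.randn_like(z1)
    t = torch.rand(z1.size(0), 1)
    zt = (1 - t) * z0 + t * z1
    return ((model(zt, t) - (z1 - z0)) ** 2).mean()

@torch.no_grad()
def sample(model: VectorField, n: int, dim: int, steps: int = 50) -> torch.Tensor:
    """Single ODE solve (forward Euler) from noise at t=0 to graph
    latents at t=1; the graph decoder (not shown) maps these to graphs."""
    z = torch.randn(n, dim)
    dt = 1.0 / steps
    for i in range(steps):
        t = torch.full((n, 1), i * dt)
        z = z + dt * model(z, t)
    return z
```

Conditional completion would follow the same loop with the observed part of the latent clamped to its known value at every step; only the masked entries (e.g. a graph-level label) are transported by the ODE.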