Publications

You can also find my articles on my Google Scholar profile.

Conference Papers
Constrained Diffusion Models via Dual Training

Published in NeurIPS, 2024

Diffusion models are prone to generating biased data that reflects their training dataset. To address this issue, we develop constrained diffusion models by imposing diffusion constraints based on desired distributions informed by requirements. We show that our constrained diffusion models generate new data from a mixture data distribution that achieves the optimal trade-off between the objective and the constraints. To train constrained diffusion models, we develop a dual training algorithm and characterize the optimality of the trained constrained diffusion model. In experiments, these models effectively ensure fair sampling from underrepresented classes and prevent overfitting when adapting to new datasets.
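The dual training idea can be illustrated on a toy problem: minimize an objective under a constraint by alternating a primal gradient step with a dual ascent step on a Lagrange multiplier. The sketch below is a hand-rolled primal-dual loop on a quadratic objective, not the paper's diffusion training algorithm; the function name and step sizes are illustrative assumptions.

```python
import numpy as np

# Toy primal-dual (dual ascent) loop: minimize ||x - a||^2
# subject to x[0] + x[1] <= b. The Lagrange multiplier `lam`
# penalizes constraint violation; its dual update is projected
# onto lam >= 0 (inequality constraint).
def dual_train(a, b, lr=0.05, lr_dual=0.05, iters=10_000):
    x = np.zeros_like(a)
    lam = 0.0  # Lagrange multiplier
    for _ in range(iters):
        # Primal step: gradient descent on the Lagrangian in x
        grad_x = 2.0 * (x - a) + lam * np.ones_like(x)
        x = x - lr * grad_x
        # Dual step: gradient ascent on lam, clipped at zero
        violation = x.sum() - b
        lam = max(0.0, lam + lr_dual * violation)
    return x, lam

# The unconstrained minimum (2, 2) violates x0 + x1 <= 1; the
# primal-dual iterates converge to the constrained optimum
# (0.5, 0.5), with the multiplier settling near lam = 3.
x_opt, lam_opt = dual_train(np.array([2.0, 2.0]), b=1.0)
```

The same alternating structure underlies constrained training of richer models: the primal step updates model parameters on the Lagrangian, and the dual step tightens or relaxes each constraint's multiplier according to its violation.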

Download Paper

Neural Tangent Kernels Motivate Cross-Covariance Graphs in Neural Networks

Published in ICML, 2024

Neural tangent kernels (NTKs) provide a theoretical framework for analyzing the learning and generalization behavior of over-parametrized neural networks. By leveraging the structure of NTKs for graph neural networks (GNNs), we investigate the alignment between the NTK and the learning target, and our analysis reveals that optimizing this alignment translates to optimizing the graph representation in a GNN. Our results further establish theoretical guarantees on the optimality of the alignment for a two-layer GNN, where the optimal graph representation is a function of the cross-covariance between the input and the output data. The theoretical insights drawn from the analysis of NTKs are validated by our experiments on a multivariate time series prediction task.
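Two quantities from the abstract are easy to state concretely: kernel-target alignment (the normalized Frobenius inner product between a kernel matrix and the label outer product) and the empirical cross-covariance between input and output data. The numpy sketch below uses a linear kernel as a stand-in for the NTK; the function names and data are illustrative assumptions, not the paper's code.

```python
import numpy as np

def alignment(K, y):
    """Kernel-target alignment A(K, yy^T) in [-1, 1]: the cosine
    between K and the label outer product under the Frobenius
    inner product."""
    Y = np.outer(y, y)
    return (K * Y).sum() / (np.linalg.norm(K) * np.linalg.norm(Y))

def cross_covariance(X, Y):
    """Empirical cross-covariance between inputs X (n x d) and
    outputs Y (n x m); a candidate graph shift operator."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    return Xc.T @ Yc / X.shape[0]

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
y = np.sign(X[:, 0])   # target driven by the first input feature
K = X @ X.T            # linear kernel as a stand-in for the NTK

# A kernel that matches the label outer product attains
# alignment 1; a generic kernel lands somewhere in [-1, 1].
perfect = alignment(np.outer(y, y), y)   # ~ 1.0
observed = alignment(K, y)
```

Maximizing such an alignment over the choice of graph is the lens the analysis takes: the graph enters the GNN's NTK, so tuning the graph tunes the kernel toward the target.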

Download Paper