Research Projects
Convergence Guarantees of Diffusion Models
Joint work with Kexin Fu and Farzan Farnia. Oct. 2023 - Present.
Preprint: On the Mode-Seeking Properties of Langevin Dynamics
- Formulated a family of high-dimensional multi-modal distributions with an inter-mode component
- Theoretically studied the mode-seeking tendencies of Langevin dynamics (illustrated in the sketch below)
- Proposed Chained Langevin Dynamics and established a provable convergence guarantee
- Validated the theoretical analysis through numerical experiments [github]
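For intuition, here is a minimal sketch of vanilla (unadjusted) Langevin dynamics on an equal-weight Gaussian mixture, the kind of sampler whose mode-seeking behaviour is studied above. It is illustrative only and is not the proposed Chained Langevin Dynamics; the step size, mode locations, and dimension are arbitrary placeholders.

```python
import numpy as np

def grad_log_mixture(x, mus, sigma=1.0):
    """Score (gradient of log-density) of an equal-weight isotropic Gaussian mixture."""
    diffs = mus - x                                    # (K, d) differences mu_k - x
    logw = -0.5 * np.sum(diffs**2, axis=1) / sigma**2  # unnormalized log responsibilities
    w = np.exp(logw - logw.max())
    w /= w.sum()                                       # responsibilities of each mode at x
    return (w[:, None] * diffs).sum(axis=0) / sigma**2

def langevin(mus, eta=1e-2, n_steps=5000, seed=0):
    """Unadjusted Langevin dynamics: x <- x + eta * score(x) + sqrt(2 * eta) * noise."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(mus.shape[1])
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + eta * grad_log_mixture(x, mus) + np.sqrt(2 * eta) * noise
    return x

# Two well-separated modes in 10 dimensions; a single chain typically settles near one mode.
modes = np.array([[5.0] + [0.0] * 9, [-5.0] + [0.0] * 9])
print(langevin(modes))
```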
Stability and Generalization of Min-Max Optimization
Joint work with Kexin Fu and Farzan Farnia. Sep. 2022 - Oct. 2023.
Paper: Stability and Generalization in Free Adversarial Training
Journal: Transactions on Machine Learning Research (TMLR), 2024.
- Analyzed the algorithmic stability of the free, fast, and vanilla adversarial training algorithms (the free scheme is sketched below)
- Provided a theoretical comparison of the generalization properties of these algorithms
- Proposed Free-TRADES with improved generalization performance
- Validated the theoretical analysis through numerical experiments [github]
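Below is a minimal PyTorch-style sketch of the "free" adversarial training inner loop analysed in the paper, following the standard minibatch-replay scheme; `model`, `loader`, `optimizer`, `eps`, and `replays` are placeholders, and details such as input clamping and learning-rate schedules are omitted.

```python
import torch
import torch.nn.functional as F

def free_adv_train_epoch(model, loader, optimizer, eps=8 / 255, replays=4):
    """One epoch of free adversarial training: every minibatch is replayed
    `replays` times, and each backward pass is reused to update both the model
    parameters (descent step) and the adversarial perturbation delta (sign ascent)."""
    delta = None
    for x, y in loader:
        if delta is None or delta.shape != x.shape:
            delta = torch.zeros_like(x)          # perturbation carried across batches
        for _ in range(replays):
            delta.requires_grad_(True)
            loss = F.cross_entropy(model(x + delta), y)
            optimizer.zero_grad()
            loss.backward()                      # single backward pass per replay
            optimizer.step()                     # model update
            with torch.no_grad():                # perturbation update from the same gradients
                delta = (delta + eps * delta.grad.sign()).clamp(-eps, eps).detach()
    return model
```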
Learning and Testing Dependent Discrete Distributions
Joint work with Haitong Liu and Siu On Chan. June 2022 - May 2023.
Graduation Project Report: On the Lower Bounds for Learning and Testing Markov Chains
- Analyzed the problem of learning and testing dependent distributions from Markovian trajectories (the baseline empirical estimator is sketched below)
- Proved a lower bound on the sample complexity of learning Markov chains
- Demonstrated the order-wise optimality of the k-cover-time algorithm
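As a point of reference for the sample-complexity results, here is a minimal sketch of the empirical (maximum-likelihood) transition-matrix estimator computed from a single trajectory; the state count `k` and the handling of unvisited states are illustrative choices, not part of the report.

```python
import numpy as np

def estimate_transition_matrix(trajectory, k):
    """Empirical (maximum-likelihood) estimate of a k-state transition matrix
    from one Markovian trajectory x_0, x_1, ..., x_n of integer states."""
    counts = np.zeros((k, k))
    for s, t in zip(trajectory[:-1], trajectory[1:]):
        counts[s, t] += 1                        # count observed transitions s -> t
    row_sums = counts.sum(axis=1, keepdims=True)
    # unvisited states get a uniform row so the output is a valid stochastic matrix
    return np.where(row_sums > 0, counts / np.maximum(row_sums, 1), 1.0 / k)

# Toy usage: a lazy random walk on 3 states.
rng = np.random.default_rng(0)
P = np.array([[0.8, 0.1, 0.1], [0.1, 0.8, 0.1], [0.1, 0.1, 0.8]])
traj = [0]
for _ in range(10_000):
    traj.append(rng.choice(3, p=P[traj[-1]]))
print(estimate_transition_matrix(traj, k=3))
```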
Iterative Methods for Non-Convex Optimization
Joint work with Sijin Chen and Anthony Man-Cho So. June 2021 - Jan. 2022.
Paper: Non-Convex Joint Community Detection and Group Synchronization via Generalized Power Method
Conference: International Conference on Artificial Intelligence and Statistics (AISTATS), 2024.
- Analyzed the non-convex optimization problem of joint group synchronization and community detection
- Proposed a generalized power method (GPM) with spectral initialization (the projected power iteration template is sketched below)
- Established a linear convergence guarantee for GPM and an error bound for the spectral initialization
- Demonstrated its significantly lower computational complexity compared with previous semidefinite relaxation approaches
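To illustrate the project-and-multiply template behind GPM, here is a minimal sketch on the simpler Z2 synchronization problem, where the projection reduces to an entrywise sign; the actual joint community detection and synchronization problem uses a different feasible set and projection, as described in the paper.

```python
import numpy as np

def gpm_z2_sync(A, n_iters=50):
    """Generalized power method template, illustrated on Z2 synchronization:
    spectral initialization, then repeat 'multiply by the data matrix A and
    project onto the feasible set' (here, entrywise signs)."""
    eigvals, eigvecs = np.linalg.eigh(A)
    x = np.sign(eigvecs[:, -1])        # spectral initialization from the top eigenvector
    x[x == 0] = 1.0
    for _ in range(n_iters):
        x = np.sign(A @ x)             # power step followed by projection
        x[x == 0] = 1.0
    return x
```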
Information-Theoretic Analysis of Group Testing
Joint work with Sidharth Jaggi and Qiaoqiao Zhou. June 2020 - March 2022.
Paper: Generalized Group Testing
Conference: International Conference on Artificial Intelligence and Statistics (AISTATS), 2022.
Journal: IEEE Transactions on Information Theory (TIT), March 2023.
- Unified different group testing problem settings including noisy and threshold group testing
- Proposed a non-adaptive probabilistic testing scheme with provable recovery guarantees (a generic non-adaptive pipeline is sketched below)
- Established novel information-theoretic lower bounds on the number of tests required
- Proved the order-wise optimality of the proposed scheme under noisy settings
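For context, here is a minimal sketch of a generic non-adaptive group testing pipeline: a Bernoulli test design plus COMP decoding under noiseless OR outcomes. The design density `p = 1/k`, the test budget, and the decoder are textbook illustrative choices, not the generalized scheme proposed in the paper.

```python
import numpy as np

def bernoulli_design(n_tests, n_items, p, seed=0):
    """Non-adaptive design: item j is placed in test i independently with probability p."""
    rng = np.random.default_rng(seed)
    return rng.random((n_tests, n_items)) < p

def comp_decode(design, outcomes):
    """COMP decoding: clear every item that appears in at least one negative test;
    declare all remaining items defective."""
    cleared = design[~outcomes].any(axis=0)
    return ~cleared

# Toy run: 5 defectives among 200 items, noiseless OR outcomes.
n_items, k = 200, 5
rng = np.random.default_rng(1)
defectives = np.zeros(n_items, dtype=bool)
defectives[rng.choice(n_items, size=k, replace=False)] = True
X = bernoulli_design(n_tests=150, n_items=n_items, p=1.0 / k)
y = (X & defectives).any(axis=1)       # a test is positive iff it contains a defective
print("misclassified items:", int((comp_decode(X, y) != defectives).sum()))
```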