Seminars
- FIELD
- AI and Natural Sciences
- DATE
- Oct 07 (Mon), 2024
- TIME
- 10:00 ~ 12:00
- PLACE
- ONLINE
- SPEAKER
- Jongha Ryu
- HOST
- Yoon, Sangwoong
- INSTITUTE
- MIT
- TITLE
- Operator SVD with Neural Networks via Nested Low-Rank Approximation
- ABSTRACT
- Spectral decomposition, whether eigenvalue decomposition (EVD) or singular value decomposition (SVD), is fundamental to problems across a wide range of fields. In statistics and machine learning, many correlation analysis and representation learning methods can be understood as the spectral decomposition of an operator that captures correlations or dependencies between random vectors. In scientific computing, solving certain classes of partial differential equations (PDEs), such as the time-independent Schrödinger equation, often reduces to finding the leading eigenfunctions of a differential operator. In this talk, I will introduce a parametric learning framework for spectral decomposition and present an efficient optimization method for training neural networks to learn the leading singular functions of a given linear operator. The proposed method is built on two key concepts: (1) low-rank approximation for subspace learning and (2) a technique called nesting for learning ordered singular functions. We demonstrate the efficacy of the proposed method in solving simple PDEs and in structured representation learning tasks. When restricted to finite-dimensional matrices, the proposed framework yields new streaming algorithms for principal component analysis and canonical correlation analysis that are significantly simpler and more computationally efficient than existing algorithms in the literature. This work was presented at ICML'24 and can be found online: https://openreview.net/forum?id=qESG5HaaoJ.
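For intuition, the sketch below illustrates the two ingredients named in the abstract, a low-rank approximation loss plus nesting, in the finite-dimensional special case where the operator is a symmetric positive semidefinite matrix and the ordered "singular functions" reduce to its leading eigenvectors. The loss form, variable names (e.g. nested_lora_grad), matrix sizes, and plain gradient descent are illustrative assumptions rather than the paper's released implementation; in the actual method the columns below would be outputs of a neural network evaluated on samples, and general operators are handled with separate left and right singular functions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, L = 8, 3

# Random symmetric PSD test matrix, normalized so plain gradient descent is stable.
B = rng.standard_normal((n, n))
A = B @ B.T
A /= np.linalg.norm(A, 2)          # spectral norm 1

def nested_lora_grad(F, A):
    """Gradient of sum_{l=1..L} ||A - F[:, :l] F[:, :l]^T||_F^2, i.e. the
    low-rank approximation loss applied to every prefix of the columns
    ("joint nesting"), for a symmetric PSD matrix A."""
    grad = np.zeros_like(F)
    for l in range(1, F.shape[1] + 1):
        Fl = F[:, :l]
        grad[:, :l] += -4.0 * (A - Fl @ Fl.T) @ Fl
    return grad

# Columns of F stand in for the neural network outputs of the operator setting.
F = 0.1 * rng.standard_normal((n, L))
lr, steps = 0.03, 10000
for _ in range(steps):             # full-batch gradient descent for simplicity
    F -= lr * nested_lora_grad(F, A)

# Nesting removes the rotational ambiguity of plain low-rank approximation:
# column i should converge to +/- sqrt(lambda_i) * u_i, already ordered,
# with no explicit orthogonalization step.
evals, evecs = np.linalg.eigh(A)
evals, evecs = evals[::-1], evecs[:, ::-1]      # sort descending
for i in range(L):
    est_val = np.linalg.norm(F[:, i]) ** 2
    align = abs((F[:, i] / np.linalg.norm(F[:, i])) @ evecs[:, i])
    print(f"component {i+1}: est eigenvalue {est_val:.4f}, true {evals[i]:.4f}, "
          f"|cos angle to u_{i+1}| = {align:.4f}")
```

In this restricted setting the recovered eigenvalue estimates and alignments can be checked directly against numpy.linalg.eigh; the streaming PCA/CCA algorithms mentioned in the abstract arise when the same loss is optimized from mini-batch estimates instead of the full matrix.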