Publications
- Title: Mirror Descent of Hopfield Model
- KIAS Author: Jo, Junghyo
- Journal: Neural Computation, 2023
- Archive:
- Abstract: Mirror descent is an elegant optimization technique that leverages a dual space of parametric models to perform gradient descent. While originally developed for convex optimization, it has increasingly been applied in machine learning. In this study, we propose a novel approach that uses mirror descent to initialize the parameters of neural networks. Specifically, taking the Hopfield model as a prototype for neural networks, we demonstrate that mirror descent can effectively train the model with significantly improved performance compared to traditional gradient descent methods that rely on random parameter initialization. Our findings highlight the potential of mirror descent as a promising initialization technique for enhancing the optimization of machine learning models.
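For orientation, the sketch below illustrates the generic mirror-descent update the abstract refers to: the parameters are mapped to a dual space through the gradient of a mirror map, a plain gradient step is taken there, and the result is mapped back. This is a minimal, assumed example (negative-entropy mirror map on a toy least-squares objective), not the paper's Hopfield-model implementation; the objective, function names, and step sizes are illustrative choices only.

```python
# Generic mirror descent vs. gradient descent on a toy problem.
# NOT the paper's method; mirror map, objective, and hyperparameters are assumptions.
import numpy as np

def toy_loss_grad(w, A, b):
    """Gradient of the toy quadratic loss 0.5 * ||A w - b||^2."""
    return A.T @ (A @ w - b)

def mirror_descent(w0, grad_fn, lr=0.01, steps=2000):
    """Mirror descent with the negative-entropy mirror map Phi(w) = sum_i w_i log w_i.
    Since grad Phi(w) = log(w) + 1, the update runs in the dual space:
        theta <- grad Phi(w) - lr * grad_loss(w),   w <- (grad Phi)^{-1}(theta),
    which is the familiar multiplicative (exponentiated-gradient) update."""
    w = w0.copy()
    for _ in range(steps):
        g = grad_fn(w)
        theta = np.log(w) + 1.0      # primal -> dual via grad Phi
        theta -= lr * g              # ordinary gradient step in the dual space
        w = np.exp(theta - 1.0)      # dual -> primal via (grad Phi)^{-1}
    return w

def gradient_descent(w0, grad_fn, lr=0.01, steps=2000):
    """Plain gradient descent, i.e. mirror descent with Phi(w) = 0.5 * ||w||^2."""
    w = w0.copy()
    for _ in range(steps):
        w = w - lr * grad_fn(w)
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.normal(size=(20, 5))
    w_true = np.abs(rng.normal(size=5))   # positive target, compatible with the entropy map
    b = A @ w_true
    grad_fn = lambda w: toy_loss_grad(w, A, b)
    w0 = np.full(5, 0.5)                  # positive initialization (the log requires w > 0)

    w_md = mirror_descent(w0, grad_fn)
    w_gd = gradient_descent(w0, grad_fn)
    print("mirror descent error  :", np.linalg.norm(w_md - w_true))
    print("gradient descent error:", np.linalg.norm(w_gd - w_true))
```

The choice of mirror map sets the geometry: the squared Euclidean norm recovers ordinary gradient descent, while the negative entropy used above keeps the iterates positive and updates them multiplicatively.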