Stochastic Regularized Majorization-Minimization (SRMM)
Hanbaek Lyu,
"Stochastic regularized block majorization-minimization with weakly convex and multi-convex surrogates" (arXiv 2023)
Stochastic majorization-minimization (SMM) is a class of stochastic optimization algorithms that proceed by sampling new data points and minimizing a recursive average of surrogate functions of an objective function. We propose an extension of SMM called Stochastic Regularized Majorization-Minimization (SRMM), where surrogates are allowed to be only weakly convex or block multi-convex, and the averaged surrogates are approximately minimized with proximal regularization or block-minimized within diminishing radii, respectively.
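As a sketch of the recursion described above (notation is my own, not taken verbatim from the paper): at step n a surrogate g_n of the loss at the new sample x_n is averaged into the running surrogate with weight w_n, and the proximally regularized average is minimized:

```latex
\bar{g}_n(\theta) = (1 - w_n)\,\bar{g}_{n-1}(\theta) + w_n\, g_n(\theta; x_n),
\qquad
\theta_n \in \operatorname*{arg\,min}_{\theta \in \Theta}
\left( \bar{g}_n(\theta) + \frac{\lambda}{2}\,\|\theta - \theta_{n-1}\|^2 \right).
```

Here w_n is a diminishing averaging weight (e.g. w_n = 1/n) and λ ≥ 0 is the proximal regularization parameter; see the paper for the precise conditions on w_n, λ, and the surrogate class.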
In this repository, we provide a special version of SRMM proposed in the reference above, where prox-linear surrogates with proximal regularization are used. The resulting algorithm is equivalent to the iterates in equation (23) of the paper.
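For intuition, here is a minimal toy sketch of the SRMM recursion on a one-dimensional streaming mean-estimation problem. It uses an exact quadratic surrogate g_n(t) = 0.5(t - x_n)^2 rather than the repository's prox-linear surrogate, and the function name and parameters are illustrative, not part of the repository's API:

```python
import random


def srmm_quadratic(data_stream, lam=1.0, theta0=0.0):
    """Toy SRMM iterate for minimizing E_x[0.5*(theta - x)^2].

    Each surrogate g_n(t) = 0.5*(t - x_n)^2 is averaged with weight
    w_n = 1/n; since all surrogates share the same curvature, the
    averaged surrogate's minimizer is just the running sample mean.
    The proximal term (lam/2)*(t - theta_{n-1})^2 damps each update.
    """
    theta = theta0
    xbar = 0.0  # minimizer of the averaged surrogate so far
    for n, x in enumerate(data_stream, start=1):
        w = 1.0 / n                    # diminishing averaging weight
        xbar = (1 - w) * xbar + w * x  # recursive surrogate average
        # argmin_t 0.5*(t - xbar)^2 + (lam/2)*(t - theta)^2
        theta = (xbar + lam * theta) / (1 + lam)
    return theta


if __name__ == "__main__":
    random.seed(0)
    data = [random.gauss(3.0, 1.0) for _ in range(5000)]
    print(srmm_quadratic(data))  # close to the true mean 3.0
```

With lam = 0 this reduces to plain SMM (minimize the averaged surrogate exactly); larger lam keeps successive iterates closer together, which is the role the proximal regularization plays for the weakly convex surrogates treated in the paper.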
Setting
- src/SRMM.py : main algorithm source file that implements the SRMM algorithm in (23)
- demos/cifar10 : contains scripts for generating the DenseNet and ResNet training/testing accuracy figures
- Hanbaek Lyu - Initial work - Website
This project is licensed under the MIT License - see the LICENSE.md file for details