Literature survey of convex optimizers and optimisation methods for deep learning; made especially for optimisation researchers with ❤️


Awesome Optimizers

This repository is conceived to aid optimization researchers in literature reviews by offering an up-to-date list of papers and corresponding summaries.

If this repository has been useful to you in your research, please cite it using the "Cite this repository" option available on GitHub. This repository would not have been possible without these open-source contributors. Thanks! 💖

Table of Contents

Legend

Symbol	Meaning
📤	Summary
💻	Code

Survey Papers

First-order Optimizers

Adaptive Optimizers

Adam Family of Optimizers

Second-order Optimizers

Other Optimisation-Related Research

General Improvements

Optimizer Analysis and Meta-research

Hyperparameter Tuning
