optimizers

Varuna Jayasiri
2020-12-14 09:33:07 +05:30
parent 5b1897b792
commit ef922321be
3 changed files with 9 additions and 0 deletions


@@ -50,7 +50,10 @@ and
 #### ✨ [Optimizers](https://lab-ml.com/labml_nn/optimizers/)
 * [Adam](https://lab-ml.com/labml_nn/optimizers/adam.html)
 * [AMSGrad](https://lab-ml.com/labml_nn/optimizers/amsgrad.html)
+* [Adam Optimizer with warmup](https://lab-ml.com/labml_nn/optimizers/adam_warmup.html)
+* [Noam Optimizer](https://lab-ml.com/labml_nn/optimizers/noam.html)
 * [Rectified Adam Optimizer](https://lab-ml.com/labml_nn/optimizers/radam.html)
+* [AdaBelief Optimizer](https://lab-ml.com/labml_nn/optimizers/adabelief.html)
 ### Installation


@@ -11,7 +11,10 @@ summary: >
 ## Optimizer Implementations
 * [Adam Optimizer](adam.html)
 * [AMSGrad Optimizer](amsgrad.html)
+* [Adam Optimizer with warmup](adam_warmup.html)
+* [Noam Optimizer](noam.html)
 * [Rectified Adam Optimizer](radam.html)
+* [AdaBelief Optimizer](adabelief.html)
 This [MNIST example](mnist_experiment.html) uses these optimizers.


@@ -49,7 +49,10 @@ and
 #### ✨ [Optimizers](https://lab-ml.com/labml_nn/optimizers/)
 * [Adam](https://lab-ml.com/labml_nn/optimizers/adam.html)
 * [AMSGrad](https://lab-ml.com/labml_nn/optimizers/amsgrad.html)
+* [Adam Optimizer with warmup](https://lab-ml.com/labml_nn/optimizers/adam_warmup.html)
+* [Noam Optimizer](https://lab-ml.com/labml_nn/optimizers/noam.html)
 * [Rectified Adam Optimizer](https://lab-ml.com/labml_nn/optimizers/radam.html)
+* [AdaBelief Optimizer](https://lab-ml.com/labml_nn/optimizers/adabelief.html)
 ### Installation