optimizers
@@ -50,7 +50,10 @@ and
 #### ✨ [Optimizers](https://lab-ml.com/labml_nn/optimizers/)
 
 * [Adam](https://lab-ml.com/labml_nn/optimizers/adam.html)
 * [AMSGrad](https://lab-ml.com/labml_nn/optimizers/amsgrad.html)
+* [Adam Optimizer with warmup](https://lab-ml.com/labml_nn/optimizers/adam_warmup.html)
+* [Noam Optimizer](https://lab-ml.com/labml_nn/optimizers/noam.html)
 * [Rectified Adam Optimizer](https://lab-ml.com/labml_nn/optimizers/radam.html)
+* [AdaBelief Optimizer](https://lab-ml.com/labml_nn/optimizers/adabelief.html)
 
 ### Installation
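Two of the links added here (warmup and Noam) cover learning-rate scheduling rather than a new update rule. For reference, a minimal sketch of the Noam schedule from "Attention Is All You Need" that the noam.html page documents; the function name and the defaults (the paper's 512 / 4000) are illustrative, not the labml_nn API:

```python
def noam_lr(step: int, d_model: int = 512, warmup: int = 4000) -> float:
    """Noam schedule: linear warmup for `warmup` steps, then
    inverse-square-root decay, scaled by d_model ** -0.5."""
    step = max(step, 1)  # guard against 0 ** -0.5 on the first step
    return d_model ** -0.5 * min(step ** -0.5, step * warmup ** -1.5)
```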
@@ -11,7 +11,10 @@ summary: >
 ## Optimizer Implementations
 
 * [Adam Optimizer](adam.html)
 * [AMSGrad Optimizer](amsgrad.html)
+* [Adam Optimizer with warmup](adam_warmup.html)
+* [Noam Optimizer](noam.html)
 * [Rectified Adam Optimizer](radam.html)
+* [AdaBelief Optimizer](adabelief.html)
 
 This [MNIST example](mnist_experiment.html) uses these optimizers.
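The index ends by pointing at the MNIST example that exercises these optimizers. A minimal sketch of how one of them would plug into a PyTorch training step, assuming the classes follow the `torch.optim.Optimizer` interface and that the import path mirrors the adam.html link (both are assumptions, not confirmed by this diff):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Assumed import path, mirroring the adam.html link above.
from labml_nn.optimizers.adam import Adam

model = nn.Linear(28 * 28, 10)            # toy MNIST classifier
optimizer = Adam(model.parameters(), lr=1e-3)

x = torch.randn(32, 28 * 28)              # fake batch of flattened digits
y = torch.randint(0, 10, (32,))

loss = F.cross_entropy(model(x), y)       # one standard training step
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Swapping in AMSGrad, RAdam, or AdaBelief would only change the import and constructor line under this interface assumption.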
@@ -49,7 +49,10 @@ and
 #### ✨ [Optimizers](https://lab-ml.com/labml_nn/optimizers/)
 
 * [Adam](https://lab-ml.com/labml_nn/optimizers/adam.html)
 * [AMSGrad](https://lab-ml.com/labml_nn/optimizers/amsgrad.html)
+* [Adam Optimizer with warmup](https://lab-ml.com/labml_nn/optimizers/adam_warmup.html)
+* [Noam Optimizer](https://lab-ml.com/labml_nn/optimizers/noam.html)
 * [Rectified Adam Optimizer](https://lab-ml.com/labml_nn/optimizers/radam.html)
+* [AdaBelief Optimizer](https://lab-ml.com/labml_nn/optimizers/adabelief.html)
 
 ### Installation
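Of the newly linked pages, adabelief.html covers the one genuinely new update rule: an Adam variant whose second moment tracks the gradient's deviation from its own moving average rather than its raw magnitude. A hand-rolled single-parameter sketch of that update, not the repository's implementation; it omits weight decay and the optional rectification:

```python
import torch

def adabelief_step(param: torch.Tensor, m: torch.Tensor, s: torch.Tensor,
                   step: int, lr: float = 1e-3, beta1: float = 0.9,
                   beta2: float = 0.999, eps: float = 1e-16) -> None:
    """One AdaBelief update. Unlike Adam, `s` accumulates (g - m)^2,
    the deviation from the 'belief' m, instead of g^2."""
    g = param.grad
    m.mul_(beta1).add_(g, alpha=1 - beta1)          # first moment, as in Adam
    # second moment of the prediction error; the paper adds eps here too
    s.mul_(beta2).addcmul_(g - m, g - m, value=1 - beta2).add_(eps)
    m_hat = m / (1 - beta1 ** step)                 # bias corrections
    s_hat = s / (1 - beta2 ** step)
    param.data.addcdiv_(m_hat, s_hat.sqrt() + eps, value=-lr)
```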