Mirror of https://github.com/labmlai/annotated_deep_learning_paper_implementations.git (synced 2025-11-03 05:46:16 +08:00)
optimizers
@@ -50,7 +50,10 @@ and
#### ✨ [Optimizers](https://lab-ml.com/labml_nn/optimizers/)

* [Adam](https://lab-ml.com/labml_nn/optimizers/adam.html)
* [AMSGrad](https://lab-ml.com/labml_nn/optimizers/amsgrad.html)
* [Adam Optimizer with warmup](https://lab-ml.com/labml_nn/optimizers/adam_warmup.html)
* [Noam Optimizer](https://lab-ml.com/labml_nn/optimizers/noam.html)
* [Rectified Adam Optimizer](https://lab-ml.com/labml_nn/optimizers/radam.html)
* [AdaBelief Optimizer](https://lab-ml.com/labml_nn/optimizers/adabelief.html)
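
For a sense of what the first link in this list covers, here is a minimal Adam update step written in plain PyTorch. This is only a sketch of the standard formulation, not the repository's class; the helper name `adam_step` and its state handling are illustrative.

```python
import torch

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single parameter tensor; returns (param, m, v)."""
    m = beta1 * m + (1 - beta1) * grad        # first moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (v_hat.sqrt() + eps)
    return param, m, v

# m and v start as zeros of the parameter's shape; t counts steps from 1.
w = torch.zeros(3)
m, v = torch.zeros_like(w), torch.zeros_like(w)
for t in range(1, 4):
    g = torch.randn(3)                        # stand-in gradient
    w, m, v = adam_step(w, g, m, v, t)
```
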
### Installation

@@ -11,7 +11,10 @@ summary: >
## Optimizer Implementations

* [Adam Optimizer](adam.html)
* [AMSGrad Optimizer](amsgrad.html)
* [Adam Optimizer with warmup](adam_warmup.html)
* [Noam Optimizer](noam.html)
* [Rectified Adam Optimizer](radam.html)
* [AdaBelief Optimizer](adabelief.html)

This [MNIST example](mnist_experiment.html) uses these optimizers.
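
The MNIST experiment itself is not part of this diff, so the sketch below only shows where such an optimizer slots into a training loop, using `torch.optim.Adam` on stand-in data. The constructor line is the drop-in point for the implementations listed above; their exact import paths and signatures are not shown here, so none are assumed.

```python
import torch
from torch import nn

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
# Drop-in point: replace torch.optim.Adam with one of the optimizers above.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(32, 1, 28, 28)            # stand-in for an MNIST batch
y = torch.randint(0, 10, (32,))

for _ in range(3):                        # a few illustrative steps
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```
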
@@ -49,7 +49,10 @@ and
#### ✨ [Optimizers](https://lab-ml.com/labml_nn/optimizers/)

* [Adam](https://lab-ml.com/labml_nn/optimizers/adam.html)
* [AMSGrad](https://lab-ml.com/labml_nn/optimizers/amsgrad.html)
* [Adam Optimizer with warmup](https://lab-ml.com/labml_nn/optimizers/adam_warmup.html)
* [Noam Optimizer](https://lab-ml.com/labml_nn/optimizers/noam.html)
* [Rectified Adam Optimizer](https://lab-ml.com/labml_nn/optimizers/radam.html)
* [AdaBelief Optimizer](https://lab-ml.com/labml_nn/optimizers/adabelief.html)
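
The Noam entry above is essentially Adam driven by a warmup-then-decay learning-rate schedule. A small sketch of that schedule, with illustrative names (`noam_lr`, `d_model`, `warmup`) rather than the repository's API:

```python
def noam_lr(step: int, d_model: int = 512, warmup: int = 4000, factor: float = 1.0) -> float:
    """Learning rate at `step` (1-based): linear warmup, then 1/sqrt(step) decay."""
    return factor * d_model ** -0.5 * min(step ** -0.5, step * warmup ** -1.5)

# The rate rises until `warmup` steps, peaks, then decays.
print(noam_lr(1), noam_lr(4000), noam_lr(100_000))
```
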
### Installation