"""
[![PyPI Version](https://badge.fury.io/py/labml-nn.svg)](https://badge.fury.io/py/labml-nn)
[![PyPI Downloads](https://pepy.tech/badge/labml-nn)](https://pepy.tech/project/labml-nn)
# [LabML Neural Networks](https://lab-ml.com/labml_nn/index.html)
This is a collection of simple PyTorch implementations of various
neural network architectures and layers.
We will keep adding to this collection.
## Modules
#### ✨ [Transformers](https://lab-ml.com/labml_nn/transformers)
[Transformers module](https://lab-ml.com/labml_nn/transformers)
contains implementations for
[multi-headed attention](https://lab-ml.com/labml_nn/transformers/mha.html)
and
[relative multi-headed attention](https://lab-ml.com/labml_nn/transformers/relative_mha.html).
* [kNN-LM: Generalization through Memorization](https://lab-ml.com/labml_nn/transformers/knn)
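For a quick feel of the attention API, here is a minimal usage sketch. It assumes the class is `MultiHeadAttention` in `labml_nn.transformers.mha`, that its constructor takes the number of heads and the model dimension, and that tensors are laid out as `[seq_len, batch_size, d_model]`; see the linked page for the exact signature.

```python
import torch
from labml_nn.transformers.mha import MultiHeadAttention  # assumed module path

# Assumed constructor arguments: number of heads and model dimension
mha = MultiHeadAttention(heads=8, d_model=512)

# Dummy input laid out as [seq_len, batch_size, d_model]
x = torch.randn(10, 2, 512)

# Self-attention: query, key and value are all the same tensor
out = mha(query=x, key=x, value=x)
print(out.shape)  # expected: torch.Size([10, 2, 512])
```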
#### ✨ [Recurrent Highway Networks](https://lab-ml.com/labml_nn/recurrent_highway_networks)
#### ✨ [LSTM](https://lab-ml.com/labml_nn/lstm)
#### ✨ [Capsule Networks](https://lab-ml.com/labml_nn/capsule_networks/)
#### ✨ [Generative Adversarial Networks](https://lab-ml.com/labml_nn/gan/)
* [GAN with a multi-layer perceptron](https://lab-ml.com/labml_nn/gan/simple_mnist_experiment.html)
* [GAN with deep convolutional network](https://lab-ml.com/labml_nn/gan/dcgan.html)
* [Cycle GAN](https://lab-ml.com/labml_nn/gan/cycle_gan.html)
#### ✨ [Sketch RNN](https://lab-ml.com/labml_nn/sketch_rnn/)
#### ✨ [Reinforcement Learning](https://lab-ml.com/labml_nn/rl/)
* [Proximal Policy Optimization](https://lab-ml.com/labml_nn/rl/ppo/) with
[Generalized Advantage Estimation](https://lab-ml.com/labml_nn/rl/ppo/gae.html)
* [Deep Q Networks](https://lab-ml.com/labml_nn/rl/dqn/) with
[Dueling Network](https://lab-ml.com/labml_nn/rl/dqn/model.html),
[Prioritized Replay](https://lab-ml.com/labml_nn/rl/dqn/replay_buffer.html)
and Double Q Network.
#### ✨ [Optimizers](https://lab-ml.com/labml_nn/optimizers/)
* [Adam](https://lab-ml.com/labml_nn/optimizers/adam.html)
* [AMSGrad](https://lab-ml.com/labml_nn/optimizers/amsgrad.html)
* [Rectified Adam Optimizer](https://lab-ml.com/labml_nn/optimizers/radam.html)
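A minimal training-step sketch, assuming these optimizers follow the standard `torch.optim.Optimizer` interface and that the Adam implementation lives at `labml_nn.optimizers.adam.Adam` (see the linked pages for the actual options):

```python
import torch
from torch import nn
from labml_nn.optimizers.adam import Adam  # assumed module path

# A tiny model and the optimizer wrapping its parameters
model = nn.Linear(16, 1)
optimizer = Adam(model.parameters(), lr=1e-3)

# One standard training step
x, y = torch.randn(4, 16), torch.randn(4, 1)
loss = nn.functional.mse_loss(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```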
### Installation
```bash
pip install labml_nn
```
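A quick way to confirm the installation worked is to import the package:

```python
# Sanity check: this should run without errors after installation
import labml_nn
```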
### Citing LabML
If you use LabML for academic research, please cite the library using the following BibTeX entry.
```bibtex
@misc{labml,
 author = {Varuna Jayasiri and Nipun Wijerathne},
 title = {LabML: A library to organize machine learning experiments},
 year = {2020},
 url = {https://lab-ml.com/},
}
```
"""