diff --git a/docs/index.html b/docs/index.html
index f859361e..303fdad5 100644
--- a/docs/index.html
+++ b/docs/index.html
@@ -79,7 +79,8 @@
Paper Implementations
-- Multi-headed attention
+- JAX implementation
+- Multi-headed attention
- Triton Flash Attention
- Transformer building blocks
- Transformer XL
diff --git a/labml_nn/__init__.py b/labml_nn/__init__.py
index ad109003..dcd3ca77 100644
--- a/labml_nn/__init__.py
+++ b/labml_nn/__init__.py
@@ -24,6 +24,7 @@ implementations.
#### ✨ [Transformers](transformers/index.html)
+* [JAX implementation](transformers/jax_transformer/index.html)
* [Multi-headed attention](transformers/mha.html)
* [Triton Flash Attention](transformers/flash/index.html)
* [Transformer building blocks](transformers/models.html)
diff --git a/readme.md b/readme.md
index d4bbc4f9..27ccccef 100644
--- a/readme.md
+++ b/readme.md
@@ -20,6 +20,7 @@ implementations almost weekly.
#### ✨ [Transformers](https://nn.labml.ai/transformers/index.html)
+* [JAX implementation](https://nn.labml.ai/transformers/jax_transformer/index.html)
* [Multi-headed attention](https://nn.labml.ai/transformers/mha.html)
* [Triton Flash Attention](https://nn.labml.ai/transformers/flash/index.html)
* [Transformer building blocks](https://nn.labml.ai/transformers/models.html)
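The pages linked above include a multi-headed attention implementation; as context for the new JAX link, here is a minimal NumPy sketch of scaled dot-product multi-headed attention (function names, shapes, and the absence of masking/batching are simplifications for illustration, not the repo's actual API):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, w_q, w_k, w_v, w_o, n_heads):
    """Self-attention over `n_heads` heads.

    x: (seq, d_model); each weight matrix: (d_model, d_model).
    """
    seq, d_model = x.shape
    d_k = d_model // n_heads

    # Project, then split the model dimension into heads: (n_heads, seq, d_k).
    def project(w):
        return (x @ w).reshape(seq, n_heads, d_k).transpose(1, 0, 2)

    q, k, v = project(w_q), project(w_k), project(w_v)

    # Scaled dot-product attention per head: (n_heads, seq, seq).
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)
    attn = softmax(scores, axis=-1)

    # Weighted sum of values, then concatenate heads back to (seq, d_model).
    out = (attn @ v).transpose(1, 0, 2).reshape(seq, d_model)
    return out @ w_o

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 16))
w_q, w_k, w_v, w_o = (rng.normal(size=(16, 16)) for _ in range(4))
y = multi_head_attention(x, w_q, w_k, w_v, w_o, n_heads=4)
print(y.shape)  # (5, 16)
```

The linked JAX version follows the same structure, with `jax.numpy` in place of NumPy so the computation can be jit-compiled and differentiated.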