link to jax transformer

This commit is contained in:
Varuna Jayasiri
2025-11-11 09:22:38 +00:00
parent c3d868baa1
commit 25e169843e
3 changed files with 4 additions and 1 deletion


@@ -79,7 +79,8 @@
 <h3><strong><a href="https://nn.labml.ai/ja/">Japanese (translated)</a></strong></h3>
 <h2>Paper Implementations</h2>
 <h4><a href="transformers/index.html">Transformers</a></h4>
-<ul><li><a href="transformers/mha.html">Multi-headed attention</a> </li>
+<ul><li><a href="transformers/jax_transformer/index.html">JAX implementation</a> </li>
+<li><a href="transformers/mha.html">Multi-headed attention</a> </li>
 <li><a href="transformers/flash/index.html">Triton Flash Attention</a> </li>
 <li><a href="transformers/models.html">Transformer building blocks</a> </li>
 <li><a href="transformers/xl/index.html">Transformer XL</a> </li>


@@ -24,6 +24,7 @@ implementations.
 #### ✨ [Transformers](transformers/index.html)
+* [JAX implementation](transformers/jax_transformer/index.html)
 * [Multi-headed attention](transformers/mha.html)
 * [Triton Flash Attention](transformers/flash/index.html)
 * [Transformer building blocks](transformers/models.html)


@@ -20,6 +20,7 @@ implementations almost weekly.
 #### ✨ [Transformers](https://nn.labml.ai/transformers/index.html)
+* [JAX implementation](https://nn.labml.ai/transformers/jax_transformer/index.html)
 * [Multi-headed attention](https://nn.labml.ai/transformers/mha.html)
 * [Triton Flash Attention](https://nn.labml.ai/transformers/flash/index.html)
 * [Transformer building blocks](https://nn.labml.ai/transformers/models.html)