mirror of https://github.com/labmlai/annotated_deep_learning_paper_implementations.git (synced 2025-12-11 10:57:24 +08:00)

Commit: link to jax transformer
@@ -79,7 +79,8 @@
 <h3><strong><a href="https://nn.labml.ai/ja/">Japanese (translated)</a></strong></h3>
 <h2>Paper Implementations</h2>
 <h4>✨ <a href="transformers/index.html">Transformers</a></h4>
-<ul><li><a href="transformers/mha.html">Multi-headed attention</a> </li>
+<ul><li><a href="transformers/jax_transformer/index.html">JAX implementation</a> </li>
+<li><a href="transformers/mha.html">Multi-headed attention</a> </li>
 <li><a href="transformers/flash/index.html">Triton Flash Attention</a> </li>
 <li><a href="transformers/models.html">Transformer building blocks</a> </li>
 <li><a href="transformers/xl/index.html">Transformer XL</a> </li>
@@ -24,6 +24,7 @@ implementations.
 
 #### ✨ [Transformers](transformers/index.html)
 
+* [JAX implementation](transformers/jax_transformer/index.html)
 * [Multi-headed attention](transformers/mha.html)
 * [Triton Flash Attention](transformers/flash/index.html)
 * [Transformer building blocks](transformers/models.html)
@@ -20,6 +20,7 @@ implementations almost weekly.
 
 #### ✨ [Transformers](https://nn.labml.ai/transformers/index.html)
 
+* [JAX implementation](https://nn.labml.ai/transformers/jax_transformer/index.html)
 * [Multi-headed attention](https://nn.labml.ai/transformers/mha.html)
 * [Triton Flash Attention](https://nn.labml.ai/transformers/flash/index.html)
 * [Transformer building blocks](https://nn.labml.ai/transformers/models.html)
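The link being added points to a JAX transformer implementation, listed next to multi-headed attention. As a rough, illustrative sketch of the computation behind those links (not the repository's actual code), here is minimal multi-head scaled dot-product attention written with the NumPy API; `jax.numpy` mirrors NumPy, so the same code runs under JAX by importing `jax.numpy as np`. The learned Q/K/V projection matrices are deliberately omitted for brevity.

```python
import numpy as np


def softmax(x, axis=-1):
    # Subtract the row max before exponentiating, for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


def multi_head_attention(q, k, v, n_heads):
    """Multi-head scaled dot-product attention (projections omitted).

    q, k, v: arrays of shape [seq_len, d_model], with d_model divisible
    by n_heads.
    """
    seq_len, d_model = q.shape
    d_k = d_model // n_heads
    # Split the model dimension into heads: [n_heads, seq_len, d_k].
    split = lambda x: x.reshape(seq_len, n_heads, d_k).transpose(1, 0, 2)
    q, k, v = split(q), split(k), split(v)
    # Attention scores scaled by sqrt(d_k): [n_heads, seq_len, seq_len].
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)
    # Weighted sum of values, then merge heads back to [seq_len, d_model].
    out = weights @ v
    return out.transpose(1, 0, 2).reshape(seq_len, d_model)
```

The output keeps the input shape `[seq_len, d_model]`, so the block can be stacked; the annotated implementations in the repository add the projection matrices, masking, and dropout on top of this core.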