github link

Author: Varuna Jayasiri
Date:   2020-09-04 16:01:21 +05:30
parent 3e240cfe1c
commit 299cd650cd
4 changed files with 7 additions and 0 deletions

.gitignore

@@ -7,3 +7,4 @@ dist/
 build/
 .idea/*
 !.idea/dictionaries
+html/


@@ -1,4 +1,6 @@
 """
+<a class="github-button" href="https://github.com/lab-ml/labml_nn" data-size="large" data-show-count="true" aria-label="Star lab-ml/labml_nn on GitHub">Star</a>
+
 # Transformers
 * [Multi-head attention](mha.html)


@@ -1,4 +1,6 @@
 """
+<a class="github-button" href="https://github.com/lab-ml/labml_nn" data-size="large" data-show-count="true" aria-label="Star lab-ml/labml_nn on GitHub">Star</a>
+
 # Multi-Headed Attention
 The implementation is inspired from [Annotated Transformer](https://nlp.seas.harvard.edu/2018/04/03/attention.html)


@@ -1,4 +1,6 @@
 """
+<a class="github-button" href="https://github.com/lab-ml/labml_nn" data-size="large" data-show-count="true" aria-label="Star lab-ml/labml_nn on GitHub">Star</a>
+
 # Relative Multi-head Attention
 This is an implementation of