25 Commits

SHA1 Message Date
5eecda7e28 cleanup log activations 2025-07-20 09:10:05 +05:30
a713c92b82 cleanup hook model outputs 2025-07-20 09:02:34 +05:30
5bdedcffec remove labml_helpers dep 2025-07-20 08:56:03 +05:30
1b702523b9 remove labml_helpers dependency: replace Module with nn.Module 2025-07-18 10:32:36 +05:30
391fa39167 cleanup notebooks 2024-06-24 16:17:09 +05:30
9a42ac2697 arxiv.org links 2023-10-24 14:42:32 +01:00
c5685c9ffe remove app.labml.ai links 2023-04-02 12:10:18 +05:30
d4b4c28840 typo fixes 2021-10-19 19:17:51 +05:30
996b58be04 paper links 2021-08-17 14:12:33 +05:30
d0044a88c2 experiment links 2021-08-08 08:35:05 +05:30
e38f9af968 repo name 2021-08-08 08:32:39 +05:30
3acd23cf20 labml app links 2021-02-27 17:58:00 +05:30
c2107755bb labml app links 2021-02-27 17:54:11 +05:30
5442dfb130 compressive transformer links 2021-02-19 14:58:16 +05:30
969df7190d documentation fixes 2021-02-19 14:53:44 +05:30
30bbea4172 colab notebook 2021-02-19 10:39:43 +05:30
6a5df4fc11 typo 2021-02-19 09:31:07 +05:30
661009953c 📚 compressive transformer experiment 2021-02-19 08:53:26 +05:30
a1b1550245 📚 compressive transformer 2021-02-19 08:34:17 +05:30
e5751ab341 📚 compressive transformer docs - work in progress 2021-02-18 14:58:45 +05:30
7904401e61 hyper param 2021-02-17 21:51:05 +05:30
b39ac9ebcd 🐛 typo 2021-02-17 18:35:50 +05:30
9636cfef03 🚧 layer norm in attention rec loss 2021-02-17 17:37:16 +05:30
c1ab9d8589 🚧 compressive transformer fix 2021-02-17 17:29:02 +05:30
ff8f80039a 🚧 compressive transformer 2021-02-17 16:36:52 +05:30