Mirror of https://github.com/labmlai/annotated_deep_learning_paper_implementations.git
Synced 2025-08-26 16:50:39 +08:00
link fix
@@ -132,7 +132,7 @@ implementations.</p>
 <li><a href="https://nn.labml.ai/normalization/instance_norm/index.html">Instance Normalization</a></li>
 <li><a href="https://nn.labml.ai/normalization/group_norm/index.html">Group Normalization</a></li>
 <li><a href="https://nn.labml.ai/normalization/weight_standardization/index.html">Weight Standardization</a></li>
-<li><a href="https://nn.labml.ai/normalization/batch_channel_normalization/index.html">Batch-Channel Normalization</a></li>
+<li><a href="https://nn.labml.ai/normalization/batch_channel_norm/index.html">Batch-Channel Normalization</a></li>
 </ul>
 <h3>Installation</h3>
 <pre><code class="bash">pip install labml-nn
@@ -78,7 +78,7 @@
 <li><a href="instance_norm/index.html">Instance Normalization</a></li>
 <li><a href="group_norm/index.html">Group Normalization</a></li>
 <li><a href="weight_standardization/index.html">Weight Standardization</a></li>
-<li><a href="batch_channel_normalization/index.html">Batch-Channel Normalization</a></li>
+<li><a href="batch_channel_norm/index.html">Batch-Channel Normalization</a></li>
 </ul>
 </div>
 <div class='code'>
@@ -76,7 +76,7 @@
 <p>This is a <a href="https://pytorch.org">PyTorch</a> implementation of Weight Standardization from the paper
 <a href="https://arxiv.org/abs/1903.10520">Micro-Batch Training with Batch-Channel Normalization and Weight Standardization</a>.
 We also have an
-<a href="https://nn.labml.ai/normalization/batch_channel_normalization/index.html">annotated implementation of Batch-Channel Normalization</a>.</p>
+<a href="https://nn.labml.ai/normalization/batch_channel_norm/index.html">annotated implementation of Batch-Channel Normalization</a>.</p>
 </div>
 <div class='code'>
 
@@ -160,6 +160,13 @@
 </url>
 
 
+<url>
+<loc>https://nn.labml.ai/normalization/weight_standardization/experiment.html</loc>
+<lastmod>2021-04-28T16:30:00+00:00</lastmod>
+<priority>1.00</priority>
+</url>
+
+
 <url>
 <loc>https://nn.labml.ai/normalization/weight_standardization/index.html</loc>
 <lastmod>2021-04-28T16:30:00+00:00</lastmod>
@@ -167,6 +174,13 @@
 </url>
 
 
+<url>
+<loc>https://nn.labml.ai/normalization/weight_standardization/readme.html</loc>
+<lastmod>2021-04-28T16:30:00+00:00</lastmod>
+<priority>1.00</priority>
+</url>
+
+
 <url>
 <loc>https://nn.labml.ai/normalization/weight_standardization/experiment.html</loc>
 <lastmod>2021-04-28T16:30:00+00:00</lastmod>
@@ -64,7 +64,7 @@ implementations.
 * [Instance Normalization](https://nn.labml.ai/normalization/instance_norm/index.html)
 * [Group Normalization](https://nn.labml.ai/normalization/group_norm/index.html)
 * [Weight Standardization](https://nn.labml.ai/normalization/weight_standardization/index.html)
-* [Batch-Channel Normalization](https://nn.labml.ai/normalization/batch_channel_normalization/index.html)
+* [Batch-Channel Normalization](https://nn.labml.ai/normalization/batch_channel_norm/index.html)
 
 ### Installation
 
@@ -12,5 +12,5 @@ summary: >
 * [Instance Normalization](instance_norm/index.html)
 * [Group Normalization](group_norm/index.html)
 * [Weight Standardization](weight_standardization/index.html)
-* [Batch-Channel Normalization](batch_channel_normalization/index.html)
+* [Batch-Channel Normalization](batch_channel_norm/index.html)
 """
@@ -3,4 +3,4 @@
 This is a [PyTorch](https://pytorch.org) implementation of Weight Standardization from the paper
 [Micro-Batch Training with Batch-Channel Normalization and Weight Standardization](https://arxiv.org/abs/1903.10520).
 We also have an
-[annotated implementation of Batch-Channel Normalization](https://nn.labml.ai/normalization/batch_channel_normalization/index.html).
+[annotated implementation of Batch-Channel Normalization](https://nn.labml.ai/normalization/batch_channel_norm/index.html).
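The readme changed above describes the Weight Standardization paper, whose core idea is to re-parameterize each convolution by standardizing its weights per output channel before use. A minimal sketch of that operation, written in NumPy for brevity (the repository's actual implementation is in PyTorch and is not reproduced here):

```python
import numpy as np

def weight_standardize(weight: np.ndarray, eps: float = 1e-5) -> np.ndarray:
    """Standardize conv weights of shape (out_channels, in_channels, kH, kW)
    so each output-channel filter has zero mean and unit variance over
    its (in_channels, kH, kW) elements."""
    mean = weight.mean(axis=(1, 2, 3), keepdims=True)
    var = weight.var(axis=(1, 2, 3), keepdims=True)
    return (weight - mean) / np.sqrt(var + eps)

# Example: standardize a random 8-filter 3x3 conv kernel over 3 input channels.
w = np.random.randn(8, 3, 3, 3)
w_std = weight_standardize(w)
```

In the paper this is applied to the weights of each convolution at every forward pass and paired with Batch-Channel Normalization, which is what makes micro-batch training stable.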
@@ -70,7 +70,7 @@ implementations almost weekly.
 * [Instance Normalization](https://nn.labml.ai/normalization/instance_norm/index.html)
 * [Group Normalization](https://nn.labml.ai/normalization/group_norm/index.html)
 * [Weight Standardization](https://nn.labml.ai/normalization/weight_standardization/index.html)
-* [Batch-Channel Normalization](https://nn.labml.ai/normalization/batch_channel_normalization/index.html)
+* [Batch-Channel Normalization](https://nn.labml.ai/normalization/batch_channel_norm/index.html)
 
 ### Installation
 