mirror of
https://github.com/labmlai/annotated_deep_learning_paper_implementations.git
synced 2025-08-26 08:41:23 +08:00
fix links
@@ -316,7 +316,7 @@ but modified for MNIST images.</p>
 <div class='section-link'>
     <a href='#section-19'>#</a>
 </div>
-<p>We import the [simple gan experiment]((simple_mnist_experiment.html) and change the
+<p>We import the <a href="../original/experiment.html">simple gan experiment</a> and change the
 generator and discriminator networks</p>
 </div>
 <div class='code'>
@@ -86,7 +86,7 @@ function $V(G, D)$.</p>
 <p>$p_{data}(\pmb{x})$ is the probability distribution over data,
 whilst $p_{\pmb{z}}(\pmb{z})$ probability distribution of $\pmb{z}$, which is set to
 gaussian noise.</p>
-<p>This file defines the loss functions. <a href="../simple_mnist_experiment.html">Here</a> is an MNIST example
+<p>This file defines the loss functions. <a href="experiment.html">Here</a> is an MNIST example
 with two multilayer perceptron for the generator and discriminator.</p>
 </div>
 <div class='code'>
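The documentation edited in the hunk above describes the GAN value function $V(G, D)$, where the discriminator maximizes $\log D(x) + \log(1 - D(G(z)))$ and the generator minimizes it. As a minimal illustration (the function name and scalar setup below are hypothetical, not from the repo's loss code), a single-sample Monte Carlo estimate of $V$ can be sketched as:

```python
import math

def gan_value(d_real: float, d_fake: float) -> float:
    """Single-sample estimate of V(G, D): log D(x) + log(1 - D(G(z))).
    d_real is the discriminator's output on a real sample, d_fake its
    output on a generated sample."""
    return math.log(d_real) + math.log(1.0 - d_fake)

# The discriminator wants V large (confident on both kinds of samples);
# the generator wants V small (its samples fool the discriminator).
confident = gan_value(d_real=0.9, d_fake=0.1)  # D separates real from fake
fooled = gan_value(d_real=0.9, d_fake=0.9)     # G fools D on the fake sample
assert confident > fooled
```

The actual loss code in the repo uses binary cross-entropy over batches rather than these scalar probabilities; this sketch only shows the sign structure of $V$.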
@@ -393,7 +393,7 @@

 <url>
     <loc>https://nn.labml.ai/optimizers/adam.html</loc>
-    <lastmod>2021-08-17T16:30:00+00:00</lastmod>
+    <lastmod>2021-08-21T16:30:00+00:00</lastmod>
     <priority>1.00</priority>
 </url>

@@ -99,7 +99,7 @@ def _weights_init(m):
         nn.init.constant_(m.bias.data, 0)


-# We import the [simple gan experiment]((simple_mnist_experiment.html) and change the
+# We import the [simple gan experiment](../original/experiment.html) and change the
 # generator and discriminator networks
calculate(Configs.generator, 'cnn', lambda c: Generator().to(c.device))
calculate(Configs.discriminator, 'cnn', lambda c: Discriminator().to(c.device))
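The `calculate(Configs.generator, 'cnn', ...)` calls above register factory functions as named config options, so a config file can pick the CNN generator over the default MLP one. A simplified stand-in for that registry pattern (labml's real `calculate` takes the config attribute itself rather than the string keys used here, so treat the names below as hypothetical) might look like:

```python
# Simplified option registry, assuming a string key per config field.
registry: dict = {}

def calculate(field: str, name: str, fn):
    """Register `fn` as the factory for option `name` of config `field`."""
    registry.setdefault(field, {})[name] = fn

def resolve(field: str, name: str, config):
    """Build the value for `field` by invoking the chosen option's factory."""
    return registry[field][name](config)

class Config:
    device = "cpu"

# Two competing options for the same field, selected by name.
calculate("generator", "cnn", lambda c: f"CNNGenerator on {c.device}")
calculate("generator", "mlp", lambda c: f"MLPGenerator on {c.device}")

assert resolve("generator", "cnn", Config()) == "CNNGenerator on cpu"
```

The lambdas in the diff play the same role as the factories here: they receive the config object and return a network moved to the configured device.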
@@ -27,7 +27,7 @@ $p_{data}(\pmb{x})$ is the probability distribution over data,
 whilst $p_{\pmb{z}}(\pmb{z})$ probability distribution of $\pmb{z}$, which is set to
 gaussian noise.

-This file defines the loss functions. [Here](../simple_mnist_experiment.html) is an MNIST example
+This file defines the loss functions. [Here](experiment.html) is an MNIST example
 with two multilayer perceptron for the generator and discriminator.
 """
