diff --git a/Readme.md b/Readme.md
index 425de04..5e24f5b 100644
--- a/Readme.md
+++ b/Readme.md
@@ -5,7 +5,7 @@
[](https://github.com/helblazer811/ManimMachineLearning/blob/main/LICENSE.md)
[](https://img.shields.io/github/v/release/helblazer811/ManimMachineLearning)
-[](https://GitHub.com/helblazer811/ManimMachineLearning/releases/)
+
[](https://twitter.com/alec_helbling)
Manim Machine Learning is a project focused on providing animations and visualizations of common machine learning concepts with the [Manim Community Library](https://www.manim.community/). We want this project to be a compilation of primitive visualizations that can be easily combined to create videos about complex machine learning concepts. Additionally, we want to provide a set of abstractions which allow users to focus on explanations instead of software engineering.
@@ -16,7 +16,12 @@ Manim Machine Learning is a project focused on providing animations and visualiz
2. [Examples](#examples)
## Getting Started
-First you will want to [install manim](https://docs.manim.community/en/stable/installation.html). Then you can run the following to generate the example videos.
+First you will want to [install manim](https://docs.manim.community/en/stable/installation.html).
+
+Then install the package from source or with pip:
+`pip install manim_ml`
+
+Then you can run the following command to generate one of the example videos from its Python script.
`manim -pqh src/vae.py VAEScene`
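+
+Here `-p` opens the rendered video when it finishes and `-qh` renders it at high quality.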
@@ -24,20 +29,49 @@ First you will want to [install manim](https://docs.manim.community/en/stable/in
Check out the ```examples``` directory for some example videos with source code.
+### Neural Networks
+
+This is a visualization of a Neural Network made using ManimML. Layers are specified as a PyTorch-style list and can be composed in arbitrary order. The following video was generated with the code below.
+
+
+
+```python
+from manim import *
+from manim_ml.neural_network.layers import FeedForwardLayer, ImageLayer
+from manim_ml.neural_network.neural_network import NeuralNetwork
+from PIL import Image
+import numpy as np
+
+class ImageNeuralNetworkScene(Scene):
+
+    def construct(self):
+        # Load an example image and convert it to a numpy array
+        image = Image.open('images/image.jpeg')
+        numpy_image = np.asarray(image)
+        # Make the neural network from a PyTorch-style list of layers
+        layers = [
+            ImageLayer(numpy_image, height=1.0),
+            FeedForwardLayer(3),
+            FeedForwardLayer(5),
+            FeedForwardLayer(3)
+        ]
+        nn = NeuralNetwork(layers)
+        # Center the network in the scene
+        nn.move_to(ORIGIN)
+        self.add(nn)
+        # Play the forward pass animation
+        self.play(nn.make_forward_pass_animation())
+```
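+
+Assuming the snippet above is saved as, say, `image_nn.py` (any filename works) next to an `images/image.jpeg` file, it can be rendered the same way as the other examples:
+
+`manim -pqh image_nn.py ImageNeuralNetworkScene`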
+
+
### Variational Autoencoders
This is a visualization of a Variational Autoencoder.
-
+
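+
+The VAE example referenced in Getting Started lives in `src/vae.py` (the `VAEScene` class). As a rough sketch only, a VAE-shaped encoder/bottleneck/decoder can also be composed from the same feed-forward primitives shown above; the class name and layer sizes here are illustrative and this is not the code behind the animation:
+
+```python
+from manim import *
+from manim_ml.neural_network.layers import FeedForwardLayer
+from manim_ml.neural_network.neural_network import NeuralNetwork
+
+class VAESketchScene(Scene):
+
+    def construct(self):
+        # Encoder -> latent bottleneck -> decoder built from feed-forward layers.
+        # Layer sizes are illustrative and do not come from src/vae.py.
+        layers = [
+            FeedForwardLayer(5),
+            FeedForwardLayer(3),
+            FeedForwardLayer(2),  # latent code
+            FeedForwardLayer(3),
+            FeedForwardLayer(5)
+        ]
+        nn = NeuralNetwork(layers)
+        # Center the network and animate a forward pass
+        nn.move_to(ORIGIN)
+        self.add(nn)
+        self.play(nn.make_forward_pass_animation())
+```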
### VAE Disentanglement
This is a visualization of disentanglement with a Variational Autoencoder.
-
+
-### Neural Networks
-
-This is a visualization of a Neural Network.
-
-
diff --git a/examples/media/ImageNeuralNetworkScene.gif b/examples/media/ImageNeuralNetworkScene.gif
new file mode 100644
index 0000000..eaa1ad8
Binary files /dev/null and b/examples/media/ImageNeuralNetworkScene.gif differ
diff --git a/tests/test_neural_network.py b/tests/test_neural_network.py
index b0beed8..0b662ad 100644
--- a/tests/test_neural_network.py
+++ b/tests/test_neural_network.py
@@ -44,7 +44,6 @@ class ImageNeuralNetworkScene(Scene):
             ImageLayer(numpy_image, height=1.0),
             FeedForwardLayer(3),
             FeedForwardLayer(5),
-            FeedForwardLayer(3),
             FeedForwardLayer(3)
         ]
         nn = NeuralNetwork(layers)