8 Commits

SHA1 Message Date
606d0aa188 [Bug fix] redundant layers in ResNet
In https://github.com/yunjey/pytorch-tutorial/blob/master/tutorials/02-intermediate/deep_residual_network/main.py#L115, a `layers` list of length 4 is defined.
But in https://github.com/yunjey/pytorch-tutorial/blob/master/tutorials/02-intermediate/deep_residual_network/main.py#L84, only `layers[0]` and `layers[1]` are used,
so the last entry of `[2, 2, 2, 2]` is redundant.
2018-11-06 17:54:07 +08:00
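A minimal sketch of the issue this commit describes, assuming the tutorial's small CIFAR-style ResNet with three stages (the `ResidualBlock`, `make_layer`, and shortcut details below are illustrative assumptions, not the exact file): the constructor only indexes the leading entries of `layers`, so a trailing fourth entry is never read.

```python
import torch.nn as nn

def conv3x3(in_ch, out_ch, stride=1):
    return nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False)

class ResidualBlock(nn.Module):
    def __init__(self, in_ch, out_ch, stride=1):
        super(ResidualBlock, self).__init__()
        self.conv1 = conv3x3(in_ch, out_ch, stride)
        self.bn1 = nn.BatchNorm2d(out_ch)
        self.conv2 = conv3x3(out_ch, out_ch)
        self.bn2 = nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU(inplace=True)
        # Projection shortcut when shape changes, identity otherwise.
        self.shortcut = (nn.Sequential() if stride == 1 and in_ch == out_ch
                         else nn.Sequential(conv3x3(in_ch, out_ch, stride),
                                            nn.BatchNorm2d(out_ch)))

    def forward(self, x):
        out = self.bn2(self.conv2(self.relu(self.bn1(self.conv1(x)))))
        return self.relu(out + self.shortcut(x))

class ResNet(nn.Module):
    def __init__(self, block, layers, num_classes=10):
        super(ResNet, self).__init__()
        self.in_channels = 16
        self.conv = conv3x3(3, 16)
        self.bn = nn.BatchNorm2d(16)
        self.relu = nn.ReLU(inplace=True)
        # Only layers[0], layers[1], layers[2] are ever indexed;
        # anything past them in `layers` is silently ignored.
        self.layer1 = self.make_layer(block, 16, layers[0])
        self.layer2 = self.make_layer(block, 32, layers[1], stride=2)
        self.layer3 = self.make_layer(block, 64, layers[2], stride=2)
        self.avg_pool = nn.AvgPool2d(8)
        self.fc = nn.Linear(64, num_classes)

    def make_layer(self, block, out_ch, num_blocks, stride=1):
        strides = [stride] + [1] * (num_blocks - 1)
        blocks = []
        for s in strides:
            blocks.append(block(self.in_channels, out_ch, s))
            self.in_channels = out_ch
        return nn.Sequential(*blocks)

    def forward(self, x):
        out = self.relu(self.bn(self.conv(x)))
        out = self.layer3(self.layer2(self.layer1(out)))
        out = self.avg_pool(out).flatten(1)
        return self.fc(out)

# The fourth entry of [2, 2, 2, 2] is never read, so both calls
# construct identical architectures.
a = ResNet(ResidualBlock, [2, 2, 2, 2])
b = ResNet(ResidualBlock, [2, 2, 2])
assert str(a) == str(b)
```

Both configurations build the same network, which is why trimming `[2, 2, 2, 2]` to `[2, 2, 2]` is a safe fix.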
78c6afe681 Update tutorials for PyTorch 0.4.0 2018-05-10 17:52:01 +09:00
15488e0db1 delete 2018-05-10 17:47:00 +09:00
b60ac38382 [Fix] invalid URL 2018-04-21 19:56:06 +09:00
f64c7c78c2 Removed unused code 2018-02-22 10:44:07 +01:00
be4c08b0ff Update main.py 2017-09-16 00:43:12 +09:00
8a824389d9 Update main.py
I got confused by the use of binary cross entropy here; in particular, it wasn't clear to me why the variable `real_labels` is used when training the generator.

I have added some comments. I am not sure whether they are correct, so you might want to double-check them.
2017-09-15 18:05:51 +08:00
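A minimal sketch of why the generator step uses `real_labels` (variable names mirror the tutorial's, but the network shapes and hyperparameters are assumptions, not the exact file): with `nn.BCELoss`, labeling the generator's fakes as real makes the loss small exactly when the discriminator is fooled, which is the standard non-saturating generator objective, minimizing -log D(G(z)).

```python
import torch
import torch.nn as nn

batch_size, latent_size, image_size = 64, 100, 784

# Toy discriminator and generator; the tutorial's MLPs are similar in spirit.
D = nn.Sequential(nn.Linear(image_size, 256), nn.LeakyReLU(0.2),
                  nn.Linear(256, 1), nn.Sigmoid())
G = nn.Sequential(nn.Linear(latent_size, 256), nn.ReLU(),
                  nn.Linear(256, image_size), nn.Tanh())

criterion = nn.BCELoss()
g_optimizer = torch.optim.Adam(G.parameters(), lr=0.0002)

real_labels = torch.ones(batch_size, 1)   # target 1.0 means "looks real"

# ---- generator training step ----
z = torch.randn(batch_size, latent_size)
fake_images = G(z)
outputs = D(fake_images)

# Using real_labels here is intentional: BCE(D(G(z)), 1) = -log D(G(z)),
# so the generator's loss shrinks exactly when D classifies its fakes as
# real. The alternative objective, minimizing log(1 - D(G(z))) against
# fake labels, saturates early in training when D easily rejects fakes.
g_loss = criterion(outputs, real_labels)
g_optimizer.zero_grad()
g_loss.backward()
g_optimizer.step()
```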
c548e2ae9f tutorial updated 2017-05-28 20:06:40 +09:00