ResBlock with inplace relu? #60
Comments
Oh... it is definitely a bug. Thank you! Fixed at ef5f67c.
@rfeinman How does this affect the result? Can you please explain in detail?
@SURABHI-GUPTA Assume that "input" is the pre-relu activation from the previous layer. The target computation is the following:

```python
output = self.conv(input)
output += input
```

However, because the first ReLU in the block has inplace=True, the variable input is overwritten inside self.conv, so the code as written actually computed:

```python
output = self.conv(input)
output += F.relu(input)
```

But this has been fixed now with the new commit.
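To make the mechanics concrete, here is a minimal, self-contained sketch (the layer sizes and tensor shapes are invented for illustration, not taken from vqvae.py) showing how an in-place ReLU at the start of the residual branch silently changes what the skip connection adds:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# A residual branch whose first ReLU is in-place, as described above.
branch = nn.Sequential(
    nn.ReLU(inplace=True),              # overwrites its input tensor in place
    nn.Conv2d(8, 8, 3, padding=1),
)

x = torch.randn(1, 8, 4, 4)             # pre-ReLU activation from the previous layer
x_original = x.clone()                  # keep the original values for comparison

out = branch(x)                         # side effect: x now holds relu(x_original)
out = out + x                           # the skip connection adds post-ReLU values

print(torch.equal(x, F.relu(x_original)))   # True  -> x was overwritten in place
print(torch.equal(x, x_original))           # False -> the pre-ReLU values are lost
```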
@rfeinman @SURABHI-GUPTA I am also trying to figure out what the implications of this additional operation could be. It could act as an additional skip connection. The only downside I can see is a degradation in training speed. Any further insights?
@fostiropoulos I can't speak as to whether it is a better choice to pass forward the pre-activation or post-activation values for the skip connection. In popular residual architectures for classification (e.g. ResNet) they do the latter. In this repository, they chose the former.
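For reference, the two choices being compared can be written side by side. This is an illustrative sketch, not code from the repository; `conv` stands in for an arbitrary residual branch:

```python
import torch.nn.functional as F

def pre_activation_skip(conv, x):
    # What this repository intends: the identity term is the raw, pre-ReLU input.
    return conv(F.relu(x)) + x

def post_activation_skip(conv, x):
    # What inplace=True accidentally produced, and roughly what classification
    # ResNets do, since their skip carries an already-activated tensor.
    h = F.relu(x)
    return conv(h) + h
```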
I noticed in the ResBlock class of vqvae.py that you put a ReLU activation at the start of the residual stack, and no ReLU at the end. I think the point is to pass forward the pre-activation values from the previous layer as the identity (the skip connection), rather than the post-activation values. However, since you have inplace=True for the start ReLU, you actually end up getting the latter. Is that intentional or a bug?