
Residual vs attentional blocks #29

Open
valillon opened this issue May 16, 2021 · 0 comments

@valillon

All generator and discriminator types implemented here are built from either block() or block_no_sn() modules, both of which internally apply a residual connection (x_0 + x) by default. However, the associated paper compares residual vs. attentional blocks as if the two architectures were mutually exclusive, one or the other. So, does the attentional architecture reported in the paper also include residual blocks, or does this implementation not fully follow the reported architectures?
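
For reference, here is a minimal sketch of the pattern I am referring to (the layer composition is hypothetical and only illustrative; the point is the final x_0 + x addition, which as far as I can tell both block() and block_no_sn() perform regardless of block type):

```python
import torch.nn as nn

class Block(nn.Module):
    # Hypothetical block: the conv/activation stack is illustrative only.
    def __init__(self, channels):
        super().__init__()
        self.act = nn.ReLU()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x_0):
        x = self.conv2(self.act(self.conv1(self.act(x_0))))
        return x_0 + x  # residual connection applied by default
```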

Thanks.
