All generator and discriminator types implemented here are built from either `block()` or `block_no_sn()` modules, both of which include an internal residual connection `x_0 + x` by default. However, the associated paper compares residual vs. attentional blocks as if the two architectures were mutually exclusive. So, does the attentional architecture reported in the paper also include residual blocks, or does this implementation not fully follow the reported architectures?
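For illustration, here is a minimal sketch of the residual pattern I am referring to (assuming a PyTorch codebase; the class name `Block` and its arguments are mine for illustration, and the real `block()` / `block_no_sn()` modules presumably also handle spectral norm, up/down-sampling, and channel changes):

```python
import torch.nn as nn

class Block(nn.Module):
    """Simplified residual block: output = x_0 + f(x_0)."""

    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.act = nn.ReLU()

    def forward(self, x_0):
        # Residual branch f(x_0)
        x = self.conv2(self.act(self.conv1(self.act(x_0))))
        # Residual connection applied by default, as in block()/block_no_sn()
        return x_0 + x
```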
Thanks.