
Unetr bottleneck layer weight freezing for transfer learning #1376

@tangy5 Sorry, I would also like to know which layers form the bottleneck in SwinUNETR.

Here are the SwinUNETR layer weights (a sketch of how to freeze them by name prefix follows the list):

No.  Layer name
1 swinViT.patch_embed.proj.weight
2 swinViT.patch_embed.proj.bias
3 swinViT.layers1.0.blocks.0.norm1.weight
4 swinViT.layers1.0.blocks.0.norm1.bias
5 swinViT.layers1.0.blocks.0.attn.relative_position_bias_table
6 swinViT.layers1.0.blocks.0.attn.qkv.weight
7 swinViT.layers1.0.blocks.0.attn.qkv.bias
8 swinViT.layers1.0.blocks.0.attn.proj.weight
9 swinViT.layers1.0.blocks.0.attn.proj.bias
10 swinViT.layers1.0.blocks.0.norm2.weight
11 swinViT.layers1.0.blocks.0.norm2.bias
12 swinViT.layers1.0.blocks.0.mlp.linear1.weight
13 swinViT.layers1.0.block…

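Since the discussion is about freezing weights for transfer learning, here is a minimal sketch of prefix-based freezing, assuming MONAI's SwinUNETR. The constructor arguments shown (img_size, in_channels, out_channels, feature_size) are placeholders and the exact signature varies between MONAI releases; the `swinViT.` prefix matches the encoder weight names listed above.

```python
import torch
from monai.networks.nets import SwinUNETR

# Illustrative configuration only: adjust image size, channels and
# feature_size to your task and MONAI version.
model = SwinUNETR(
    img_size=(96, 96, 96),
    in_channels=1,
    out_channels=2,
    feature_size=48,
)

# Freeze every parameter belonging to the Swin Transformer encoder
# (names starting with "swinViT.", i.e. the weights listed above),
# so only the decoder is updated during fine-tuning.
for name, param in model.named_parameters():
    if name.startswith("swinViT."):
        param.requires_grad = False

# Hand only the still-trainable parameters to the optimizer.
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)

# Quick check: print which parameters remain trainable.
for name, param in model.named_parameters():
    if param.requires_grad:
        print(name)
```

If only the deepest encoder stage should stay frozen rather than the whole Swin backbone, match its prefix instead (e.g. `swinViT.layers4.` in this naming scheme).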