
Commit

Fixing doc string
WenkelF committed Aug 4, 2023
1 parent 6463e48 commit da0d058
Showing 1 changed file with 3 additions and 3 deletions:
graphium/finetuning/finetuning_architecture.py
@@ -230,10 +230,10 @@ def overwrite_with_pretrained(
         Overwrite parameters shared between loaded and modified pretrained model

         Parameters:
-            pretrained_model:
+            pretrained_model: Model from GRAPHIUM_PRETRAINED_MODELS_DICT
             finetuning_module: Module to finetune from
             added_depth: Number of modified layers at the end of finetuning module
-            sub_module_from_pretrained: Optional submodule to finetune from FINETUNING_HEADS_DICT
+            sub_module_from_pretrained: Optional submodule to finetune from
         """
         module_map = self.net._module_map
         module_map_from_pretrained = pretrained_model._module_map
@@ -292,7 +292,7 @@ def __init__(self, finetuning_head_kwargs: Dict[str, Any]):
         Parameters:
-            finetuning_head_kwargs: Key-word arguments needed to instantiate a custom (or existing) finetuning head from
+            finetuning_head_kwargs: Key-word arguments needed to instantiate a custom (or existing) finetuning head from FINETUNING_HEADS_DICT
         """
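The fix moves the FINETUNING_HEADS_DICT reference to the parameter it actually describes: the kwargs that select and instantiate a finetuning head from a registry dict. A minimal sketch of that registry pattern, assuming a hypothetical head class and builder (only the dict name comes from the diff; the real Graphium API may differ):

```python
# Sketch of the registry-dict pattern the docstring describes.
# FINETUNING_HEADS_DICT maps a head name to its class; the kwargs
# pick the head and configure its constructor. Names below other
# than FINETUNING_HEADS_DICT are illustrative, not Graphium's API.
from typing import Any, Dict


class MLPHead:
    """Hypothetical stand-in for a finetuning head."""

    def __init__(self, in_dim: int, out_dim: int):
        self.in_dim = in_dim
        self.out_dim = out_dim


FINETUNING_HEADS_DICT = {"mlp": MLPHead}


def build_finetuning_head(finetuning_head_kwargs: Dict[str, Any]):
    # Pop the registry key, then forward the rest to the constructor.
    kwargs = dict(finetuning_head_kwargs)
    head_cls = FINETUNING_HEADS_DICT[kwargs.pop("name")]
    return head_cls(**kwargs)


head = build_finetuning_head({"name": "mlp", "in_dim": 64, "out_dim": 1})
```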
