Hi, I am trying to override the activation function of my neural network in an experiment group. Unfortunately, I always seem to run into an error.
Here is the corresponding minimal example:

```python
from typing import Type

from torch import nn
from hydra_zen import to_yaml, store, builds, zen, launch, make_config
from omegaconf import DictConfig


class Model:
    def __init__(self, activation_fn: Type[nn.Module] = nn.ReLU):
        self.activation_fn = activation_fn


def app(zen_cfg: DictConfig, model: Model) -> None:
    print(to_yaml(zen_cfg, resolve=True))


store(Model, group='model')

Config = builds(
    app,
    populate_full_signature=True,
    hydra_defaults=[
        '_self_',
        {'model': 'Model'},
        {'experiment': 'selu'},
    ],
)

experiment_store = store(group='experiment', package='_global_')
experiment_store(
    make_config(
        hydra_defaults=['_self_'],
        model=dict(activation_fn=nn.SELU),
        bases=(Config,),
    ),
    name='selu',
)

if __name__ == '__main__':
    store.add_to_hydra_store()
    launch(Config, zen(app), version_base='1.3')
```

Is there anything I can do? I have already tried specifying the activation function in a separate config group. Is there maybe a way to make the node writable? Or do I have to pass the activation function as a partially instantiated object?
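For context on the last question, the "partially instantiated object" alternative can be sketched with plain `functools.partial`, independent of hydra-zen and torch. The `SELU` stand-in class below is hypothetical (a placeholder for an `nn.Module` activation); the point is only to show the model receiving a pre-configured constructor rather than a bare class:

```python
from functools import partial


class Model:
    # Accepts either a class or a zero-argument factory for the activation.
    def __init__(self, activation_fn=None):
        self.activation_fn = activation_fn


# Hypothetical stand-in for an activation class such as nn.SELU.
class SELU:
    def __init__(self, inplace=False):
        self.inplace = inplace


# Pass a partially applied constructor instead of the bare class:
make_activation = partial(SELU, inplace=True)
model = Model(activation_fn=make_activation)

# The model can instantiate the activation lazily, when it is needed:
activation = model.activation_fn()
print(type(activation).__name__, activation.inplace)  # → SELU True
```

This keeps the config node a callable rather than a class object, which is one way to sidestep override issues when a bare class is not writable in the config.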
Ah, this may be an unintentional side effect. If that is the case, I will happily change it to unblock this workflow.
@DirkKuhn v0.11.0rc2 has been released and contains the fix to this issue. Thanks for your patience.