
Commit

fix MLPBlock hidden_dim
sfluegel05 committed Feb 4, 2025
1 parent b7ca0e5 commit ad24fa7
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion in chebai/models/ffn.py

```diff
@@ -25,7 +25,7 @@ def __init__(
         current_layer_input_size = input_size
         for hidden_dim in hidden_layers:
             layers.append(MLPBlock(current_layer_input_size, hidden_dim))
-            layers.append(Residual(MLPBlock(current_layer_input_size, hidden_dim)))
+            layers.append(Residual(MLPBlock(hidden_dim, hidden_dim)))
             current_layer_input_size = hidden_dim

         layers.append(torch.nn.Linear(current_layer_input_size, self.out_dim))
```
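The one-line change matters because a residual connection adds the wrapped block's input to its output, so the wrapped block must map a tensor back to its own input dimension. After the first `MLPBlock(current_layer_input_size, hidden_dim)` the activations already have width `hidden_dim`, so wrapping another `MLPBlock(current_layer_input_size, hidden_dim)` in `Residual` breaks whenever `current_layer_input_size != hidden_dim`. The sketch below illustrates this with minimal stand-in definitions of `MLPBlock` and `Residual`; these are assumptions for illustration, not the actual code in chebai/models/ffn.py.

```python
import torch

class MLPBlock(torch.nn.Module):
    # Hypothetical stand-in: a linear layer plus nonlinearity.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.layer = torch.nn.Sequential(
            torch.nn.Linear(in_dim, out_dim), torch.nn.ReLU()
        )

    def forward(self, x):
        return self.layer(x)

class Residual(torch.nn.Module):
    # Hypothetical stand-in: adds the input to the block's output,
    # which only works if the block preserves the feature dimension.
    def __init__(self, block):
        super().__init__()
        self.block = block

    def forward(self, x):
        return x + self.block(x)

input_size, hidden_dim = 32, 64
x = torch.randn(8, input_size)

# First block maps input_size -> hidden_dim, as in the loop.
h = MLPBlock(input_size, hidden_dim)(x)          # shape (8, 64)

# Fixed wiring: the residual block maps hidden_dim -> hidden_dim.
out = Residual(MLPBlock(hidden_dim, hidden_dim))(h)
print(out.shape)  # torch.Size([8, 64])

# Pre-fix wiring would feed a hidden_dim-wide tensor into a block
# expecting input_size features, raising a shape-mismatch error:
try:
    Residual(MLPBlock(input_size, hidden_dim))(h)
except RuntimeError as e:
    print("pre-fix wiring fails:", type(e).__name__)
```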
