
Dropout #195

Closed
wants to merge 11 commits into from

Conversation

ricor07

@ricor07 ricor07 commented Jan 26, 2025

Fixed a conceptual error in nf_dropout_layer_submodule.f90. To get sum(input) == sum(output), the definition of scale should be sum(input) / sum(input * self % mask).

Proof:

We want sum(input) == sum(output), where output == input * mask * scale:

sum(input) == sum(input * mask * scale)

Since scale is a constant, it can be factored out of the sum:

sum(input) == scale * sum(input * mask)

Substituting scale == sum(input) / sum(input * self % mask):

sum(input) == sum(input) * sum(input * mask) / sum(input * mask) == sum(input)
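
The identity can also be checked numerically. Below is a minimal Python sketch (not the PR's Fortran code; the input values and mask are made up for illustration) showing that defining scale as sum(input) / sum(input * mask) makes the masked, scaled output sum to the same value as the input:

```python
# Illustrative numerical check (hypothetical values, not the Fortran code):
# with scale = sum(input) / sum(input * mask), sum(output) == sum(input).
inputs = [0.2, 1.5, 0.7, 0.9, 0.1, 2.3]
mask = [1.0, 0.0, 1.0, 1.0, 0.0, 1.0]  # dropout mask: 0 drops an element

masked_sum = sum(x * m for x, m in zip(inputs, mask))
scale = sum(inputs) / masked_sum  # the proposed definition of scale
output = [x * m * scale for x, m in zip(inputs, mask)]

print(abs(sum(output) - sum(inputs)) < 1e-12)  # prints True
```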

@ricor07 ricor07 mentioned this pull request Jan 26, 2025
@milancurcic
Member

Thanks @ricor07. Can you make a PR against #194 rather than main?

@ricor07
Author

ricor07 commented Jan 27, 2025

Sorry, I'm not very familiar with GitHub. Could you tell me how?

@milancurcic
Member

No problem and I appreciate your patience.

Here's how to submit a new PR against this one.

  1. Fork my fork of this repository (https://github.com/milancurcic/neural-fortran).
  2. Clone your newly forked repo: `git clone git@github.com:ricor07/neural-fortran` (since you already have a fork of modern-fortran/neural-fortran, I'm not sure how GitHub will handle the name conflict; it may be that it will be called something else).
  3. After cloning it, do:

cd neural-fortran
git remote add upstream https://github.com/milancurcic/neural-fortran
git checkout dropout
# make your changes, add, and commit
git push origin dropout  # this will push the changes to your fork

  4. Navigate to https://github.com/milancurcic/neural-fortran and create a new PR; choose dropout as the target branch and your own fork's dropout branch as the source branch.

Let me know how it goes.

@ricor07 ricor07 closed this by deleting the head repository Jan 28, 2025
@ricor07
Author

ricor07 commented Jan 28, 2025

I cloned your repo, but it doesn't have the dropout layer file. Since it is only a one-line change, could you make the change yourself? Thank you.

In the coming days, I'd like to work on #171. Is that possible? Thanks.

@milancurcic
Member

milancurcic commented Jan 28, 2025 via email
