Dropout layer #194
base: main
Conversation
Hello Milan. About one out of 25 times, test_dropout_layer fails. If you can't fix it today, I'll work on it tomorrow. Have a good afternoon. Edit: I adjusted it in #195. There is still a case in which all of the units turn off; with independent drops this occurs with probability dropout_rate ** n_units.
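For a sense of scale, a back-of-the-envelope sketch in plain Python (illustrative values only, not the project's Fortran code or the test's actual settings):

```python
# Back-of-the-envelope sketch: with independent Bernoulli drops, the chance
# that every unit is zeroed in a single forward pass is dropout_rate ** n_units.
dropout_rate = 0.5   # illustrative values, not the test's actual settings
n_units = 10
p_all_dropped = dropout_rate ** n_units
print(f"P(all {n_units} units dropped) = {p_all_dropped:.2e}")  # ~9.77e-04
```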
You're right, currently the effective dropout rate is not exactly the specified rate. Regarding the test failing about 1 in 25 times: notice that each test run makes 10000 dropout runs, so the effective failure rate is on the order of 10^-5. I think it's a precision issue (i.e. the tolerance is too small in the test) rather than the scaling of the outputs. However, you may still be correct that there's an issue with scaling the outputs, as you suggested in #195; I just don't think it's related to the test failure.
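As a rough illustration of the tolerance question, a Python sketch under the assumption of independent Bernoulli drops (the array sizes and seed are made up, not taken from the actual test):

```python
import numpy as np

# Illustrative sketch (not the actual test_dropout_layer code): with Bernoulli
# drops the realized drop fraction fluctuates around dropout_rate, so a mean
# over many runs compared against a tight tolerance can still fail occasionally.
rng = np.random.default_rng(42)
dropout_rate, n_units, n_runs = 0.5, 100, 10_000

# Per-run fraction of dropped units; its std is ~ sqrt(p * (1 - p) / n_units).
realized = (rng.random((n_runs, n_units)) < dropout_rate).mean(axis=1)
print("mean realized drop fraction:", realized.mean())
print("std of per-run drop fraction:", realized.std())
```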
I think we should handle the calculation of output values and even redefine the dropout if needed. If you want to have a meeting to discuss this, I'm here; just answer me within 3 hours, please.
Hi, I just looked at my code (I hadn't looked earlier when I wrote here), and I see the very serious and obvious bug in calculating the scale, which your approach fixes. No proof needed :). I'll write in the other PR about the process of opening a PR against this one rather than a new one.
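For reference, a minimal inverted-dropout sketch in Python that scales by the realized keep fraction rather than the nominal rate (an assumption about the kind of scale fix being discussed, not the repository's Fortran implementation):

```python
import numpy as np

def dropout_forward(x, dropout_rate, rng):
    """Minimal inverted-dropout sketch (illustrative, not the library's API).

    Zeroes each unit with probability `dropout_rate` and rescales the kept
    units by the inverse of the realized keep fraction.
    """
    mask = rng.random(x.shape) >= dropout_rate   # True where a unit is kept
    n_kept = mask.sum()
    if n_kept == 0:                              # degenerate all-dropped case
        return np.zeros_like(x)
    # Scaling by the realized kept fraction compensates exactly for this draw;
    # scaling by 1 / (1 - dropout_rate) compensates only in expectation.
    scale = x.size / n_kept
    return x * mask * scale

rng = np.random.default_rng(0)
print(dropout_forward(np.ones(10), 0.5, rng))
```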
Closes #170.