[onert-micro] Introduce Optimizers #13211
Conversation
This PR introduces optimizer entities: SGD and Adam (for issue #12873, split out from draft #13107).
ONE-DCO-1.0-Signed-off-by: Artem Balyshev <[email protected]>
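For reference, the update rules these two optimizers implement are sketched below. This is a minimal illustration, not the onert-micro API; all names here (sgd_step, adam_step, lr, beta1, beta2, epsilon, step) are assumed for the sketch.

#include <cmath>
#include <cstdint>

// Sketch of plain SGD: w <- w - lr * g
void sgd_step(float *weights, const float *grads, uint32_t size, float lr)
{
  for (uint32_t i = 0; i < size; ++i)
    weights[i] -= lr * grads[i];
}

// Sketch of Adam: running first/second moments with bias correction,
// then w <- w - lr * m_hat / (sqrt(v_hat) + epsilon)
void adam_step(float *weights, const float *grads, float *m, float *v, uint32_t size,
               float lr, float beta1, float beta2, float epsilon, uint32_t step)
{
  const float beta1_pow = std::pow(beta1, static_cast<float>(step));
  const float beta2_pow = std::pow(beta2, static_cast<float>(step));
  for (uint32_t i = 0; i < size; ++i)
  {
    m[i] = beta1 * m[i] + (1.f - beta1) * grads[i];
    v[i] = beta2 * v[i] + (1.f - beta2) * grads[i] * grads[i];
    const float m_hat = m[i] / (1.f - beta1_pow);
    const float v_hat = v[i] / (1.f - beta2_pow);
    weights[i] -= lr * m_hat / (std::sqrt(v_hat) + epsilon);
  }
}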
LGTM
float epsilon = training_config.epsilon;
for (uint32_t i = 0; i < flat_size; ++i)
{
  float exponent_corrected = exponent_data[i] / (1.f - beta_in_pow_batch);
Should we check for division by zero here?
Added checks
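For illustration, such a guard could look like the sketch below (the exact check added in the PR may differ; bias_correct is a hypothetical helper):

#include <cassert>
#include <cmath>
#include <limits>

// Hypothetical helper: guard the bias-correction divisor (1 - beta^t), which
// only collapses to zero when beta == 1 or t == 0.
inline float bias_correct(float value, float beta_in_pow_batch)
{
  const float denom = 1.f - beta_in_pow_batch;
  assert(std::fabs(denom) > std::numeric_limits<float>::epsilon());
  return value / denom;
}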
auto batches = static_cast<float>(training_config.batch_size);
for (uint32_t i = 0; i < flat_size; ++i)
{
  const auto cur_val = calculated_data[i] / batches;
Suggested change:
- const auto cur_val = calculated_data[i] / batches;
+ const auto cur_val = calculated_data[i];
I'm not sure, but is this one correct?
Done, thank you, you are right
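After the suggestion is applied, the loop reads roughly as below. This is a sketch only; it assumes the gradient buffer is already scaled as needed, and the names weights, grads, flat_size, lr are illustrative rather than taken from the PR.

#include <cstdint>

// Sketch of the update with the accepted suggestion: the value from the
// gradient buffer is used directly, without an extra division by batch_size.
void apply_gradients(float *weights, const float *grads, uint32_t flat_size, float lr)
{
  for (uint32_t i = 0; i < flat_size; ++i)
  {
    const auto cur_val = grads[i]; // previously: grads[i] / batches
    weights[i] -= lr * cur_val;
  }
}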