
[onert] Introduce backpropActivation to OperationUtils #12493

Closed
wants to merge 1 commit

Conversation

zetwhite
Contributor

@zetwhite zetwhite commented Jan 17, 2024

This PR introduces a backpropActivation function to OperationUtils.
The function calls the proper cker kernel according to ir::Activation.

ONE-DCO-1.0-Signed-off-by: SeungHui Youn [email protected]

issue : #12388
draft : #12395

Comment on lines +59 to +62
const IPortableTensor *backpropActivation(const ir::Activation &activation,
const IPortableTensor *output,
const IPortableTensor *input_backprop,
IPortableTensor *output_backprop = nullptr);
Contributor Author


I'd like to call this function in each training op layer's backward().

Usage example: https://github.com/Samsung/ONE/pull/12492/files#r1454986967

@zetwhite zetwhite added the PR/ready for review It is ready to review. Please review it. label Jan 17, 2024
@zetwhite zetwhite closed this Jan 17, 2024