[ GPU ] split kernel registration from forwarding function in rmsnorm_layer_cl #2804
Conversation
📝 TAOS-CI Version: 1.5.20200925. Thank you for submitting PR #2804. Please follow the 1 commit / 1 PR (one commit per PR) policy to get comments quickly from reviewers. Your PR must pass all verification processes of cibot before the reviewers start their review. If you are a new member joining this project, please read the manuals in the documentation folder and the wiki page. To monitor the progress of your PR in more detail, visit http://ci.nnstreamer.ai/.
@EunjuYang, 💯 All CI checkers are successfully verified. Thanks.
Force-pushed from 0be5f29 to 82007d8
@EunjuYang, 💯 All CI checkers are successfully verified. Thanks.
Force-pushed from 82007d8 to ba561d3
@EunjuYang, 💯 All CI checkers are successfully verified. Thanks.
- This commit updates reshape_cl.cpp/.h to inherit LayerImplCl.
- This commit implements registerClKernels(), which is called in cl_context.cpp.

Self evaluation:
Build test: [X] Passed [ ] Failed [ ] Skipped
Run test: [X] Passed [ ] Failed [ ] Skipped

Signed-off-by: Eunju Yang <[email protected]>
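For readers unfamiliar with the pattern, here is a minimal, self-contained sketch of the idea behind registerClKernels(): kernel registration happens once when the CL context is set up instead of inside forwarding(). All names below (KernelRegistry, ReshapeLayerSketch, registerKernels) are illustrative and do not match nntrainer's actual LayerImplCl API.

```cpp
#include <iostream>
#include <stdexcept>
#include <string>
#include <unordered_map>

// Hypothetical stand-in for the CL context's kernel registry.
struct KernelRegistry {
  std::unordered_map<std::string, std::string> kernels; // name -> kernel source
  void add(const std::string &name, const std::string &src) { kernels[name] = src; }
  bool has(const std::string &name) const { return kernels.count(name) != 0; }
};

// Hypothetical layer: registration is a static, one-time step, separate from forwarding.
struct ReshapeLayerSketch {
  // Analogous to registerClKernels(): called once at context setup
  // (e.g. from cl_context.cpp), not lazily inside forwarding().
  static bool registerKernels(KernelRegistry &registry) {
    registry.add("reshape_cl", "/* OpenCL kernel source */");
    return true;
  }

  // Forwarding assumes the kernels were registered earlier and only launches them.
  void forwarding(const KernelRegistry &registry) const {
    if (!registry.has("reshape_cl"))
      throw std::runtime_error("reshape kernel was not registered");
    std::cout << "launching reshape_cl kernel\n";
  }
};

int main() {
  KernelRegistry registry;
  ReshapeLayerSketch::registerKernels(registry); // once, at context creation
  ReshapeLayerSketch{}.forwarding(registry);     // per inference, no registration
  return 0;
}
```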
Force-pushed from ba561d3 to e03174f
rmsnorm_layer_cl
LGTM
rmsnormProcess_fp16(in, out, gamma, epsilon);
#endif
It might be good to throw an error when fp16 is not enabled!
Thank you for the comment. I updated it :)
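For illustration, a minimal sketch of the kind of guard being discussed, assuming the same ENABLE_FP16 macro and rmsnormProcess_fp16() call shown in the snippet above; the wrapper function and error message are hypothetical and not the actual code in this PR.

```cpp
#include <stdexcept>

// Sketch only: when the build does not enable fp16, fail loudly instead of
// silently skipping the fp16 forwarding path.
void forwardFp16Sketch(/* Tensor &in, Tensor &out, Tensor &gamma, float epsilon */) {
#ifdef ENABLE_FP16
  rmsnormProcess_fp16(in, out, gamma, epsilon); // fp16 path from the snippet above
#else
  throw std::runtime_error(
    "fp16 forwarding requested, but this build was compiled without fp16 support");
#endif
}
```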
- This commit updates rmsnorm_layer_cl.cpp/.h to inherit LayerImplCl.
- This commit implements registerClKernels() of the rmsnorm layer.
- This commit updates cl_context.cpp (applying rmsnorm_layer_cl's update).
- This commit updates common_properties.h (adding a property for the rmsnorm layer).

Self evaluation:
Build test: [X] Passed [ ] Failed [ ] Skipped
Run test: [X] Passed [ ] Failed [ ] Skipped

Signed-off-by: Eunju Yang <[email protected]>
Force-pushed from e03174f to 8702596
@EunjuYang, 💯 All CI checkers are successfully verified. Thanks.
nntrainer/layers/common_properties.h (Outdated)
/**
 * @brief RMS_NORM_GAMMA_INIT Initialization Enumeration Information
 */
class RMS_NORM_GAMMA_INIT final : public EnumProperty<InitializerInfo> {
It is quite strange to use capital letters here.
Oh, I only checked BNPARAMS_*_INIT.
As you suggested, it seems more natural to update it to follow our naming convention :) I will apply it.
nntrainer/layers/common_properties.h (Outdated)
 */
RMS_NORM_GAMMA_INIT(Initializer value = Initializer::ONES) { set(value); };
using prop_tag = enum_class_prop_tag;
static constexpr const char *key = "gamma_initializer";
It seems that the key value ("gamma_initializer") duplicates the one used by BNPARAMS_GAMMA_INIT.
How about renaming the BNPARAMS_GAMMA_INIT class to GammaInitializer and sharing it between the two layers?
Thank you! That's my bad. It would have been a critical issue if it hadn't been fixed. I will update it.
- This commit updates some BN layer properties to follow the naming convention:
  - BNPARAMS_GAMMA_INIT -> GammaInitializer
  - BNPARAMS_MU_INIT -> MuInitializer
  - BNPARAMS_VAR_INIT -> VarInitializer
  - BNPARAMS_BETA_INIT -> BetaInitializer
- This commit replaces uses of RMS_NORM_GAMMA_INIT with GammaInitializer.

**Self evaluation:**
Build test: [X] Passed [ ] Failed [ ] Skipped
Run test: [X] Passed [ ] Failed [ ] Skipped

Signed-off-by: Eunju Yang <[email protected]>
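A minimal sketch of the shared property described in the commit above: a single GammaInitializer class owns the "gamma_initializer" key and is reused by both the BN and RMSNorm layers. EnumProperty and InitializerInfo are simplified here to keep the example self-contained, so the types are illustrative rather than nntrainer's actual definitions.

```cpp
#include <iostream>

// Simplified stand-ins for nntrainer's property machinery (illustrative only).
enum class Initializer { ZEROS, ONES };

template <typename EnumT> struct EnumPropertySketch {
  EnumT value;
  void set(EnumT v) { value = v; }
  EnumT get() const { return value; }
};

// One class owns the "gamma_initializer" key; both the batch-norm and RMSNorm
// layers reuse it, so the key string is defined in exactly one place.
struct GammaInitializer final : public EnumPropertySketch<Initializer> {
  GammaInitializer(Initializer v = Initializer::ONES) { set(v); }
  static constexpr const char *key = "gamma_initializer";
};

int main() {
  GammaInitializer bn_gamma;                     // used by the BN layer
  GammaInitializer rms_gamma{Initializer::ONES}; // reused by the RMSNorm layer
  std::cout << GammaInitializer::key << "\n";    // single source of truth for the key
  return 0;
}
```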
Reflecting the comments from @jijoongmoon and @baek2sm, I added a new commit 78907d7 updating the property naming convention.
@EunjuYang, 💯 All CI checkers are successfully verified. Thanks.
Nice work. LGTM!
LGTM
2024-12-02: A new commit was added:
78907d7 (naming convention)

Self evaluation:
Build test: [X] Passed [ ] Failed [ ] Skipped
Run test: [X] Passed [ ] Failed [ ] Skipped