[DRAFT][tflchef] Support multiple subgraphs #12516

Closed
wants to merge 2 commits

Conversation

BalyshevArtem (Contributor)

This draft adds support for multiple subgraphs and the While operation.

For issue #12506.

ONE-DCO-1.0-Signed-off-by: Artem Balyshev [email protected]

@BalyshevArtem added the DRAFT label (a draft issue or PR for sharing one's current working status and discussion) on Jan 23, 2024.
Comment on lines +1 to +15
/*
* Copyright (c) 2024 Samsung Electronics Co., Ltd. All Rights Reserved
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
Contributor Author

Add the While operation for creating Net_DecomposedGru_000; see #12319 (comment).

Comment on lines +28 to +33
const auto &inputs = *op->inputs();

for (int input : inputs)
{
  fill_tensor_to_import(input, import);
}
Contributor Author

Added because Less can have a Const input, as in the example in #12319 (comment).

This draft adds support for multiple subgraphs and the While operation.

ONE-DCO-1.0-Signed-off-by: Artem Balyshev <[email protected]>
operand->set_name(mio::tflite::tensor_name(tensor));
operand->set_type(as_tflchef_type(tensor->type()));
operand->set_is_variable(tensor->is_variable());
tflchef::Graph *graph;
Contributor

graph will be uninitialized when n == 0

Contributor Author

Done
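
For context, a minimal, self-contained sketch of the pitfall pointed out above (illustrative code, not the actual tflchef change): a pointer that is assigned only inside a loop remains uninitialized when the loop runs zero times, so it should be initialized explicitly before use.

#include <vector>

// Illustrative only: `last` is assigned inside the loop, so it must be
// initialized up front or it is indeterminate when `values` is empty.
int last_or_default(const std::vector<int> &values)
{
  const int *last = nullptr;
  for (const int &v : values)
    last = &v;
  return last != nullptr ? *last : -1;
}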

if (shape_signature != nullptr)
{
  tflchef::ShapeSignature *chef_shape_signature = operand->mutable_shape_signature();
  for (uint32_t j = 0; j < shape_signature->size(); ++j)
Contributor

Q) Why is this changed to j?

Contributor Author

Because i is already used in the above for loop:

for (uint32_t i = 0; i < tensors->size(); ++i)

Contributor

Because i is already used in the above for loop:

Then why was the loop OK before, but it is nested now?

Contributor

It's a bit hard to read the changes even with Hide whitespace

Contributor Author

Because i is already used in the above for loop:

Then why was the loop OK before, but it is nested now?

Before the changes this loop was nested too, so it was something like a shadowed local variable problem.

Contributor

Ah, got it, thanks! Can you fire a separate PR for this change?
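
For context, a minimal, self-contained sketch of the shadowing issue discussed above (illustrative code, not the actual tflchef sources): reusing i for the inner index of a nested loop would shadow the outer loop variable, which is why the inner index is named j.

#include <cstdint>
#include <cstdio>

int main()
{
  for (uint32_t i = 0; i < 3; ++i)
  {
    // Declaring another `uint32_t i` here would shadow the outer `i`,
    // so the inner index uses a different name.
    for (uint32_t j = 0; j < 2; ++j)
    {
      std::printf("i=%u j=%u\n", i, j);
    }
  }
  return 0;
}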

@BalyshevArtem (Contributor Author)

@seanshpark, can I split this draft into PRs? Or does it need more changes?

@seanshpark (Contributor)

Can I split this draft into PRs? Or does it need more changes?

Let's go :)

@BalyshevArtem (Contributor Author)

All merged, closing this.
