
Data-3460: Adding GetInferenceResponse #608

Merged: 16 commits into main on Dec 20, 2024
Conversation

etai-shuchatowitz
Member

Context: https://viam.atlassian.net/browse/DATA-3460?atlOrigin=eyJpIjoiMmVjNmJkOWU3MTlkNDA3ODgwODdiMWE4OTYxYWYwNjkiLCJwIjoiaiJ9

Added FlatTensors, Annotations and an Extra struct (so that we can add any potential metadata we want) to the GetInferenceResponse.

@etai-shuchatowitz added the ready-for-protos label (compile protos on every commit) — Dec 19, 2024
@github-actions bot added the safe-to-test label (committer is a member of this org) — Dec 19, 2024
@etai-shuchatowitz re-added the ready-for-protos label and removed the protos-compiled label — Dec 19, 2024
@tahiyasalam (Member) left a comment


Just one Q for you, otherwise looks good!

    viam.service.mlmodel.v1.FlatTensors output_tensors = 1;
    viam.app.data.v1.Annotations annotations = 2;
    // anything else you want to say
    google.protobuf.Struct extra = 99;
Member:

I think there's pretty low risk of needing to deprecate this field, but I would hate to introduce a backwards-incompatible breaking change over something as minor as an extra field. Is this intended to be used for something immediately? If not, I would err on just adding it as needed. WDYT?
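For context, the backwards-compatibility concern above is the standard proto3 one: once a field number ships, removing the field later requires reserving its number (and ideally its name) so neither can ever be reused with a different type. A minimal sketch, assuming the `extra` field had shipped and was later dropped (field names and numbers taken from the snippet above):

```proto
message GetInferenceResponse {
  viam.service.mlmodel.v1.FlatTensors output_tensors = 1;
  viam.app.data.v1.Annotations annotations = 2;

  // Field 99 once held `google.protobuf.Struct extra`. Reserving the
  // number and name prevents a future field from reusing them with a
  // different type, which would break old clients still on the wire
  // format that had `extra`.
  reserved 99;
  reserved "extra";
}
```

Erring on the side of not adding the field, as suggested, avoids ever needing this dance.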

Member Author:

OK! That makes sense to me. I just thought it might be helpful because I find myself wanting to pass more info around in the MLModel in Delphi. But we can add it if we need it.

Member:

If we wanted to pass more info around in the MLModel in Delphi, wouldn't we then be working with the ML model proto? https://github.com/viamrobotics/api/blob/main/proto/viam/service/mlmodel/v1/mlmodel.proto

Member Author:

Yeah, fixed it up! Thanks for this!

@@ -20,4 +22,9 @@ message GetInferenceRequest {
   string organization_id = 4;
 }

-message GetInferenceResponse {}
+message GetInferenceResponse {
+  viam.service.mlmodel.v1.FlatTensors output_tensors = 1;
Member:

Wooh, awesome. Thanks for maintaining parity with the ML model proto defs :)
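Putting the thread together: the merged response keeps the two fields that mirror the ML model service's proto definitions and drops the `extra` struct per the review. A hedged reconstruction of the final message (the authoritative version is in the merged commit f836d9d; the layout here is inferred from the snippets in this thread, and the comments are illustrative):

```proto
message GetInferenceResponse {
  // Inference outputs, kept in parity with the ML model service's
  // FlatTensors type (viam/service/mlmodel/v1/mlmodel.proto).
  viam.service.mlmodel.v1.FlatTensors output_tensors = 1;
  // Annotations derived from the outputs (e.g. bounding boxes or
  // classifications), reusing the app data proto's Annotations type.
  viam.app.data.v1.Annotations annotations = 2;
}
```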

@etai-shuchatowitz removed the ready-for-protos label — Dec 20, 2024
@etai-shuchatowitz etai-shuchatowitz merged commit f836d9d into main Dec 20, 2024
3 checks passed
@etai-shuchatowitz etai-shuchatowitz deleted the DATA-3460/flat-tensor branch December 20, 2024 15:50