Data-3460: Adding GetInferenceResponse #608
Conversation
Force-pushed from 6769bfb to 1879125
Just one Q for you, otherwise looks good!
viam.service.mlmodel.v1.FlatTensors output_tensors = 1;
viam.app.data.v1.Annotations annotations = 2;
// anything else you want to say
google.protobuf.Struct extra = 99;
I think there's pretty low risk of needing to deprecate this field, but I would hate to introduce a backwards-incompatible breaking change over something as minor as an extra field. Is this intended to be used for something immediately? If not, I would err on just adding it as needed. WDYT?
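For context on the backwards-compatibility concern: the usual way to retire a shipped proto field without a wire-level break is to reserve its number and name rather than delete or reuse them. A hypothetical sketch (not part of this PR) of what deprecating `extra` would look like later:

```proto
// Hypothetical future deprecation of the proposed `extra` field.
message GetInferenceResponse {
  viam.service.mlmodel.v1.FlatTensors output_tensors = 1;
  viam.app.data.v1.Annotations annotations = 2;

  // Field 99 once held `google.protobuf.Struct extra`; reserving the
  // number and the name prevents an incompatible reuse by a later field.
  reserved 99;
  reserved "extra";
}
```

So removal is survivable, but it still forces a reservation forever, which is why erring toward adding fields only when needed is cheap insurance.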
OK! That makes sense to me. I just thought it might be helpful because I find myself wanting to pass more info around in the MLModel in Delphi. But we can add it if we need to.
If we wanted to pass more info around in the MLModel in Delphi, wouldn't we then be working with the ML model proto? https://github.com/viamrobotics/api/blob/main/proto/viam/service/mlmodel/v1/mlmodel.proto
Yeah, fixed it up! Thanks for this!
@@ -20,4 +22,9 @@ message GetInferenceRequest {
   string organization_id = 4;
 }

-message GetInferenceResponse {}
+message GetInferenceResponse {
+  viam.service.mlmodel.v1.FlatTensors output_tensors = 1;
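Piecing together the truncated diff with the earlier snippet and the PR description, and assuming the `extra` field was dropped per the discussion above, the response message presumably ends up as (a sketch, not the verbatim final proto):

```proto
message GetInferenceResponse {
  viam.service.mlmodel.v1.FlatTensors output_tensors = 1;
  viam.app.data.v1.Annotations annotations = 2;
}
```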
Wooh, awesome. Thanks for maintaining parity with the ML model proto defs :)
Force-pushed from 8910d43 to 4fc5c05
Context: https://viam.atlassian.net/browse/DATA-3460?atlOrigin=eyJpIjoiMmVjNmJkOWU3MTlkNDA3ODgwODdiMWE4OTYxYWYwNjkiLCJwIjoiaiJ9
Added FlatTensors, Annotations, and an Extra struct (so that we can attach any metadata we might want) to GetInferenceResponse.
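Since the proposed Extra field is a google.protobuf.Struct, here is a minimal sketch of how a client could pack schema-free metadata into one. This assumes the `protobuf` Python package is installed; the metadata keys are made up for illustration and are not part of this PR:

```python
from google.protobuf.struct_pb2 import Struct

# Build a Struct carrying arbitrary, schema-free metadata
# (hypothetical keys, purely illustrative).
extra = Struct()
extra.update({
    "source": "delphi",
    "confidence_threshold": 0.5,
})

# Struct values round-trip like a JSON object.
assert extra["source"] == "delphi"
assert extra["confidence_threshold"] == 0.5
```

This is exactly the flexibility the review weighed against the cost of carrying an extra field in the schema: the Struct accepts new keys without any proto change.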