[FR] Compatibility with MLflow 2.0 #96
Comments
@shrinath-suresh @chauhang Do you have bandwidth on your end to migrate the …
@dbczumar I am testing the changes with MLflow 2.0. From MLflow 2.0, is it mandatory to return the predict response of type …? Until MLflow 1.30.0, we are returning the result as a string (code link) and the mlflow CLI converts it to JSON (code link). In most of the ML examples (bert, mnist, iris classification), the output of the model will be either JSON or text.
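For illustration only (not the plugin's actual code), a minimal sketch of how a raw TorchServe string/JSON result could be wrapped into the `{"predictions": ...}` envelope that the MLflow 2.0 scoring protocol expects; the helper name and sample payload are hypothetical:

```python
import json

def wrap_for_mlflow_2(raw_result: str):
    """Hypothetical helper: wrap a TorchServe string result in the MLflow 2.0
    response envelope, which nests results under a top-level "predictions" key."""
    try:
        parsed = json.loads(raw_result)  # many handlers return JSON text
    except json.JSONDecodeError:
        parsed = raw_result              # plain-text outputs are kept as-is
    return {"predictions": parsed}

print(wrap_for_mlflow_2('{"label": "positive", "score": 0.98}'))
# {'predictions': {'label': 'positive', 'score': 0.98}}
```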
@shrinath-suresh So sorry for the delay here. The deployment client …
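For context, a rough sketch (assuming a deployment named `my-deployment` and a toy input frame, both placeholders) of how callers interact with the plugin through the MLflow 2.0 deployment client, where `predict()` now returns a `PredictionsResponse` object rather than the raw string/DataFrame result used on 1.x:

```python
import pandas as pd
from mlflow.deployments import get_deploy_client

# "torchserve" is the plugin's deployment target; the deployment name and
# input frame below are placeholders for illustration only.
client = get_deploy_client("torchserve")
inputs = pd.DataFrame({"text": ["mlflow torchserve"]})

response = client.predict("my-deployment", inputs)
print(response.get_predictions())  # predictions as a pandas DataFrame
print(response.to_json())          # JSON string with a "predictions" key
```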
@shrinath-suresh Approved! Happy New Year!
Has this been merged/solved, and are we good to go with the latest version?
Is this plugin still maintained?
Thank you for submitting a feature request. Before proceeding, please review MLflow's Issue Policy for feature requests and the MLflow Contributing Guide.
Please fill in this feature request template to ensure a timely and thorough response.
Willingness to contribute
The MLflow Community encourages new feature contributions. Would you or another member of your organization be willing to contribute an implementation of this feature (as an enhancement to the MLflow TorchServe Deployment plugin code base)?
Proposal Summary
In MLflow 2.0 (scheduled for release on Nov. 14), we will be making small modifications to the MLflow Model Server's RESTful scoring protocol (documented here: https://output.circle-artifacts.com/output/job/bb07270e-1101-421c-901c-01e72bc7b6df/artifacts/0/docs/build/html/models.html#deploy-mlflow-models) and the MLflow Deployment Client `predict()` API (documented here: https://output.circle-artifacts.com/output/job/bb07270e-1101-421c-901c-01e72bc7b6df/artifacts/0/docs/build/html/python_api/mlflow.deployments.html#mlflow.deployments.BaseDeploymentClient.predict). For compatibility with MLflow 2.0, the `mlflow-torchserve` plugin will need to be updated to conform to the new scoring protocol and Deployment Client interface. The MLflow maintainers are happy to assist with this process, and we apologize for the short notice.
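To make the protocol change concrete, here is a minimal sketch (the endpoint URL and example record are placeholders, not the plugin's actual code) contrasting the 1.x and 2.0 request/response shapes for the model server's `/invocations` route:

```python
import requests

url = "http://localhost:5000/invocations"  # placeholder scoring endpoint

# MLflow 1.x style: the pandas-oriented payload is sent directly and the
# content type carries the orientation.
legacy = requests.post(
    url,
    json=[{"text": "mlflow torchserve"}],
    headers={"Content-Type": "application/json; format=pandas-records"},
)
# 1.x servers respond with a bare JSON list of predictions.

# MLflow 2.0 style: the payload is wrapped under an explicit top-level key
# ("dataframe_records", "dataframe_split", "instances", or "inputs") ...
new = requests.post(
    url,
    json={"dataframe_records": [{"text": "mlflow torchserve"}]},
    headers={"Content-Type": "application/json"},
)
# ... and the response is wrapped as {"predictions": ...}.
predictions = new.json()["predictions"]
```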
Motivation
The `mlflow-torchserve` plugin will break in MLflow 2.0.
What component(s) does this feature affect?
Components
`area/deploy`: Main deployment plugin logic
`area/build`: Build and test infrastructure for MLflow TorchServe Deployment Plugin
`area/docs`: MLflow TorchServe Deployment Plugin documentation pages
`area/examples`: Example code