
[FR] Compatibility with MLflow 2.0 #96

Open · 2 of 7 tasks
dbczumar opened this issue Nov 2, 2022 · 8 comments

Comments


dbczumar commented Nov 2, 2022

Thank you for submitting a feature request. Before proceeding, please review MLflow's Issue Policy for feature requests and the MLflow Contributing Guide.

Please fill in this feature request template to ensure a timely and thorough response.

Willingness to contribute

The MLflow Community encourages new feature contributions. Would you or another member of your organization be willing to contribute an implementation of this feature (as an enhancement to the MLflow TorchServe Deployment plugin code base)?

  • Yes. I can contribute this feature independently.
  • Yes. I would be willing to contribute this feature with guidance from the MLflow community.
  • No. I cannot contribute this feature at this time.

Proposal Summary

In MLflow 2.0 (scheduled for release on Nov. 14), we will be making small modifications to the MLflow Model Server's RESTful scoring protocol (documented here: https://output.circle-artifacts.com/output/job/bb07270e-1101-421c-901c-01e72bc7b6df/artifacts/0/docs/build/html/models.html#deploy-mlflow-models) and the MLflow Deployment Client predict() API (documented here: https://output.circle-artifacts.com/output/job/bb07270e-1101-421c-901c-01e72bc7b6df/artifacts/0/docs/build/html/python_api/mlflow.deployments.html#mlflow.deployments.BaseDeploymentClient.predict).

For compatibility with MLflow 2.0, the mlflow-torchserve plugin will need to be updated to conform to the new scoring protocol and Deployment Client interface. The MLflow maintainers are happy to assist with this process, and we apologize for the short notice.
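For reference, here is a minimal sketch of a request against the updated scoring protocol, assuming a model server running locally on port 5000 (the endpoint URL and feature names are illustrative, not taken from the plugin):

```python
import pandas as pd
import requests

# MLflow 2.0 requires /invocations payloads to be wrapped in one of the
# documented keys: dataframe_split, dataframe_records, instances, or inputs.
df = pd.DataFrame({"sepal_length": [5.1], "sepal_width": [3.5]})
resp = requests.post(
    "http://localhost:5000/invocations",  # illustrative local model server
    json={"dataframe_split": df.to_dict(orient="split")},
    headers={"Content-Type": "application/json"},
)
print(resp.json())  # responses are likewise wrapped, e.g. {"predictions": [...]}
```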

Motivation

  • What is the use case for this feature? Provide a richer, more extensible scoring protocol and broaden the deployment client prediction interface beyond dataframe inputs (see the sketch after this list).
  • Why is this use case valuable to support for MLflow TorchServe Deployment plugin users in general? Necessary for compatibility with MLflow 2.0.
  • Why is this use case valuable to support for your project(s) or organization? Same as above.
  • Why is it currently difficult to achieve this use case? Without these changes, the mlflow-torchserve plugin will break in MLflow 2.0.
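To make the first bullet above concrete, here is a minimal sketch of a deployment-client call under the broadened MLflow 2.0 interface, assuming a torchserve target and a deployment named "mnist" (both illustrative):

```python
from mlflow.deployments import get_deploy_client

client = get_deploy_client("torchserve")  # illustrative target URI

# In MLflow 2.0, predict() accepts inputs beyond pandas DataFrames
# (e.g. dicts or numpy arrays) and returns a PredictionsResponse.
response = client.predict(deployment_name="mnist", inputs={"data": [[0.0] * 784]})
print(response.get_predictions())
```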

What component(s) does this feature affect?

Components

  • area/deploy: Main deployment plugin logic
  • area/build: Build and test infrastructure for MLflow TorchServe Deployment Plugin
  • area/docs: MLflow TorchServe Deployment Plugin documentation pages
  • area/examples: Example code

dbczumar commented Nov 2, 2022

@shrinath-suresh @chauhang Do you have bandwidth on your end to migrate the mlflow-torchserve plugin to the updated scoring protocol and adjust the Deployment Client predict() API? By my estimates, it should only take a few hours of work at most. Apologies for the short notice.

dbczumar changed the title from "[FR] Compatibility with MLfflow 2.0" to "[FR] Compatibility with MLflow 2.0" on Nov 2, 2022
shrinath-suresh (Collaborator) commented

@dbczumar I am testing the changes with MLflow 2.0.

As of MLflow 2.0, is it mandatory for the predict response to be of type pd.DataFrame or pd.Series? We are getting an error while converting text/JSON because of the result.to_json() conversion done here.

Up to MLflow 1.30.0, we returned the result as a string (code link) and the mlflow CLI converted it to JSON (code link).

In most of the ML examples (bert, mnist, iris classification), the model output is either JSON or text.
If the changes are intentional, we need to update the mlflow_torchserve plugin's predict method to convert the user response to a DataFrame and return it to the mlflow CLI (sketched below).
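A minimal sketch of the kind of coercion the plugin's predict method would need, assuming the raw TorchServe output is plain text or JSON (the helper name is hypothetical):

```python
import json

import pandas as pd


def _coerce_to_pandas(raw_output):
    # Hypothetical helper: wrap TorchServe's text/JSON output in a pandas
    # object so MLflow's result.to_json() conversion does not fail.
    try:
        parsed = json.loads(raw_output)
    except (TypeError, ValueError):
        parsed = raw_output
    if isinstance(parsed, dict):
        return pd.DataFrame([parsed])
    if isinstance(parsed, list):
        return pd.Series(parsed)
    return pd.Series([parsed])
```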

shrinath-suresh (Collaborator) commented

@dbczumar @harupy


dbczumar commented Dec 2, 2022

@shrinath-suresh So sorry for the delay here. The deployment client predict() method always needs to return a PredictionsResponse object (https://mlflow.org/docs/latest/python_api/mlflow.deployments.html#mlflow.deployments.PredictionsResponse). For your use case, feel free to subclass PredictionsResponse and define to_json() as desired :)
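For illustration, a minimal sketch of one such subclass, assuming the raw TorchServe result is stored under a "predictions" key (the class name and storage key are assumptions, not plugin code):

```python
import json

from mlflow.deployments import PredictionsResponse


class TorchServePredictionsResponse(PredictionsResponse):
    # Hypothetical subclass: serialize raw TorchServe output (text or JSON)
    # directly instead of assuming a DataFrame-shaped result.
    def to_json(self, path=None):
        payload = json.dumps({"predictions": self.get("predictions")})
        if path is not None:
            with open(path, "w") as f:
                f.write(payload)
        return payload
```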

shrinath-suresh (Collaborator) commented

@dbczumar Can you please review PR #97 when you find time?

dbczumar (Author) commented

@shrinath-suresh Approved! Happy New Year!


Habardeen commented Feb 17, 2023

Has this been merged/solved, and are we good to go with the latest version?
It looks like it per #97, but I just wanted to confirm whether there's anything outstanding, since the issue is still open.


bhack commented Nov 14, 2023

Is this plugin still maintained?
