
Using locally downloaded Hugging Face models #125

Open
dmet6789 opened this issue Dec 2, 2024 · 4 comments

Comments

@dmet6789 commented Dec 2, 2024

Can I use aisuite with Hugging Face models I've downloaded locally? For instance, can I specify models like models=['openai:gpt-4', 'huggingface:/models/llm/llama/llama3-7b-instruct']?
Thanks
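
For reference, aisuite's documented usage takes a single "provider:model" string per call, and you loop over a list of such strings yourself; a minimal sketch of that shape (the Hugging Face model ID below is illustrative, not a local path):

import aisuite as ai

client = ai.Client()
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Who won the world series in 2020?"},
]

# One "provider:model" string per call; iterate to compare providers.
for model in ["openai:gpt-4", "huggingface:mistralai/Mistral-7B-Instruct-v0.3"]:
    response = client.chat.completions.create(model=model, messages=messages)
    print(response.choices[0].message.content)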

@Sourabhbhalke

I'm getting a token error thrown; I would also like to use local Hugging Face models.

envs\o_Myenv\Lib\site-packages\huggingface_hub\hf_api.py", line 1672, in whoami
raise HTTPError(
requests.exceptions.HTTPError: Invalid user token. If you didn't pass a user token, make sure you are properly logged in by executing huggingface-cli login, and if you did pass a user token, double-check it's correct.
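
The error above is raised by huggingface_hub while validating the token, so the first thing to check is that a valid token actually reaches the client. A minimal sketch (HF_TOKEN is the variable huggingface_hub conventionally reads; double-check against the aisuite and huggingface_hub versions you have installed):

import os
from huggingface_hub import login, whoami

# Either export HF_TOKEN before launching Python, or set it here.
os.environ["HF_TOKEN"] = "hf_..."  # replace with a real token from hf.co/settings/tokens

login(token=os.environ["HF_TOKEN"])  # same effect as running `huggingface-cli login`
print(whoami())  # raises the HTTPError shown above if the token is invalid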

@ravenouse

I believe it might be easier to serve the LLMs locally via Hugging Face TGI and add support for TGI in aisuite.
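
For anyone who wants to try this today: TGI exposes an OpenAI-compatible /v1/chat/completions endpoint, so one workaround is to point aisuite's openai provider at the local server. This is a sketch, assuming aisuite's OpenAI provider passes its config (including base_url) through to the underlying openai client:

# Serve the model first, e.g.:
#   docker run --gpus all -p 8080:80 -v /models:/data \
#       ghcr.io/huggingface/text-generation-inference:latest \
#       --model-id meta-llama/Meta-Llama-3-8B-Instruct
import aisuite as ai

client = ai.Client({
    "openai": {
        "api_key": "not-used-by-local-tgi",      # the openai client requires some value
        "base_url": "http://localhost:8080/v1",  # TGI's OpenAI-compatible endpoint
    }
})

messages = [{"role": "user", "content": "Who won the world series in 2020?"}]
# TGI serves a single model; its Messages API accepts "tgi" as the model name.
response = client.chat.completions.create("openai:tgi", messages=messages)
print(response.choices[0].message.content)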

@reproduce-bot

The following script was generated by an AI agent to help reproduce the issue:

# aisuite/reproduce.py
from unittest.mock import patch

from aisuite.client import Client

def test_local_huggingface_model():
    # A local filesystem path passed where aisuite expects a hosted model ID.
    provider_configs = {
        "huggingface": {"path": "/models/llm/llama/llama3-7b-instruct", "token": "fake_huggingface_token"}
    }
    # Mock the HTTP layer so no real request reaches the Hugging Face API.
    with patch("aisuite.providers.huggingface_provider.httpx.post") as mock_post:
        mock_post.return_value.json.return_value = {"choices": [{"message": {"content": "Los Angeles Dodgers"}}]}
        client = Client()
        client.configure(provider_configs)
        messages = [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Who won the world series in 2020?"},
        ]

        # "provider:model" syntax with a local path as the model segment.
        model_str = "huggingface:/models/llm/llama/llama3-7b-instruct"
        try:
            client.chat.completions.create(model_str, messages=messages)
        except ValueError as e:
            if "Invalid provider key" in str(e):
                print("Test failed as expected with error:", e)
            else:
                raise AssertionError(f"Test failed with an unexpected error: {e}")

if __name__ == "__main__":
    test_local_huggingface_model()

How to run:

python3 aisuite/reproduce.py

Thank you for your valuable contribution to this project; we appreciate your feedback! Please respond with an emoji if you find this script helpful. Feel free to comment below if any improvements are needed.

Best regards from an AI Agent!

@OfficerChul

It would be appreciated if aisuite allowed us to use local models.
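
Until that lands, a locally downloaded checkpoint can at least be exercised directly with transformers; a rough sketch (the local path is the one from the original question and is assumed to contain a saved model):

from transformers import pipeline

# Load straight from the local directory; no Hub token or network access needed.
pipe = pipeline(
    "text-generation",
    model="/models/llm/llama/llama3-7b-instruct",
)

messages = [{"role": "user", "content": "Who won the world series in 2020?"}]
# Recent transformers versions accept chat-style message lists directly.
print(pipe(messages, max_new_tokens=64)[0]["generated_text"])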
