Using locally downloaded Hugging Face models #125
I'm getting a token error thrown from `envs\o_Myenv\Lib\site-packages\huggingface_hub\hf_api.py", line 1672, in whoami`. I would also like to use local Hugging Face models.
I believe it might be easier to serve the LLMs locally via Hugging Face TGI, and add support in the
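As a rough sketch of that TGI-based approach: TGI exposes an OpenAI-compatible `/v1/chat/completions` endpoint, so a locally served model could in principle be called directly over HTTP. The base URL, port, and model path below are illustrative assumptions, not part of aisuite:

```python
# Sketch: calling a locally served model through TGI's
# OpenAI-compatible /v1/chat/completions endpoint.
# The base URL and model path are illustrative assumptions.
import json
import urllib.request

TGI_BASE_URL = "http://localhost:8080"  # assumed local TGI server address


def build_chat_request(model: str, messages: list) -> dict:
    """Build the JSON payload for an OpenAI-style chat completion request."""
    return {"model": model, "messages": messages, "max_tokens": 128}


def chat(model: str, messages: list) -> str:
    """POST the request to the local TGI server and return the reply text."""
    payload = build_chat_request(model, messages)
    req = urllib.request.Request(
        f"{TGI_BASE_URL}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# Example usage (requires a running TGI server):
# print(chat("/models/llm/llama/llama3-7b-instruct",
#            [{"role": "user", "content": "Who won the world series in 2020?"}]))
```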
The following script was generated by an AI agent to help reproduce the issue:

```python
# aisuite/reproduce.py
from unittest.mock import patch

from aisuite.client import Client


def test_local_huggingface_model():
    provider_configs = {
        "huggingface": {
            "path": "/models/llm/llama/llama3-7b-instruct",
            "token": "fake_huggingface_token",
        }
    }
    with patch("aisuite.providers.huggingface_provider.httpx.post") as mock_post:
        mock_post.return_value.json.return_value = {
            "choices": [{"message": {"content": "Los Angeles Dodgers"}}]
        }
        client = Client()
        client.configure(provider_configs)
        messages = [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Who won the world series in 2020?"},
        ]
        model_str = "huggingface:/models/llm/llama/llama3-7b-instruct"
        try:
            model_response = client.chat.completions.create(model_str, messages=messages)
        except ValueError as e:
            if "Invalid provider key" in str(e):
                print("Test failed as expected with error:", e)
            else:
                raise AssertionError("Test failed with an unexpected error:", e)


if __name__ == "__main__":
    test_local_huggingface_model()
```

How to run: `python3 aisuite/reproduce.py`

Thank you for your valuable contribution to this project, and we appreciate your feedback! Please respond with an emoji if you find this script helpful. Feel free to comment below if any improvements are needed. Best regards from an AI agent!
It would be appreciated if aisuite allowed us to use local models.
Can I use aisuite with Hugging Face models I've downloaded locally? For instance, can I specify models like `models=['openai:gpt-4', 'huggingface:/models/llm/llama/llama3-7b-instruct']`?
Thanks.