Where are you using SuperAGI?
Linux
Which branch of SuperAGI are you using?
Main
Do you use OpenAI GPT-3.5 or GPT-4?
GPT-3.5
Which area covers your issue best?
Agents
Describe your issue.
Attempting to use nous-hermes-2-mixtral-8x7b-sft.Q4_K_M.gguf from TheBloke with the standard local LLM loader, as shown in the YouTube video released this January.
How to replicate your Issue?
Edit docker-compose-gpu.yml to mount the volume containing the local LLM model (a rough sketch of the mount is shown after these steps).
Then attempt to run the model with a new agent. This results in "no model found". (The Docker log in the CLI gives more information; the error occurs right after the model is loaded when the agent runs.)
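For reference, the mount added under the backend service looks roughly like this; the host directory and container path below are placeholders for illustration, not the exact values from my setup:

    services:
      backend:
        volumes:
          # host directory containing the .gguf file -> path visible inside the container
          - /path/to/local/models:/app/local_model_dir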
Upload Error Log Content
backend-1 | error loading model: create_tensor: tensor 'blk.0.ffn_gate.weight' not found
backend-1 | llama_load_model_from_file: failed to load model
backend-1 | 2024-02-04 22:11:06 UTC - Super AGI - ERROR - [/app/superagi/helper/llm_loader.py:27] -
backend-1 | from_string grammar:
backend-1 |
backend-1 | 2024-02-04 22:11:06 UTC - Super AGI - ERROR - [/app/superagi/controllers/models_controller.py:185] - Model not found.
backend-1 | 2024-02-04 22:11:06 UTC - Super AGI - INFO - [/app/superagi/controllers/models_controller.py:203] - Error:
backend-1 | 2024-02-04 22:11:06 UTC - Super AGI - INFO - [/app/superagi/controllers/models_controller.py:203] -