I followed the instructions in the README to deploy TinyChat on a Jetson AGX Orin and tried to obtain the llama-2-7b-chat model. However, Meta doesn't seem to provide the model in Hugging Face format, so as an alternative I downloaded llama-2-7b-chat-hf, which was uploaded by another user.
When running the command in step 4, "Run the TinyChat demo", I encountered the following error:
Traceback (most recent call last):
File "/SSD/codes/./llm-awq/tinychat/demo.py", line 175, in <module>
model = model_type_dict["llama"](config).half()
File "/SSD/codes/llm-awq/tinychat/models/llama.py", line 325, in __init__
self.model = Transformer(params)
File "/SSD/codes/llm-awq/tinychat/models/llama.py", line 281, in __init__
self.layers.append(TransformerBlock(layer_id, params))
File "/SSD/codes/llm-awq/tinychat/models/llama.py", line 250, in __init__
self.self_attn = LlamaAttentionFused(args)
File "/SSD/codes/llm-awq/tinychat/models/llama.py", line 85, in __init__
self.rope_theta = args.rope_theta
File "/SSD/anaconda3/envs/tinychat-test/lib/python3.10/site-packages/transformers/configuration_utils.py", line 261, in __getattribute__
return super().__getattribute__(key)
AttributeError: 'LlamaConfig' object has no attribute 'rope_theta'
I searched for this error and found a similar case, which suggests that I need to upgrade the transformers library to a newer version.
Even though this contradicts the earlier instructions, which pin transformers==4.32.0, I ran pip install --upgrade transformers, which installed version 4.45.1, and I was able to run the demo after that.
Although the problem is solved for now, I'd still like to ask whether this is expected and whether it could cause potential problems. Is there a better solution?
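For reference, one lower-risk alternative to upgrading transformers would be to read the attribute defensively where the traceback points (tinychat/models/llama.py, the `self.rope_theta = args.rope_theta` line). This is only a sketch of that idea, not code from the repository: it assumes a hypothetical helper `read_rope_theta` and uses 10000.0 as the fallback, which is the RoPE base that Llama-2 uses; other models may need a different value.

```python
from types import SimpleNamespace

# Sketch (hypothetical, not from the llm-awq repo): fall back to the
# Llama-2 RoPE base when an older LlamaConfig (e.g. transformers
# 4.32.0) does not define the rope_theta attribute.
def read_rope_theta(args, default=10000.0):
    # getattr with a default avoids the AttributeError seen in the
    # traceback when the config predates the rope_theta field.
    return getattr(args, "rope_theta", default)

old_config = SimpleNamespace()                     # like a 4.32.0 config
new_config = SimpleNamespace(rope_theta=500000.0)  # like a newer config

print(read_rope_theta(old_config))  # 10000.0 (fallback)
print(read_rope_theta(new_config))  # 500000.0 (taken from the config)
```

In the model code this would amount to replacing `self.rope_theta = args.rope_theta` with `self.rope_theta = getattr(args, "rope_theta", 10000.0)`, keeping the pinned transformers version intact.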