Replies: 1 comment
Exact same result here.
Hi all, looking for a little guidance here. I built a brand-new test VM with Ubuntu 22.04 (basic server install) and set up all the prerequisites per the instructions in the README.md. Everything builds fine, and I can see it was compiled with clang 18. However, when I run run_inference.py I get blank output with the model described in the README.md. I've tried other models as well: some spit out gibberish, and another just output hashtags. Any ideas?
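
One thing I'm planning to try as a sanity check is calling the underlying inference binary directly with something like the sketch below, so I can see stdout and stderr separately and tell whether the blank output comes from the binary itself or from the run_inference.py wrapper. The binary name, model path, and flags here are just placeholders (I'm assuming a llama.cpp-style binary under the build directory), so they'd need to be swapped for whatever run_inference.py actually invokes on a given build:

```python
#!/usr/bin/env python3
# Rough sketch: run the inference binary directly and show stdout/stderr
# separately, to check whether the blank output comes from the binary
# or from the Python wrapper. The binary path, model path, and flags
# below are placeholders -- adjust them to match your build.
import subprocess
import sys

cmd = [
    "./build/bin/llama-cli",          # placeholder: assumed llama.cpp-style binary
    "-m", "models/your-model.gguf",   # placeholder model path
    "-p", "Hello, world",             # short test prompt
    "-n", "32",                       # generate a handful of tokens
]

try:
    proc = subprocess.run(cmd, capture_output=True, text=True, timeout=120)
except FileNotFoundError:
    sys.exit(f"binary not found: {cmd[0]} (adjust the path for your build)")

print(f"exit code: {proc.returncode}")
print("--- stdout ---")
print(proc.stdout if proc.stdout.strip() else "<empty>")
print("--- stderr ---")
print(proc.stderr if proc.stderr.strip() else "<empty>")
```

If the binary prints tokens here but run_inference.py still shows nothing, the problem is presumably in how the wrapper invokes or captures output rather than in the build or the model.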