ERROR: llama_cpp_python_cuda-0.2.6+cu117-cp310-cp310-manylinux_2_31_x86_64.whl is not a supported wheel on this platform. #14
In the past, this error was caused by trying to use the wrong Python version, so make absolutely sure you aren't unintentionally invoking the wrong interpreter. If that isn't the issue, the alternative is much harder to solve: llama-cpp-python recently changed its build backend. In theory this is better, but if it isn't working correctly there may not be anything I can do about it. I'm building the wheels with the oldest version of Linux offered by GitHub Actions (Ubuntu 20.04) for better system compatibility.
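One quick check for the "wrong Python" case: the `cp310` in the wheel filename means CPython 3.10, and pip run under any other interpreter rejects the wheel with exactly this "not a supported wheel on this platform" error. A minimal sketch (the helper name `interpreter_tag` is illustrative, not part of pip's API):

```python
import sys

def interpreter_tag(major: int, minor: int) -> str:
    """Build the CPython tag (e.g. 'cp310') that pip matches against a wheel's name."""
    return f"cp{major}{minor}"

# A cp310 wheel installs only under Python 3.10; print what you are actually running:
print(interpreter_tag(sys.version_info.major, sys.version_info.minor))
```

If this prints anything other than `cp310`, the interpreter mismatch alone explains the error.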
This also occurs with CUDA 11.8; I have even tried CUDA 12.2.
CentOS uses an older version of glibc, making it incompatible with the current builds. CentOS 7 is a fairly old OS, first released in 2014, and I'm not even sure it is worth supporting.
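The glibc requirement is encoded right in the wheel's platform tag: `manylinux_2_31` means the wheel needs glibc >= 2.31 at runtime, which CentOS 7 (glibc 2.17) cannot satisfy, while Ubuntu 20.04 (glibc 2.31) is exactly the minimum. A sketch of that comparison (the helper name `glibc_ok` is hypothetical):

```python
def glibc_ok(version: str, required: tuple = (2, 31)) -> bool:
    """Return True if a glibc version string satisfies a manylinux_2_31-style requirement."""
    have = tuple(int(part) for part in version.split(".")[:2])
    return have >= required

print(glibc_ok("2.17"))  # CentOS 7's glibc: too old for manylinux_2_31 wheels
print(glibc_ok("2.31"))  # Ubuntu 20.04's glibc: exactly the minimum
```

You can find your own glibc version with `ldd --version` or `platform.libc_ver()`.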
Thanks, I figured it out with Rocky Linux 9.2.
See the relevant issue with logs here: oobabooga/text-generation-webui#4005
Error about the wheel:
System: Ubuntu 22.04
CUDA: 11.7
Python: 3.10
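For anyone debugging this, the wheel filename itself tells you which environments pip will accept it on: per the wheel filename convention, it splits into name, version, Python tag, ABI tag, and platform tag. A minimal parser sketch (assumes no optional build tag in the filename, as is the case here):

```python
def wheel_tags(filename: str) -> tuple:
    """Split a wheel filename into (name, version, python, abi, platform) fields."""
    stem = filename[: -len(".whl")]
    name, version, py_tag, abi_tag, plat_tag = stem.split("-")
    return name, version, py_tag, abi_tag, plat_tag

print(wheel_tags(
    "llama_cpp_python_cuda-0.2.6+cu117-cp310-cp310-manylinux_2_31_x86_64.whl"
))
# Every field must match the target system: cp310 => CPython 3.10,
# manylinux_2_31_x86_64 => glibc >= 2.31 on x86_64.
```

Comparing these fields against `python3 -m pip debug --verbose` (which lists the tags pip accepts locally) pinpoints which one is mismatched.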