
Why I cannot use my GPU #12651

Open
jackphj opened this issue Jan 4, 2025 · 3 comments

Comments
@jackphj
jackphj commented Jan 4, 2025

I created the conda env llm-cpp, then ran:

conda activate llm-cpp
init-ollama.bat
$env:no_proxy = "localhost,127.0.0.1"; $env:ZES_ENABLE_SYSMAN = "1"; $env:OLLAMA_NUM_GPU = "999"; $env:SYCL_CACHE_PERSISTENT = "1"; $env:SYCL_PI_LEVEL_ZERO_USE_IMMEDIATE_COMMANDLISTS = "1"

and then:

.\ollama.exe serve
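For reference, the steps above can be collected into a single PowerShell session. The per-variable comments are my reading of the ipex-llm quickstart linked below, not authoritative documentation:

```shell
# PowerShell session sketch (assumes the llm-cpp conda env and
# init-ollama.bat from the ipex-llm quickstart already exist)
conda activate llm-cpp
.\init-ollama.bat

$env:no_proxy = "localhost,127.0.0.1"   # keep local traffic off any proxy
$env:ZES_ENABLE_SYSMAN = "1"            # enable Level Zero Sysman so GPU devices can be queried
$env:OLLAMA_NUM_GPU = "999"             # offload as many model layers as possible to the GPU
$env:SYCL_CACHE_PERSISTENT = "1"        # persist JIT-compiled SYCL kernels between runs
$env:SYCL_PI_LEVEL_ZERO_USE_IMMEDIATE_COMMANDLISTS = "1"

.\ollama.exe serve
```

Note that these variables only apply to the current PowerShell session; they must be set again (or persisted) before each `ollama.exe serve`.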

[screenshot: ollama serve startup log]

As we can see, it doesn't match the situation described in the link below.

CPU: Intel Core Ultra 7 258V
GPU: integrated Arc 140V

link:
https://github.com/intel-analytics/ipex-llm/blob/main/docs/mddocs/Quickstart/ollama_quickstart.md

@sgwhat
Contributor

sgwhat commented Jan 6, 2025

Hi @jackphj,

May I ask why you think ollama isn't running on the GPU? Have you tested running a model with ollama? If you have more questions, please provide the complete log from the ollama server side.

@jackphj
Author

jackphj commented Jan 6, 2025

Hi @jackphj,

May I ask why you think ollama isn't running on the GPU? Have you tested running a model with ollama? If you have more questions, please provide the complete log from the ollama server side.

Because I found that the log in the screenshot above says OLLAMA_INTEL_GPU:FALSE.

@sgwhat
Contributor

sgwhat commented Jan 6, 2025

Because I found that the log in the picture above says OLLAMA_INTEL_GPU:FALSE.

Sorry, that's a confusing and meaningless log line; I will remove it later. You can run your model via ollama run <model> to check whether the GPU is actually used.
