Replies: 2 comments
-
It depends on many factors. If you use Ollama with GPU support, you should focus on VRAM and CUDA cores. Anything that doesn't fit into VRAM spills over into normal RAM, I think, but that slows down the whole GPU process. If you use CPU mode, you are limited only by RAM, clock speed, and CPU core count. In general, more helps, but whether a hardware upgrade makes sense is something only the user can answer. For an LLM that is only used for this, I don't think it makes sense to max out your hardware.
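To make the spillover point concrete, here is a rough back-of-envelope sketch (the function names and the weights-only assumption are mine; KV cache and runtime overhead are ignored, so treat the numbers as illustrative, not exact):

```python
def model_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate memory needed for the model weights alone."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

def ram_spillover_gb(model_gb: float, vram_gb: float) -> float:
    """Portion of the model that does not fit in VRAM and spills into system RAM."""
    return max(0.0, model_gb - vram_gb)

# Example: a 13B model at 4-bit quantization on a 12 GB card
model = model_memory_gb(13, 4)      # 6.5 GB -> fits entirely in VRAM
print(ram_spillover_gb(model, 12))  # 0.0

# A 70B model at 4-bit on the same 12 GB card
model = model_memory_gb(70, 4)      # 35.0 GB
print(ram_spillover_gb(model, 12))  # 23.0 GB would have to live in system RAM
```

So extra system RAM only buys you the ability to run models that overflow VRAM, and those layers run at CPU/RAM speed rather than GPU speed.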
-
OK, no maxing out then, but I'm using an Nvidia card with 12 GB on a 64 GB machine. Does it make sense to put more RAM in this machine for paperless-ai and Ollama? 128, 192, 256 GB? Even though I'm the user, I don't know (yet).
-
Considering that most of us have consumer-grade GeForce GPUs or similar, how important is RAM in the mix? Does it help to upgrade from, say, 64 GB to 384 GB? That's the maximum I can go.