Ollama is not using the GPU (Ubuntu 24.04)

Ubuntu 24.04, 2x Tesla T4. I am running llama3.1 with Ollama, but it takes a very long time to process requests. Judging by nvidia-smi, both GPUs are sitting idle. Ollama was installed with:
curl -fsSL https://ollama.com/install.sh | sh
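A few commands can help confirm whether Ollama actually detected and is using the GPUs. This is a diagnostic sketch, not a fix: it assumes the default systemd service that install.sh sets up (unit name `ollama`) and an NVIDIA driver that exposes `nvidia-smi`; each step is guarded so it is skipped if the tool is not present.

```shell
# Diagnostic sketch (assumptions: Ollama installed via install.sh as the
# default "ollama" systemd unit; NVIDIA driver installed).

# 1. Did the Ollama server log that it found the CUDA GPUs at startup?
command -v journalctl >/dev/null 2>&1 && \
  journalctl -u ollama --no-pager 2>/dev/null | grep -iE "cuda|gpu" | tail -n 20

# 2. Is the loaded model offloaded to the GPU? In the PROCESSOR column,
#    "100% GPU" means GPU inference; "100% CPU" means it fell back to CPU.
command -v ollama >/dev/null 2>&1 && ollama ps

# 3. Does the driver see both Tesla T4s?
command -v nvidia-smi >/dev/null 2>&1 && nvidia-smi -L

true  # guarded steps above are skipped on machines without these tools
```

If the server logs show no CUDA devices at all (rather than devices found but unused), the NVIDIA driver or CUDA libraries were likely not visible when the service started, and restarting the service after installing them is the usual next step.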