Force Ollama to Use Your AMD GPU (even if it's not officially supported)

You only thought Ollama was using your GPU! If your graphics card is not officially supported, Ollama quietly falls back to your CPU instead, and that can make your local chatbot painfully slow. There is a workaround, though: a community fork called Ollama for AMD. In this session, I walk you through getting it running and forcing Ollama to use your GPU.
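One quick way to check where your models actually run is `ollama ps`, whose PROCESSOR column reports the CPU/GPU split for each loaded model. Here is a small Python sketch that flags CPU fallback from that output; the sample table and the helper name `fully_on_gpu` are illustrative, not from the video:

```python
import subprocess

# Sample `ollama ps` output (illustrative): the PROCESSOR column shows
# where the model weights ended up, e.g. "100% GPU", "100% CPU",
# or a split like "48%/52% CPU/GPU".
SAMPLE = """\
NAME         ID            SIZE    PROCESSOR    UNTIL
llama3:8b    365c0bd3c000  5.4 GB  100% CPU     4 minutes from now"""

def fully_on_gpu(ps_output: str) -> bool:
    """True only if every loaded model's row is free of any CPU share."""
    rows = [r for r in ps_output.strip().splitlines()[1:] if r.strip()]
    return bool(rows) and all("CPU" not in row for row in rows)

if __name__ == "__main__":
    try:
        out = subprocess.run(["ollama", "ps"], capture_output=True,
                             text=True, check=True).stdout
    except (FileNotFoundError, subprocess.CalledProcessError):
        out = SAMPLE  # no Ollama on this machine: fall back to the sample
    print("fully on GPU" if fully_on_gpu(out) else "CPU fallback detected")
```

If the row for your model says anything other than 100% GPU, you are in the situation the video is about.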

0:00 Intro (the problem)
1:56 Ollama for AMD (the solution)
2:41 Words of caution
3:48 Explaining the process
4:49 Installing the demo release of Ollama for AMD
5:39 Replacing the ROCm libraries
9:07 A few important additional notes
10:17 Testing
11:55 The wrap up

Ollama for AMD on GitHub
https://github.com/likelovewant/ollam...
