LocalLLaMA turkishdelight 7mo ago

Ollama now supports AMD graphics cards

ollama.com

But in all fairness, it's really llama.cpp that supports AMD.

Now looking forward to Vulkan support!
