LocalLLaMA
turkishdelight • 7mo ago
Ollama now supports AMD graphics cards
ollama.com

But in all fairness, it's really llama.cpp that supports AMD.

Now looking forward to the Vulkan support!