Flatpak
Alpaca used to include Ollama in its Flatpak packages; this changed to make Ollama optional.
Flathub
Go to Alpaca's page in your system's app store, look for the extension called Ollama Instance, and install it. Then reopen Alpaca and enjoy running local models!
You can also install the extension from the command line:
# Check which installation type you have
flatpak list --columns=app,installation | grep Alpaca
# If you have a system installation
flatpak install com.jeffser.Alpaca.Plugins.Ollama
# If you have a user installation
flatpak install --user com.jeffser.Alpaca.Plugins.Ollama
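To confirm the extension is present afterwards, you can list your Flatpak installations again (a quick sanity check using the same flatpak list command as above):
# The plugin should appear alongside the Alpaca app itself
flatpak list | grep Alpaca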
AMD GPU Support
AMD GPUs require ROCm to work with AI tools. Alpaca packages ROCm as an extension too, so in addition to com.jeffser.Alpaca.Plugins.Ollama you will also need to install com.jeffser.Alpaca.Plugins.AMD, available in your system's app store as Alpaca AMD Support.
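As with the Ollama extension, the AMD plugin can also be installed from the command line. This sketch mirrors the commands above; pick the variant that matches your installation type:
# If you have a system installation
flatpak install com.jeffser.Alpaca.Plugins.AMD
# If you have a user installation
flatpak install --user com.jeffser.Alpaca.Plugins.AMD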
Arch Linux
Important: Alpaca does not officially support Arch Linux.
On Arch Linux, Ollama is installed the same way as any other package: the base ollama package is available in the official repositories, while alternative builds are available in the AUR.
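For example, a minimal sketch of a typical setup (the package and service names assume the standard Arch packaging of Ollama):
# Install Ollama from the official repositories
sudo pacman -S ollama
# Start the Ollama server now and enable it at boot
sudo systemctl enable --now ollama.service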