Installing Ollama not possible in KVM image install

Hello,

After installing HA (KVM qcow2 image) on Proxmox (which went fine), I want to install Ollama. After running the install script (curl -fsSL https://ollama.com/install.sh | sh) it seems Ollama is installed, but it never starts; I can't run it:

➜ local ollama run qwen3
[1] 768 segmentation fault (core dumped) ollama run qwen3
➜ local

It seems to be related to the Linux that HA runs on; it's Alpine Linux.

Is there a way to install Ollama on this Alpine Linux?

I am a bit forced to use this now, as the Supervised installation method is being deprecated.

Hi

Apologies, I have no idea what Ollama is or does, but a bit of googling suggests the Ollama integration in Home Assistant.

Not sure if this helps?

Well, you responded with the Ollama integration in HA; thanks for that, but that is something I do know.

Apart from this integration, you also need Ollama set up on the server itself, which is done with the given curl command.
On Debian it works flawlessly, and currently I have a Debian setup with HA Supervisor on top.

But as this Supervised setup is not supported anymore, I tried another setup, which was a VM (Proxmox).
The issue here is that the HA image for VMs is Alpine Linux based, a special Linux flavour which cannot handle this Ollama curl install command.

Well, I guess I am holding on to my Debian/HA setup, although it is not supported anymore. Thanks for the effort though.

Yeah, trying to run Ollama directly in the HAOS environment has a high probability of …well… not working. Run Ollama on the host and let the HA in the VM talk to it.
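One thing to watch out for: Ollama only listens on 127.0.0.1 by default, so HA inside the VM won't reach it out of the box. A minimal sketch of opening it up, assuming the systemd service that the official install script creates:

# Add an override so the ollama service binds to all interfaces
sudo systemctl edit ollama.service
# In the editor that opens, add these two lines:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl restart ollama.service

After that, the HA Ollama integration can be pointed at http://<host-ip>:11434 (11434 is Ollama's default port).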


Please share qm config VMIDHERE from the PVE node.

You will need to set up a second VM to run Ollama, then connect to that, preferably with your GPU/NPU passed through for inference (see the sketch below). You won't run it inside the same VM as HAOS. But you're in Proxmox, so spinning up a new VM is what it does.
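A rough sketch from the PVE shell, not a definitive recipe: the VM ID 201, the Debian ISO name, and the GPU's PCI address 0000:01:00.0 are all placeholders for your node, and IOMMU must already be enabled on the host for the passthrough step to work:

# Create a plain Debian VM for Ollama (q35 machine type is needed for PCIe passthrough)
qm create 201 --name ollama --memory 8192 --cores 4 --ostype l26 \
  --machine q35 --net0 virtio,bridge=vmbr0 --scsi0 local-lvm:32 \
  --cdrom local:iso/debian-12-netinst.iso
# Pass the GPU through to the VM
qm set 201 --hostpci0 0000:01:00.0,pcie=1
qm start 201

Inside that VM you can then run the same curl | sh install script that fails on HAOS.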

Sorry for delay, but I was away for some time.

Yes, I did have that before: a second VM hosting the Ollama instance.
But as I have HA running on Debian, I changed it all to Ollama running on the same node.
(Yes, I have GPU passthrough, but it is taken by a ComfyUI instance, so Ollama is running purely on CPU.)

As HA Supervised is not supported anymore, I tried to move to the other available install options, but for Proxmox the only option is Linux - Home Assistant, then the KVM flavour. And as said, this one does not support the install of Ollama. Sadly, the other HA install methods are not suited for Proxmox either. So for me, HA is taking a huge step back in supported install options, but I understand: supporting all the various flavours is a bit much.

That leaves me with keeping my Debian setup as long as possible, or indeed having a second VM for Ollama.

Thanks for all your thoughts!

HAOS works fine in Proxmox as its own VM. Run Ollama as its own thing.
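Once Ollama runs in its own VM (or on the host), a quick way to confirm HA can reach it; 192.168.1.50 is a placeholder for wherever Ollama ends up:

curl http://192.168.1.50:11434/api/tags
# Returns a JSON list of the models that have been pulled, e.g. after: ollama pull qwen3

The same http://192.168.1.50:11434 URL is what goes into the Ollama integration's configuration in HA.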