LlamaGPT cannot be integrated into HA

Hi,

I installed LlamaGPT on my Synology NAS and it is running fine.
Installed with portainer, following this explanation: How to Install LlamaGPT on Your Synology NAS – Marius Hosting
Now I want to integrate this local AI system into HA. My idea was to do this with the Local LLM Conversation integration.
When adding this integration, I am asked to select the backend for the running model.
I thought it is no. 4: Generic OpenAI API Compatible API.
Is this right?
If yes, I am asked to fill in the hostname and the API port.
Portainer installed 2 containers on my NAS:

  1. LlamaGPT
  2. LlamaGPT-API

The LlamaGPT can be reached on http://Synology-ip-address:3136
What are the right hostname and port?
The problem seems to be that I also have to enter the API path, and I don't know where to find it.
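For what it's worth, OpenAI-compatible backends usually expose their endpoints under a `/v1` base path (e.g. `/v1/models`, `/v1/chat/completions`), and the API container typically listens on a different port than the web UI on 3136. A small sketch for probing such an endpoint; the host and port here are placeholders you would replace with your NAS IP and the API container's actual port:

```python
# Sketch: probe an OpenAI-compatible API to find out what it serves.
# Host/port below are hypothetical examples, not the actual values.
import json
import urllib.request


def models_url(host: str, port: int, base_path: str = "/v1") -> str:
    """Build the URL a 'Generic OpenAI API Compatible' backend would query."""
    return f"http://{host}:{port}{base_path}/models"


def list_models(host: str, port: int) -> list:
    """Ask the API container which models it serves (OpenAI-style /v1/models)."""
    with urllib.request.urlopen(models_url(host, port), timeout=5) as resp:
        data = json.load(resp)
    return [m["id"] for m in data.get("data", [])]


# Example usage (replace with your NAS IP and the API container's port,
# not the web UI port 3136):
# print(list_models("192.168.1.10", 3001))
```

If the call returns a model list, that host, port, and `/v1` path are what the integration form needs.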

Any ideas?

Hi Lars Künnemann,

You may already know, but so far these are the LLMs that HA officially works with. The rest of this blog article explains what and why.
AI agents for the smart home - Home Assistant.

Thanks, so I just installed the wrong system. I will work through it and try to figure it out.
