Custom Integration: Ollama Conversation (Local AI Agent)

Thanks for your work @ej52! I was able to install Ollama using Docker and integrate it with Home Assistant. However, when I try to talk to it from Home Assistant, it does not work:
[screenshot]

Any ideas what I might be doing wrong? In case it is relevant, I'm running Home Assistant and Ollama on a mini PC with an AMD Ryzen 9 6900HX (up to 4.9 GHz), Radeon 680M graphics, 8 cores/16 threads, and 32 GB of DDR5. This is the docker-compose.yml I use to run Ollama:

version: "3.9"
services:
  ollama:
    image: ollama/ollama
    container_name: ollama
    devices:
      # pass the AMD GPU through to the container
      - "/dev/kfd:/dev/kfd"
      - "/dev/dri/card0:/dev/dri/card0"
      - "/dev/dri/renderD128:/dev/dri/renderD128"
    volumes:
      # model storage lives on an external NFS volume
      - nfs-ollama:/root/.ollama
    ports:
      - 11434:11434
    environment:
      OLLAMA_GPU: amd-gpu

volumes:
  nfs-ollama:
    external: true
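
For what it's worth, this is how I sanity-check that the Ollama API itself is reachable before testing through Home Assistant. It assumes the default port 11434 and a model named llama2 (substitute the host and whatever model you actually pulled):

# make sure a model has been pulled into the container
docker exec ollama ollama pull llama2

# list the models Ollama has available (should return JSON, not a connection error)
curl http://localhost:11434/api/tags

# ask the model a question directly, bypassing Home Assistant
curl http://localhost:11434/api/generate -d '{"model": "llama2", "prompt": "Hello"}'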

Thanks for the help!
