Integration with LocalAI

Hey everyone!

I think it would be really awesome to see an integration between Home Assistant and LocalAI.

LocalAI (GitHub - go-skynet/LocalAI) is a self-hosted, community-driven, OpenAI-compatible REST API: a drop-in replacement for OpenAI that runs LLMs directly on consumer-grade hardware. It supports ggml-compatible models (llama.cpp, alpaca.cpp, gpt4all.cpp, rwkv.cpp, whisper.cpp, vicuna, koala, gpt4all-j, cerebras, and many others). No GPU and no internet access are required.

There is already an OpenAI integration for Home Assistant, and since LocalAI follows the OpenAI spec, integrating it should already be possible. The only change required would be for the OpenAI integration to let the user specify a "base URL" in its options, so requests can be pointed at a different endpoint.
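To illustrate the idea, here is a minimal sketch of why a base URL override is all that's needed: a LocalAI request uses the same path and payload shape as the OpenAI chat completions API, so only the host changes. The port (8080) and model name below are assumptions for illustration, not values from the actual integration.

```python
import json

# Assumed LocalAI endpoint on the local network; the OpenAI integration
# would let the user set this base URL instead of api.openai.com.
base_url = "http://localhost:8080/v1"

# The request body follows the OpenAI chat-completions spec, so the
# exact same payload works against either backend.
payload = {
    "model": "ggml-gpt4all-j",  # assumption: whatever model LocalAI serves
    "messages": [
        {"role": "user", "content": "Turn on the living room lights."},
    ],
}

# Same path as OpenAI's hosted API, just a different host.
url = f"{base_url}/chat/completions"
print(url)
print(json.dumps(payload)[:40])
```

Everything else (authentication headers, response parsing) stays identical, which is why this should be a small change to the existing integration.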

What do you think?

I would love to see a local AI running at my home.
If it's only adding the base URL, it should be a quick-win feature.