WTH is there no conversation integration that lets you use any OpenAI-compatible API

Hi,

as more LLM providers become available and expose an OpenAI-compatible API, it would be great to have a conversation integration that lets you use them.

It could be as simple as letting the user configure the OpenAI-compatible endpoint, which would enable Home Assistant users to leverage different cloud (e.g. openrouter.ai) and local LLM models.
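To illustrate how little is involved, here is a minimal sketch from the client side using the official `openai` Python package. The OpenRouter base URL and the model name are only examples (assumptions on my part); any OpenAI-compatible endpoint would work the same way:

```python
# Minimal sketch: the official OpenAI SDK already accepts an arbitrary
# base_url, which is all an OpenAI-compatible provider needs.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # any OpenAI-compatible endpoint
    api_key="sk-or-...",                      # that provider's own key
)

response = client.chat.completions.create(
    model="meta-llama/llama-3.1-8b-instruct",  # model IDs are provider-specific
    messages=[{"role": "user", "content": "Turn off the kitchen lights."}],
)
print(response.choices[0].message.content)
```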

This would open up a larger number of models and could be more cost-effective, as cloud providers often have cheaper LLM options than the mainstream OpenAI and Anthropic.

Thank You!

Don’t forget to vote for your own…

1 Like

I was kind of surprised this didn’t exist, given how many requests I saw for one. I’m working on creating my own conversation integration (that one’s here, but it’s pretty specific to some use cases I have and not really stable enough for others yet) and wanted to start with just an OpenAI-compatible version of the built-in agent, so I went ahead and froze that as a repo here, if anyone’s interested.

3 Likes

Yeah, being able to integrate an LLM like ChatGPT or Claude into my Hass, and being able to ask it for suggestions, improvements, automations and so on, would be truly groundbreaking!

Thank you very much! Your openai-compatible-conversation works like a charm, but it’s a big bummer having to keep it maintained.

This point frustrates me a lot. I pay for GitHub Copilot, which can be used through the OpenAI SDK (here), but I simply cannot leverage it in Home Assistant without this custom component because… because of what? That’s what frustrates me the most. There isn’t even a good reason for not baking this feature in by default.

Anyway, openai-compatible-conversation works great with GitHub Copilot models.

2 Likes

Yep. It’s literally the same thing with an Azure pass-through.

Same with anything based on Open WebUI.

The inability to change the endpoint in the default integration is, frankly, bonkers.

1 Like

There is this one from November 2023: [Custom Component] extended_openai_conversation: Let's control entities via ChatGPT

For comparison and reference, the Ollama integration does something similar, but for servers that use the same or a similar API as Ollama.
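For a sense of the precedent, here is a minimal sketch with the `ollama` Python client, which accepts an arbitrary server URL the same way the HA integration does (the LAN address and model here are made up for illustration):

```python
# Sketch of the precedent: the ollama client talks to any host that speaks
# the Ollama API, mirroring the configurable server URL in the HA integration.
from ollama import Client

client = Client(host="http://192.168.1.50:11434")  # hypothetical local server
reply = client.chat(
    model="llama3.1",
    messages=[{"role": "user", "content": "Hello"}],
)
print(reply["message"]["content"])
```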

1 Like

This is one of the two reasons preventing me from using the OpenAI Conversation integration, the other being the project’s seemingly unfriendly attitude towards contributors.

I have seen multiple pull requests and issues attempting to address exactly this issue (which to some extent already shows the value of this feature).
However, most of them were rejected unilaterally by the project owner without sufficient explanation or discussion. For example, PR #94051 was rejected by the owner with this justification:

This is making setup for users more complicated with no added benefit for the users. We’re not interested in this change.

The same theme repeated itself in several other pull requests and issues.

I understand that the value of a feature is subjective; something important and reasonable to me might not be to others. But it’s disappointing and concerning to me, as a potential user and contributor, that an integration’s roadmap is unilaterally dictated by one or a few individuals without taking different perspectives from users into consideration.

The good news is that I found a seemingly perfect replacement: Extended OpenAI Conversation. So far it has been working pretty well for me. The Git repo also seems to be quite active. Sharing just in case anyone is looking for alternatives.

1 Like

I just bumped into this issue, wanting to try the very cheap DeepSeek API. It took me a bit to realize that the option was plain missing, because not having it feels so opposite to the spirit of Home Assistant.

I just went down the rabbit hole of the rejected PRs and the “reasons” given, and it’s just weird. I mean, look at the reasons given for closing them:

  • “This is making setup for users more complicated with no added benefit for the users. We’re not interested in this change.”
  • “The OpenAI integration is meant only for OpenAI and cannot be generalized after the fact.”
  • “Hey, we’ve had this PR a couple of times now. I don’t want to merge this as it is an edge case and it makes configuring OpenAI for the normal use case more confusing.”

And all of them are just blatantly, obviously wrong. Having an optional advanced setting wouldn’t make things more difficult for the general user, and it’s clearly a benefit to users, as they keep making PRs for the feature! And by the same token, it’s also clearly not an edge case.

And then the worst one, “cannot be generalized after the fact”. Like, it’s using the OpenAI API spec; there’s nothing to change. That’s the whole point of supporting that API spec. It would literally be just changing the base URL the calls are made to, no other changes. No generalization needed. Now, it might work worse, … or better. But it would work.
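To make that concrete, here is a hypothetical sketch (not the integration’s actual code, and `create_client` is a made-up helper name) of what the whole change amounts to with the official `openai` package:

```python
# Hypothetical sketch of the requested change: one optional parameter.
from openai import AsyncOpenAI

def create_client(api_key: str, base_url: str | None = None) -> AsyncOpenAI:
    # With base_url=None the SDK falls back to the stock OpenAI endpoint,
    # so existing OpenAI users are completely unaffected.
    return AsyncOpenAI(api_key=api_key, base_url=base_url)
```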

To paraphrase Charles Babbage, “I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a reply”

5 Likes

This is infuriating. I use OpenRouter, which lets me experiment with many different LLMs using the OpenAI SDK. I don’t have the hardware to run a dedicated Ollama instance, and I’m not fond of OpenAI.

The weird gatekeeping here is just insane. Just put the endpoint input under an advanced section or mark it as “experimental” - it’s not rocket science.
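For the sake of argument, a sketch of how small that field could be, using the voluptuous schemas that Home Assistant config flows are built on (the field names are hypothetical):

```python
# Hypothetical sketch: an optional, defaulted field keeps the normal
# OpenAI setup flow identical for users who never touch it.
import voluptuous as vol

STEP_USER_DATA_SCHEMA = vol.Schema(
    {
        vol.Required("api_key"): str,
        vol.Optional("base_url", default="https://api.openai.com/v1"): str,
    }
)
```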

3 Likes

Totally agree. BTW I used the fork change mentioned by miniconfig in this thread and it works with OpenRouter. But it’s going to be a pain to maintain. It’s really a WTH that the main component just doesn’t add an OpenAI-API-compatible config option.

2 Likes

First of all, I have a lot of respect for the maintainers of HA and HA-related projects.
I don’t want to make blatant accusations out of thin air, but the sheer lack of willingness to listen to any argument for such a simple change, combined with the remarkably bogus explanations, makes me suspicious of some shady business going on behind the scenes relating to this vendor lock-in.