Would love to be able to just specify the endpoint / URL in order to plug in other LLM hosts that provide an API compatible with the OpenAI API, like together.ai.
I feel this is very important.
There are many services that provide an OpenAI-compatible API. That's not only fun to say, it's extremely useful.
Together.ai is one of these services: it provides remarkably inexpensive access to open-source models that I couldn't possibly host on my own hardware.
I'm sure the OpenAI Conversation integration could be generalized into an OpenAI-compatible integration. (Nearly) the only thing that needs to be configurable is the server URL.
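To illustrate why only the base URL needs to be configurable, here is a minimal sketch: the path, headers, and request body follow the same OpenAI chat-completion shape regardless of host, so swapping providers is just swapping the base URL. The specific URLs and model names below are illustrative assumptions, not taken from any integration's actual code.

```python
import json


def build_chat_request(base_url: str, api_key: str, model: str, messages: list) -> tuple:
    """Build an OpenAI-style chat completion request for any compatible host.

    Only `base_url` differs between providers; the path, auth header, and
    JSON body all follow the OpenAI API shape.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body


# Same call shape, different hosts -- only the base URL changes
# (keys and model names here are placeholders):
openai_req = build_chat_request(
    "https://api.openai.com/v1", "sk-...", "gpt-4o-mini",
    [{"role": "user", "content": "Hello"}],
)
together_req = build_chat_request(
    "https://api.together.xyz/v1", "tok-...", "meta-llama/Llama-3-8b-chat-hf",
    [{"role": "user", "content": "Hello"}],
)
```

An integration could expose `base_url` as one extra configuration field and leave everything else untouched.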