ChatGPT / OpenAI Integration: Allow different OpenAI API Endpoints

Would love to be able to just specify the endpoint URL so I can plug in other LLM hosters that provide an OpenAI-compatible API, such as together.ai.

I feel this is very important.

There are a great many services that provide an OpenAI-compatible API. That is not only fun to say, it's extremely useful.

Together.ai is one of these services - they provide unbelievably inexpensive access to open-source models that I couldn't possibly host on my own hardware.

I’m sure the OpenAI Conversation integration can be generalized into an OpenAI-compatible integration. (Nearly) the only thing that needs to be configurable is the server URL.
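To illustrate the point, here is a minimal sketch of why only the base URL needs to vary: every OpenAI-compatible host serves the same REST paths under its own base URL. The provider URLs below are examples of the convention, not a verified or exhaustive list.

```python
# Sketch: provider base URLs for OpenAI-compatible services.
# The paths that follow the base URL are identical across providers,
# so swapping hosts is just swapping this one string.
OPENAI_COMPATIBLE_ENDPOINTS = {
    "openai": "https://api.openai.com/v1",
    "together": "https://api.together.xyz/v1",
}

def chat_completions_url(base_url: str) -> str:
    """Build the chat-completions URL; only the base URL is provider-specific."""
    return base_url.rstrip("/") + "/chat/completions"
```

With a configurable base URL, the same request-building code works unchanged against any of these hosts.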

There is also the fact that many self-hostable LLM tools (jan.ai and LM Studio are the ones I have tried so far) provide OpenAI-compatible APIs rather than Ollama-compatible ones. So if I want to use my beefy desktop PC to do the heavy lifting while it's on, I can easily run something much heavier than what my 5+-year-old desktop-turned-Proxmox-server can handle.

And on top of that: Ollama (the current self-hosted option) only has official support for a handful of models, while something like jan.ai or LM Studio both allow for a much wider range of models.
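For the local-server case, the same idea applies: pointing a client at LM Studio or jan.ai is just a different base URL, with a dummy API key since local servers typically don't check it. The ports below are my understanding of these tools' default local-server ports, so treat them as assumptions rather than verified values.

```python
# Sketch: keyword arguments one could pass to an OpenAI-style client
# (e.g. openai.OpenAI(**kwargs)) to target a local server instead of
# a hosted provider. Only base_url changes; api_key is a placeholder
# because local servers usually accept any non-empty string.
def client_kwargs(base_url: str, api_key: str = "not-needed") -> dict:
    """Provider-agnostic client configuration; base_url is the only real knob."""
    return {"base_url": base_url, "api_key": api_key}

# Assumed default ports for the two tools mentioned above.
LOCAL_SERVERS = {
    "lmstudio": client_kwargs("http://localhost:1234/v1"),
    "jan": client_kwargs("http://localhost:1337/v1"),
}
```

The point is that the integration's request logic would not need to know whether it is talking to a cloud provider or a machine on the LAN.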