Hello, are you trying to use DeepSeek API keys with the OpenAI URL?
I didn't find a way to change the URL in the fork. But the original component now has it, so I'm using that.
1 Like
That's it, it should work now.
For some reason the Home Assistant developers keep rejecting any PRs that try to add an easier option for switching the OpenAI endpoint in the official integration, so here is a workaround:
- Install hass-environmental-variable
- Add this to your `configuration.yaml`: `environment_variable: OPENAI_BASE_URL: "https://api.deepseek.com/v1"`
- Restart your system and add the OpenAI Conversation integration; when asked for the API key, use the one you created for DeepSeek
- Open the integration and uncheck "Recommended model settings"
- Set "model" to "deepseek-chat" and increase the maximum tokens to 1024, then reload the integration
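For reference, the flattened `environment_variable` entry above corresponds to this `configuration.yaml` block (the key names are taken from the steps; the exact indentation expected by hass-environmental-variable is assumed here):

```yaml
# Sets OPENAI_BASE_URL via hass-environmental-variable, pointing the
# OpenAI Conversation integration at DeepSeek's OpenAI-compatible endpoint.
environment_variable:
  OPENAI_BASE_URL: "https://api.deepseek.com/v1"
```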
Taken from Reddit.
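Before wiring this into Home Assistant, it can help to confirm the DeepSeek key works against the OpenAI-compatible endpoint directly. A minimal stdlib sketch (the function name and the sample message are illustrative; the URL, model name, and token limit are the ones from the steps above):

```python
import json
import urllib.request

def build_chat_request(api_key: str, model: str = "deepseek-chat",
                       max_tokens: int = 1024) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for DeepSeek's
    OpenAI-compatible endpoint (base URL + /chat/completions)."""
    url = "https://api.deepseek.com/v1/chat/completions"
    payload = {
        "model": model,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": "Hello from Home Assistant!"}],
    }
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    }
    return urllib.request.Request(url, data=json.dumps(payload).encode(),
                                  headers=headers)

# Sending the request requires a valid key and network access:
# with urllib.request.urlopen(build_chat_request("sk-...")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```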
2 Likes
You may also host the DeepSeek-R1 model for free in your Azure AI Foundry account and connect it using the Azure OpenAI Conversation custom component (on GitHub, installable from the HACS store).
Important configuration:
- The base URL must include the API version (you can check it by navigating to your model + endpoint section and selecting your deployed model), e.g. https://ai-xxx.services.ai.azure.com/models/chat/completions?api-version=2024-05-01-preview
- After finishing the setup, click the CONFIGURE button and uncheck "Recommended model settings". Click Submit and configure again; you will then be able to change the model name to your deployed DeepSeek model (e.g. DeepSeek-R1-home).
- Then go to Settings > Voice assistants. Add a new assistant and select "Azure OpenAI Conversation" in the Conversation agent dropdown. (This is the default name; you can change it in the integration's section.)
- Happy chatting!
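A common failure mode with the Azure setup above is omitting the api-version query parameter from the base URL. A minimal sanity check, as a sketch (the helper name is illustrative, not part of the component):

```python
from urllib.parse import parse_qs, urlparse

def has_api_version(base_url: str) -> bool:
    """Return True if the endpoint URL carries the api-version query
    parameter that Azure AI model endpoints require."""
    return "api-version" in parse_qs(urlparse(base_url).query)

# The example endpoint from the post passes the check:
print(has_api_version(
    "https://ai-xxx.services.ai.azure.com/models/chat/completions"
    "?api-version=2024-05-01-preview"
))  # True
```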
1 Like