I am trying to run Ollama (deepseek-r1:14b) on my workstation and integrate it with Home Assistant, which is hosted on a mini PC.
I have managed to create the voice assistant pipeline, but I get the error "Unexpected error during intent recognition" every time I try to reach the DeepSeek model.
Am I missing a setting somewhere? My goal is to use it to control my smart home and for simple local queries (tell me a recipe for X, who was X, what will the weather be like on Monday in X, etc.). I also have a Voice Assistant PE, but first I want to get it working from the browser/app.
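Before looking at the pipeline itself, it's worth confirming that Home Assistant on the mini PC can actually reach the Ollama server on the workstation over the LAN. A minimal sketch of that check, assuming the standard Ollama REST API on port 11434 (the host IP below is a placeholder); note that by default Ollama only listens on 127.0.0.1, so the server may need to be started with OLLAMA_HOST=0.0.0.0 to accept connections from another machine:

```python
# Sketch: query the Ollama server's /api/tags endpoint (the standard way
# to list pulled models) the same way a remote client such as Home
# Assistant would. Host IP is a placeholder for the workstation's address.
import json
import urllib.request


def model_names(tags_payload: dict) -> list[str]:
    """Extract model names from an /api/tags response payload."""
    return [m["name"] for m in tags_payload.get("models", [])]


def list_models(host: str = "http://192.168.1.50:11434") -> list[str]:
    """Fetch the list of models the Ollama server has pulled."""
    with urllib.request.urlopen(f"{host}/api/tags", timeout=5) as resp:
        return model_names(json.load(resp))


# Example usage (run from the mini PC, substituting the real address):
#   list_models("http://<workstation-ip>:11434")
```

If this call times out from the mini PC but works on the workstation itself, the problem is the bind address or a firewall, not the pipeline configuration.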
Logger: homeassistant.components.assist_pipeline.pipeline
Source: components/assist_pipeline/pipeline.py:1165
integration: Assist pipeline (documentation, issues)
First occurred: March 8, 2025 at 11:50:58 PM (9 occurrences)
Last logged: 12:10:36 AM
Unexpected error during intent recognition
Traceback (most recent call last):
File "/usr/src/homeassistant/homeassistant/components/assist_pipeline/pipeline.py", line 1165, in recognize_intent
conversation_result = await conversation.async_converse(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
…<8 lines>…
)
^
File "/usr/src/homeassistant/homeassistant/components/conversation/agent_manager.py", line 117, in async_converse
result = await method(conversation_input)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/conversation/entity.py", line 47, in internal_async_process
return await self.async_process(user_input)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/ollama/conversation.py", line 219, in async_process
return await self._async_handle_message(user_input, chat_log)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/ollama/conversation.py", line 276, in _async_handle_message
[
…<4 lines>…
]
File "/usr/src/homeassistant/homeassistant/components/conversation/chat_log.py", line 277, in async_add_delta_content_stream
async for delta in stream:
^^^^^^^^^^^^^^^^^^^^^^^^^^
…<45 lines>…
self.delta_listener(self, delta)  # type: ignore[arg-type]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/ollama/conversation.py", line 147, in _transform_stream
async for response in result:
…<18 lines>…
yield chunk
File "/usr/local/lib/python3.13/site-packages/ollama/_client.py", line 672, in inner
raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: registry.ollama.ai/library/deepseek-r1:7b does not support tools (status code: 400)
There's no single list. You have to check whether a model is capable of "tool use" before wiring it into HA; that's listed on Hugging Face. The error itself is telling you the problem: deepseek-r1:7b does not support tools (and note the traceback shows your pipeline is actually using the 7b variant, not the 14b you mentioned). Also, I've found that models with fewer than 7B parameters just don't cut it for HA.
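Besides checking the model card, the capability check above can be done against the Ollama server directly. A minimal sketch, assuming a recent Ollama release whose /api/show response includes a "capabilities" list (e.g. ["completion", "tools"]); the host IP is a placeholder:

```python
# Sketch: probe a model's advertised capabilities via Ollama's /api/show
# endpoint, so tool support can be verified before configuring the model
# in Home Assistant. Assumes the "capabilities" field present in recent
# Ollama releases; the default host address below is a placeholder.
import json
import urllib.request


def supports_tools(show_payload: dict) -> bool:
    """True if an /api/show payload advertises the 'tools' capability."""
    return "tools" in show_payload.get("capabilities", [])


def check_model(model: str, host: str = "http://192.168.1.50:11434") -> bool:
    """Ask the Ollama server whether the given model supports tool calling."""
    req = urllib.request.Request(
        f"{host}/api/show",
        data=json.dumps({"model": model}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return supports_tools(json.load(resp))


# Example usage: check_model("deepseek-r1:7b") — given the 400 error in
# the traceback above, this model would not advertise "tools".
```

A model that fails this check will trigger exactly the ResponseError shown in the log whenever HA sends it tool definitions for intent handling.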