I am trying to run Ollama (deepseek-r1:14b) on my workstation and integrate it with Home Assistant, which is hosted on a mini PC.
I have managed to create the voice assistant pipeline, but I get the error "Unexpected error during intent recognition" every time I try to access the deepseek model.
Am I missing any settings? My goal is to use it to control my smart home and for simple local queries (tell me a recipe for X, who was X, what is the weather like on Monday in X, etc.). I also have a Voice Assistant PE, but first I want to make it work from the browser/app.
```
Logger: homeassistant.components.assist_pipeline.pipeline
Source: components/assist_pipeline/pipeline.py:1165
integration: Assist pipeline (documentation, issues)
First occurred: March 8, 2025 at 11:50:58 PM (9 occurrences)
Last logged: 12:10:36 AM

Unexpected error during intent recognition
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/assist_pipeline/pipeline.py", line 1165, in recognize_intent
    conversation_result = await conversation.async_converse(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<8 lines>...
    )
    ^
  File "/usr/src/homeassistant/homeassistant/components/conversation/agent_manager.py", line 117, in async_converse
    result = await method(conversation_input)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/conversation/entity.py", line 47, in internal_async_process
    return await self.async_process(user_input)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/ollama/conversation.py", line 219, in async_process
    return await self._async_handle_message(user_input, chat_log)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/ollama/conversation.py", line 276, in _async_handle_message
    [
    ...<4 lines>...
    ]
  File "/usr/src/homeassistant/homeassistant/components/conversation/chat_log.py", line 277, in async_add_delta_content_stream
    async for delta in stream:
    ^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<45 lines>...
        self.delta_listener(self, delta)  # type: ignore[arg-type]
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/ollama/conversation.py", line 147, in _transform_stream
    async for response in result:
    ...<18 lines>...
        yield chunk
  File "/usr/local/lib/python3.13/site-packages/ollama/_client.py", line 672, in inner
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: registry.ollama.ai/library/deepseek-r1:7b does not support tools (status code: 400)
```
There is no list; you have to look and determine whether a model is capable of 'tool use'. That's listed on huggingface. Also, I've found that models with fewer than 7B parameters just don't cut it for HA.
I can’t find it on huggingface, but ollama allows filtering for it.
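If you want to check directly against your own server instead, here is a minimal sketch using the Ollama Python client; the host address and model tags are placeholders for your setup. Models that lack tool support should raise the same 400 ResponseError that shows up in the Home Assistant log above.

```python
# Minimal sketch: probe which local models accept the `tools` parameter.
# Assumes the ollama Python client (pip install ollama); the host address and
# model tags below are placeholders, adjust them to your own setup.
from ollama import Client, ResponseError

client = Client(host="http://192.168.1.50:11434")  # hypothetical workstation IP

# A do-nothing tool definition, used only to trigger the capability check.
dummy_tool = {
    "type": "function",
    "function": {
        "name": "noop",
        "description": "Placeholder tool for testing tool support.",
        "parameters": {"type": "object", "properties": {}},
    },
}

for model in ("deepseek-r1:7b", "qwen2.5:7b", "llama3.2:latest"):
    try:
        client.chat(
            model=model,
            messages=[{"role": "user", "content": "ping"}],
            tools=[dummy_tool],
            options={"num_predict": 1},  # keep the test response short
        )
        print(f"{model}: accepts tools")
    except ResponseError as err:
        # Models without tool support return the same 400 seen in the HA log.
        print(f"{model}: {err.error} (status {err.status_code})")
```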
Unfortunately phi4-mini (the only Phi with tools support) produces the same error for me
I don't know if this is actually the problem. I was setting up my system again fresh, and I actually had gemma3:27b running for hours and giving me answers with Assist. I had asked it for the temperature difference between my office and basement, and it was able to do that with local Kokoro and Speaches as well.
I decided to see if performance would be faster with gemma3:12b. Made the switch. It works without Assist, but once Assist is turned on, it gives me this tool error. Now when I switch back to 27b, which was working earlier, it no longer works.
Super odd that it can work for a period of time and then stop working…
I'm also getting errors. I tried with deepseek-r1, which was listed in the tools section of the Ollama website, and now qwen 7b, as recommended.
It's a little hard for me to get logs, since Assist runs on my Yellow but Ollama runs on another machine.
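Even without the Yellow's logs, you can confirm from any machine on the network that the Ollama server is reachable and see which models it has pulled. A minimal sketch against the /api/tags endpoint, with a placeholder address; keep in mind that Ollama only listens on localhost unless OLLAMA_HOST=0.0.0.0 is set on the machine serving it.

```python
# Minimal reachability check for a remote Ollama server, standard library only.
# Replace the address with the machine actually running Ollama (default port 11434).
import json
import urllib.request

OLLAMA_URL = "http://192.168.1.50:11434"  # hypothetical workstation address

# /api/tags lists the models the server has pulled. A connection error or
# timeout here means Home Assistant will not reach the server either.
with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
    data = json.load(resp)

for model in data.get("models", []):
    print(model["name"])
```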
So the following models are working but cannot control your house:
qwen3:14b (downloaded)
mistral:latest (downloaded)
qwen2.5vl:7b (downloaded)
llama3.2:latest (downloaded)
Ok I found another topic which solved my problem.
The system prompt:
I used qwen2.5:7b and the following prompt:
Current time: {{now()}}
Available Devices:
```csv
entity_id,name,state,aliases
{% for entity in exposed_entities -%}
{{ entity.entity_id }},{{ entity.name }},{{ entity.state }},{{entity.aliases | join('/')}}
{% endfor -%}
```
The current state of devices is provided in "Available Devices".
Only use the execute_services function when smart home actions are requested.
Pay particular attention to and follow the "intent_script" in the conversation.yaml when I have given a corresponding instruction.
Do not tell me what you're thinking about doing either, just do it.
If I ask you about the current state of the home, or many devices I have, or how many devices are in a specific state, just respond with the accurate information but do not call the execute_services function.
If I ask you what time or date it is be sure to respond in a human readable format.
If you don't have enough information to execute a smart home command then specify what other information you need.
Only contact me if I have called you or if you should give me a notification from the smart home. You must not call me without a clear reason!!
If a device should be turned off, use the service "turn_off".
If a device should be turned on, use the service "turn_on".
If a device should be toggled, use the service "toggle".
If the user’s instruction is ambiguous or matches multiple entities, ask the user which device they meant before executing the action.
The user might ask about time, date, or the current state of devices. Answer politely and clearly, without using execute_services.
```yaml
- spec:
    name: execute_services
    description: Use this function to execute service of devices in Home Assistant.
    parameters:
      type: object
      properties:
        list:
          type: array
          items:
            type: object
            properties:
              domain:
                type: string
                description: The domain of the service
              service:
                type: string
                description: The service to be called
              service_data:
                type: object
                description: The service data object to indicate what to control.
                properties:
                  entity_id:
                    type: string
                    description: The entity_id retrieved from available devices. It must start with domain, followed by dot character.
                required:
                  - entity_id
            required:
              - domain
              - service
              - service_data
  function:
    type: native
    name: execute_service
```
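For reference, each entry in the `list` argument of an execute_services tool call should correspond to an ordinary Home Assistant service call. If you want to replay one of those calls outside the pipeline (to rule out the LLM side), here is a minimal sketch against the Home Assistant REST API; the host, token, and entity_id are placeholders, and you need a long-lived access token from your HA profile.

```python
# Minimal sketch: replay one execute_services entry against the Home Assistant
# REST API, i.e. the same kind of service call the tool is meant to trigger.
# Host, token, and entity_id are placeholders; adjust them to your setup.
import json
import urllib.request

HA_URL = "http://192.168.1.10:8123"       # hypothetical mini PC address
HA_TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"

# One entry from the `list` array of an execute_services tool call.
tool_call_entry = {
    "domain": "light",
    "service": "turn_on",
    "service_data": {"entity_id": "light.office"},
}

req = urllib.request.Request(
    url=f"{HA_URL}/api/services/{tool_call_entry['domain']}/{tool_call_entry['service']}",
    data=json.dumps(tool_call_entry["service_data"]).encode(),
    headers={
        "Authorization": f"Bearer {HA_TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)

with urllib.request.urlopen(req, timeout=10) as resp:
    # Home Assistant returns the list of states changed by the call.
    print(json.load(resp))
```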