Hi all
I restarted my HA server after some integration updates (nothing to do with OpenAI). Since then I keep getting errors from my conversation assistant.
I’ve read a bit about others having this issue, but most cases seem to stem from an incompatible model being used. I’ve run through some troubleshooting steps with OpenAI (confirming the 4o-mini model I’m using is allowed and my account is still functioning) and also tried other models, but I get the same result. The Extended OpenAI Conversation agent does work, but it doesn’t seem as reliable as the standard one (e.g. it can’t dynamically set a timer by stating the target time instead of a duration).
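For reference, this is roughly how I checked the key and model outside of HA (a quick sketch using the official openai Python package, nothing HA-specific):

```python
# Quick sanity check of the OpenAI key and model, independent of Home Assistant.
# Assumes the official `openai` package and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

# Is the model visible to this account at all?
model_ids = [m.id for m in client.models.list()]
print("gpt-4o-mini listed:", "gpt-4o-mini" in model_ids)

# Does a minimal request with that model succeed?
resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say OK"}],
)
print(resp.choices[0].message.content)
```

Both checks came back fine, which is why I don’t think the key or model is the problem.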
I don’t know what to do next. Can anyone help?
Thanks
A
Thanks for the help Nathan
I’ll look for that Friday’s Party post and see if I can find something.
This is what I got from the logs (hoping I got the right ones):
2025-10-01 15:40:24.643 ERROR (MainThread) [homeassistant.components.assist_pipeline.pipeline] Unexpected error during intent recognition
Traceback (most recent call last):
File "/usr/src/homeassistant/homeassistant/components/assist_pipeline/pipeline.py", line 1267, in recognize_intent
conversation_result = await conversation.async_converse(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...<8 lines>...
)
^
File "/usr/src/homeassistant/homeassistant/components/conversation/agent_manager.py", line 112, in async_converse
result = await method(conversation_input)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/conversation/entity.py", line 55, in internal_async_process
return await self.async_process(user_input)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/conversation/entity.py", line 68, in async_process
return await self._async_handle_message(user_input, chat_log)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/openai_conversation/conversation.py", line 84, in _async_handle_message
await self._async_handle_chat_log(chat_log)
File "/usr/src/homeassistant/homeassistant/components/openai_conversation/entity.py", line 534, in _async_handle_chat_log
stream = await client.responses.create(**model_args)
^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.13/functools.py", line 1026, in __get__
val = self.func(instance)
File "/usr/local/lib/python3.13/site-packages/openai/_client.py", line 572, in responses
ModuleNotFoundError: No module named 'openai.resources.responses'
Yeah, that’s something wrong with the setup. OK, let’s do this: rebuild the connection to OpenAI in the integration, then set up one conversation agent and accept only the defaults. Switch your Assist pipeline to use that agent and fire it up. That’s the bare-bones minimum. If that works, it’s something unique to your config of that conversation agent or your prompt. If not, it’s more fundamental.
OK. I deleted everything I had in the OpenAI integration, added a new one, and set up a new API key in OpenAI. Unfortunately, same result. Yet the connection works through the Extended OpenAI Conversation agent. Do you have any ideas I can research?
2025-10-03 18:02:36.612 ERROR (MainThread) [homeassistant.components.assist_pipeline.pipeline] Unexpected error during intent recognition
Traceback (most recent call last):
File "/usr/src/homeassistant/homeassistant/components/assist_pipeline/pipeline.py", line 1267, in recognize_intent
conversation_result = await conversation.async_converse(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...<8 lines>...
)
^
File "/usr/src/homeassistant/homeassistant/components/conversation/agent_manager.py", line 112, in async_converse
result = await method(conversation_input)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/conversation/entity.py", line 55, in internal_async_process
return await self.async_process(user_input)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/conversation/entity.py", line 68, in async_process
return await self._async_handle_message(user_input, chat_log)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/openai_conversation/conversation.py", line 84, in _async_handle_message
await self._async_handle_chat_log(chat_log)
File "/usr/src/homeassistant/homeassistant/components/openai_conversation/entity.py", line 534, in _async_handle_chat_log
stream = await client.responses.create(**model_args)
^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.13/functools.py", line 1026, in __get__
val = self.func(instance)
File "/usr/local/lib/python3.13/site-packages/openai/_client.py", line 572, in responses
ModuleNotFoundError: No module named 'openai.resources.responses'
It’s possible that this is a bug in the openai Python package. The package is there according to the logs (it found _client.py), but a call within the package is trying to import a submodule (openai.resources.responses) and can’t find it. You may want to post the error to HA core just to let someone know, and maybe they can comment on it.
BTW, I tried this on my system and don’t get this error, so I’m not sure why you’re hitting it.
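If you want to dig further, something like this run from a Python shell inside the Home Assistant container should show which openai package version is installed and whether the submodule from the traceback actually exists on disk (just a sketch; how you get a shell into the container depends on your install type):

```python
# Check the installed openai package and whether the submodule from the traceback exists.
# Run from a Python shell inside the Home Assistant container.
import importlib.metadata
import importlib.util

print("openai version:", importlib.metadata.version("openai"))

spec = importlib.util.find_spec("openai.resources.responses")
print("openai.resources.responses importable:", spec is not None)
# If this prints False, the package install is likely stale or partially upgraded;
# a reinstall/rebuild of the container should put the module back.
```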
Thinking aloud… Do you have a lot of scripts or intents registered in your system?
You can get a variant of this error if you have:
- more than 128 tools (scripts + intents) registered
- any single tool whose description overflows 1024 characters
- non-JSON-safe characters or structures in any of those descriptions
- more than 128 characters in the description of any one of a tool’s input fields
- invalid data in the intent_script section of your config (or any files included under it)

Basically, anything that functionally borks Assist as it comes up returns that (a rough way to check those limits is sketched below).
The fact that the custom OAI integration works leads me to believe it’s one of those things unique to the core integration, and that is how they handle tools: core is way different from the custom OAI one, so I’d look there.
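If you want to check those description limits without eyeballing every script, here’s a rough sketch of a scan you could run against scripts.yaml. The 1024/128 limits are the ones listed above; the /config path and YAML layout are assumptions for a default install, so adjust to taste:

```python
# Rough scan of scripts.yaml for descriptions that exceed the limits mentioned above.
# Assumes a default /config layout and PyYAML; adjust the path for your install.
import yaml

MAX_TOOL_DESC = 1024   # max chars for a tool (script) description
MAX_FIELD_DESC = 128   # max chars for any single input field description

with open("/config/scripts.yaml") as f:
    scripts = yaml.safe_load(f) or {}

for name, script in scripts.items():
    desc = (script or {}).get("description", "")
    if len(desc) > MAX_TOOL_DESC:
        print(f"script.{name}: description is {len(desc)} chars")
    for field, cfg in ((script or {}).get("fields") or {}).items():
        field_desc = (cfg or {}).get("description", "")
        if len(field_desc) > MAX_FIELD_DESC:
            print(f"script.{name} field '{field}': description is {len(field_desc)} chars")

print(f"{len(scripts)} scripts found (the 128-tool ceiling counts scripts plus intents)")
```

A similar pass over any intent_script entries would cover the other half.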
I’ve got fewer than 10 scripts, but 91 automations, if that contributes as well. I’m not sure what is classed as an intent. Sorry, still a noob when it comes to this stuff.
I don’t think I have anything with that many characters, but I’d need to check, and I’m not sure how I could do that easily. And there’s nothing I’ve added since I restarted and started experiencing this problem.
Haha… this gift un-bug (love the saying) didn’t last long. But I did notice a pattern. I installed 2025.10.1, as I always wait for the .1 release to play it a bit safer. After I restarted, the same symptoms appeared. So I did another restart and then it started working again. It seems very temperamental about coming up right after a restart. But at least I have a potential workaround until I get it sorted properly.
Although I now get anxiety every time an update comes out that may need a restart.
OK, mine did that for a bit and I couldn’t find the cause. I’ll run through my notes and try to figure out what I was doing when that was happening… It was strange.
It’s a bit hit and miss. I still have the problem occasionally but just haven’t had time to really get into it and fault-find. I did move the VPE, which helped, so I think it was sitting in a bit of a Wi-Fi dead spot that was causing a lot of the errors. I also have both the OpenAI Conversation integration and the Extended OpenAI Conversation one, each on its own wake word, so if one doesn’t work I use the other. This has solved most problems, but it’s not a proper solution and I still don’t know why it occasionally just doesn’t work.
I have this issue again. It was fixed in 2026.1, then for no reason I can find it borked again. I can’t put my finger on when, though. If it doesn’t work we go back to Google, LOL. Not helpful, but I’m too busy to look into things immediately.
Just as an FYI, this is happening to me with GPT and also with HA Cloud with no AI.