How do I troubleshoot the Ollama integration?

I have installed the Ollama component (core/homeassistant/components/ollama at dev · home-assistant/core · GitHub) and have it working: when I open Assist and choose my agent to speak with, I am able to control HA with commands such as “turn off the light”.

However, not everything is working; sometimes requests hang, and I’d like to look into how the data is flowing under the covers.

How can I troubleshoot this? I’d like to see:

  • What text was input to the conversation agent
  • What was sent to Ollama
  • What the response was
  • What the response time was
  • Where I can further optimize
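I’m guessing that at least some of this would show up in the logs if I turn up logging for the right components. Something like this in configuration.yaml, using the logger integration (the two domain names below are my guesses at the right ones):

```yaml
# configuration.yaml — logger domains below are assumptions, not confirmed
logger:
  default: warning
  logs:
    homeassistant.components.ollama: debug
    homeassistant.components.conversation: debug
```

Is that the right approach, or is there a better place to watch the request/response traffic?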

I stumbled across a debug view of the agent while searching, but have since lost it and can’t find it again. It showed the text sent to Ollama and the response. Just finding that again would be a good start.

I have also connected a bot in Nextcloud Talk that sends input to the conversation agent, and it is not working well. I want to see things flowing end to end to figure out where they are failing. With so many pieces involved, I’m not sure how to proceed.
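One thing I was considering, to take Nextcloud Talk out of the loop, is calling the agent directly from Developer Tools → Actions with conversation.process. Something like the following, where the agent_id is a placeholder for whatever my Ollama agent’s entity ID actually is:

```yaml
# Developer Tools → Actions, YAML mode
# agent_id below is a hypothetical example — substitute the real entity ID
action: conversation.process
data:
  text: "turn off the light"
  agent_id: conversation.my_ollama_agent
```

If that works reliably while the Nextcloud Talk path doesn’t, I’d at least know the problem is upstream of the conversation agent. Does that reasoning hold?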