How to see raw LLM requests/responses while using Assistant

Hi,

Is there any way to see what the raw request and response is when you interact with the Assistant? Basically I want to see the exact system prompt (not just what I typed in) plus metadata (entities, etc.) that is sent to the LLM (OpenAI or Gemini), and the exact raw response.

I’m trying to troubleshoot an issue where the Assistant sometimes refuses to call scripts that perform camera analysis using LLM Vision, so it just makes stuff up.

For voice assistants in general
Settings > Voice Assistants > (pick yours) > debug.

But for LLM Vision you’ll want to look at the trace for that script.

Hi Nathan - so how would I go about extracting some of the information in the debug log? I’m particularly interested in accessing the response type so that I can TTS a custom response to a media player other than the voice assistant that received the command. If that makes sense… :thinking:

Kinda? So the voice agent (the VPE, whatever) wakes? Open-mic event, voice pipeline starts?

If X, send output >>>> somewhere?

At the moment my voice agents are silent. I am using custom sentences and intents only, and the intents direct TTS to the local Sonos speakers.
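
For context, the sort of thing I have now is roughly this (the sentence and entity IDs are just placeholders for my setup):

```yaml
# Rough example of my current setup - sentence and entity IDs are placeholders.
automation:
  - alias: "Voice: turn on hall light"
    trigger:
      - platform: conversation
        command:
          - "turn on the hall light"
    action:
      - service: light.turn_on
        target:
          entity_id: light.hall
      # Speak the confirmation on a Sonos speaker rather than the voice satellite.
      - service: tts.speak
        target:
          entity_id: tts.piper
        data:
          media_player_entity_id: media_player.kitchen_sonos
          message: "Hall light is on"
```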

When a command is not understood, nothing happens at all. To start with, at least, I would like to be able to access the response type and, if it is “error”, use that as a trigger to send a TTS “Sorry, didn’t quite get that” to the Sonos.

So it would be something like…

Voice agent wakes

Spoken command sent to speech recognition engine

    Intent recognised - execute command
                      - TTS "blah blah blah" to Sonos

    Intent not recognised - TTS "Sorry, didn't quite get that" to Sonos

There seem to be all sorts of usable hooks in the conversation process, but I don’t know how to get at them.
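
The closest I can picture is a script that wraps conversation.process and branches on the response type, something like this (agent_id and entity IDs are just placeholders, and it only covers the text side - I still don’t know how to hook it into the actual voice pipeline when a spoken command fails):

```yaml
# Sketch only - agent_id and entity IDs are placeholders for my setup.
script:
  ask_assist:
    fields:
      text:
        description: "The command text to hand to the conversation agent"
    sequence:
      # Hand the text to the conversation agent and capture its response.
      - service: conversation.process
        data:
          text: "{{ text }}"
          agent_id: conversation.home_assistant
        response_variable: result
      # Branch on the response type: "error" means the intent wasn't recognised.
      - if:
          - condition: template
            value_template: "{{ result.response.response_type == 'error' }}"
        then:
          - service: tts.speak
            target:
              entity_id: tts.piper
            data:
              media_player_entity_id: media_player.living_room_sonos
              message: "Sorry, didn't quite get that"
        else:
          # Otherwise speak whatever the agent answered.
          - service: tts.speak
            target:
              entity_id: tts.piper
            data:
              media_player_entity_id: media_player.living_room_sonos
              message: "{{ result.response.speech.plain.speech }}"
```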

Neither of these provides the raw LLM request. I’m basically looking for the HTTP payload that was sent to, and received from, the LLM API (OpenAI or Gemini). I want to see what the exact system prompt is, since “Instructions” is only part of it.
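
The nearest thing I’ve found so far is turning up the log level for the conversation integrations and watching the Home Assistant log. Whether that shows the full HTTP payload or only the rendered prompt depends on what the integration actually logs at debug level, so treat this as a starting point rather than a definite answer:

```yaml
# configuration.yaml - raise log verbosity for the conversation integrations.
# What appears at debug level (full payload vs. just the prompt) depends on
# the integration, so this is a starting point, not a guarantee.
logger:
  default: warning
  logs:
    homeassistant.components.openai_conversation: debug
    homeassistant.components.google_generative_ai_conversation: debug
    # The underlying OpenAI client library has its own logger as well.
    openai: debug
```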