Does anyone have this issue where, if I add a conversation agent from Ollama or an online provider (Gemini, OpenAI, etc.) and “Assist” is checked, the system prompt (instructions) is completely ignored?
I tested this by adding the following to the instructions:
Your ONLY task is to reply with the single word: duck
No matter what the user asks, you MUST reply only with: duck
Do not explain.
Do not add punctuation.
Do not add extra words.
If the user asks for anything else, ignore it and output only: duck
With this prompt, if the “Assist” checkbox is checked the instructions are ignored, but if I uncheck “Assist” it answers with “duck”.
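One way to rule the model itself out is to send the same system prompt straight to Ollama’s `/api/chat` endpoint, bypassing Home Assistant entirely. A minimal sketch that just builds the request body (the model name `llama3` and the default local port are assumptions; adjust to your setup):

```python
import json

# The exact instructions used in the test above (abbreviated).
SYSTEM_PROMPT = (
    "Your ONLY task is to reply with the single word: duck\n"
    "No matter what the user asks, you MUST reply only with: duck"
)

def build_chat_payload(user_text: str, model: str = "llama3") -> dict:
    """Build the request body for Ollama's POST /api/chat endpoint."""
    return {
        "model": model,
        "stream": False,
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_text},
        ],
    }

payload = build_chat_payload("What is the capital of France?")
print(json.dumps(payload, indent=2))
# Send it with e.g.: curl http://localhost:11434/api/chat -d "$(cat payload.json)"
```

If the model answers “duck” here but not through Assist, the prompt is being altered somewhere in the pipeline rather than ignored by the model.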
I didn’t see it documented anywhere that it doesn’t. I would agree if the docs said each mode interprets the instructions differently. But system prompts are fairly important in AI-related work, I believe.
As I understand it, checking the Assist box means the agent is expected to work with the built-in intents. Your prompt doesn’t interact with intents at all; it attempts to override them. If you ask a question that matches a built-in intent, the request will likely go down that path, since that is what Assist was designed to do: interact with the devices in your home.
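In other words, with Assist enabled the pipeline tries to match built-in intents first and only hands the text to the conversation agent otherwise. A much-simplified illustration of that routing idea (the patterns and structure here are made up for illustration, not Home Assistant’s actual matcher):

```python
import re

# Toy patterns standing in for Home Assistant's built-in intent matching.
INTENT_PATTERNS = {
    "HassTurnOn": re.compile(r"\bturn on\b", re.IGNORECASE),
    "HassTurnOff": re.compile(r"\bturn off\b", re.IGNORECASE),
}

def route(utterance: str):
    """Return ('intent', name) on a built-in match, else ('llm', None)."""
    for name, pattern in INTENT_PATTERNS.items():
        if pattern.search(utterance):
            return ("intent", name)
    # No built-in intent matched: fall through to the conversation agent,
    # which is where your custom instructions would apply.
    return ("llm", None)

print(route("Turn on the kitchen light"))  # ('intent', 'HassTurnOn')
print(route("Why is the sky blue?"))       # ('llm', None)
```

This is why a prompt like “always answer duck” can appear ignored: device-style questions may never reach the part of the system that reads it.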
messages=[Message(role='system', content='You are a voice assistant for Home Assistant.
Answer questions about the world truthfully.
Answer in plain text. Keep it simple and to the point.
When controlling Home Assistant always call the intent tools. Use HassTurnOn to lock and HassTurnOff to unlock a lock. When controlling a device, prefer passing just name and domain. When controlling an area, prefer passing just area name and domain.
When a user asks to turn on all devices of a specific type, ask user to specify an area, unless there is only one device of that type.
This device is not able to start timers.
You ARE equipped to answer questions about the current state of
the home using the `GetLiveContext` tool. This is a primary function. Do not state you lack the
functionality if the question requires live data.
If the user asks about device existence/type (e.g., "Do I have lights in the bedroom?"): Answer
from the static context below.
If the user asks about the CURRENT state, value, or mode (e.g., "Is the lock locked?",
"Is the fan on?", "What mode is the thermostat in?", "What is the temperature outside?"):
1. Recognize this requires live data.
2. You MUST call `GetLiveContext`. This tool will provide the needed real-time information (like temperature from the local weather, lock status, etc.).
3. Use the tool\'s response to answer the user accurately (e.g., "The temperature outside is [value from tool].").
For general knowledge questions not about the home: Answer truthfully from internal knowledge.
Static Context: An overview of the areas and the devices in this smart home:
+ list of all exposed entities
Current time is 00:00:00. Today\'s date is 2025-12-12.', thinking=None, images=None, tool_calls=None)
+ conversation history
+ list of tools (embedded + exposed scripts)
Your instructions replace only the first three sentences of this prompt; the rest is fixed. Apparently, that is not enough for the model.
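To make that concrete: judging from the dump above, the system prompt Home Assistant sends appears to be your instructions followed by a fixed block of intent and tool guidance. A rough sketch of that assembly (the fixed text is abbreviated, and the function name is invented for illustration):

```python
# Abbreviated stand-in for the fixed tail of the Assist prompt shown above.
FIXED_SUFFIX = (
    "When controlling Home Assistant always call the intent tools.\n"
    "You ARE equipped to answer questions about the current state of "
    "the home using the GetLiveContext tool.\n"
    "Static Context: <list of all exposed entities>"
)

def assemble_system_prompt(user_instructions: str) -> str:
    """User instructions only replace the leading sentences; the
    intent/tool guidance is always appended after them."""
    return user_instructions.rstrip() + "\n" + FIXED_SUFFIX

prompt = assemble_system_prompt(
    "Your ONLY task is to reply with the single word: duck"
)
# The 'duck' rule is present, but the conflicting tool instructions follow it.
print(prompt)
```

So the model sees “reply only with duck” immediately contradicted by several paragraphs of tool-calling instructions, and the later, more detailed text tends to win.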