This integration uses OpenClaw's OpenAI-compatible chat endpoint. In principle it should also work with any other OpenAI-style API, including self-hosted models served by Ollama or LM Studio.
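As a rough illustration of what "OpenAI-compatible" means here, the request body follows the standard OpenAI chat-completions shape. This is a minimal sketch; the base URL, token, and model name are placeholders, and the `/v1/chat/completions` path is assumed from the standard OpenAI API (it matches the `chatCompletions` endpoint name used by the OpenClaw gateway config).

```python
import json

def build_chat_request(model: str, user_text: str) -> dict:
    """Build an OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
    }

body = build_chat_request("local-model", "Turn on the living room lights")
print(json.dumps(body))

# To actually send it (requires the OpenAI-compatible API to be
# enabled on the gateway; base_url and token are placeholders):
# requests.post(f"{base_url}/v1/chat/completions", json=body,
#               headers={"Authorization": f"Bearer {token}"})
```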
What it does
Adds an OpenClaw chat card you can place on any dashboard
Lets you send messages via service calls from automations
Supports optional tool-call execution for Home Assistant services
Supports wake word + always-on voice mode options when using Browser mode (continuous audio is not supported for STT)
Supports 32 languages via the browser voice API
Current highlights
Improved frontend auto-registration and resource handling
Better response parsing across different payload formats
Chat history restore after leaving and returning to dashboard
Better handling of speech recognition edge cases
Voice language selection now prioritizes Assist pipeline languages, then falls back to the HA-configured language
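The language-selection order in the last highlight can be sketched as a simple fallback chain. This is illustrative only; the function name and the hard-coded `"en"` default are assumptions, not the integration's actual code.

```python
def pick_voice_language(pipeline_language, ha_language, default="en"):
    """Prefer the Assist pipeline's language, then the HA-configured
    language, then a fixed default (assumed here to be English)."""
    for candidate in (pipeline_language, ha_language):
        if candidate:
            return candidate
    return default

print(pick_voice_language("de-DE", "en"))   # pipeline language wins
print(pick_voice_language(None, "fr"))      # falls back to HA language
```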
Installation
Install via HACS (recommended) by manually adding the repository
Restart Home Assistant
Add OpenClaw from Settings → Devices & Services
Add the custom card to a dashboard
Notes
Voice input depends on browser speech APIs; behavior can differ by browser. Assist pipeline STT has some limitations.
If you see browser console errors from other custom cards/extensions, they may be unrelated to OpenClaw
In Brave, the Web Speech API is missing and may remain unreliable until it is properly implemented; there is now an integration option to allow experimental use, but since the API is currently missing, it still does not work.
Feedback requested
I’d love feedback on:
Stability across HA versions
Voice behavior by browser/device
Context/tool-call defaults
UX improvements for the chat card
If you try it, please share your HA version, browser, and any logs/screenshots, especially if you run into issues.
Thanks!
Thanks for this plugin. I am not able to get this to work with a remote instance of OC. I always get:
The gateway returned an unexpected response — the OpenAI-compatible API is likely disabled. In the OpenClaw addon settings enable ‘enable_openai_api’, restart the addon, and try again.
I already configured HA so that it can access the remote instance via localhost, but that did not help. I also do not see a way to ‘enable_openai_api’, since this option is not in the setup window and I can’t get past the window with the error message.
On your remote OpenClaw installation (not one run by the OpenClaw HA addon) you have to enable the API. You can do this by running in a terminal: openclaw config set gateway.http.endpoints.chatCompletions.enabled true.
Alternatively, you can manually edit openclaw.json, which I don’t advise, as you may accidentally break it.
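If you do want to inspect the file rather than edit it, the CLI key path suggests a nested JSON layout. The sketch below assumes openclaw.json mirrors the dotted path `gateway.http.endpoints.chatCompletions.enabled` exactly; that layout is inferred from the command above, not verified against the real file.

```python
import json
from pathlib import Path

def chat_api_enabled(config_path: str) -> bool:
    """Check whether the OpenAI-compatible chat endpoint flag is set,
    assuming the config nests keys the same way the CLI path does."""
    cfg = json.loads(Path(config_path).read_text())
    return bool(
        cfg.get("gateway", {})
        .get("http", {})
        .get("endpoints", {})
        .get("chatCompletions", {})
        .get("enabled", False)
    )
```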
Added HTTPS / SSL support for connecting to OpenClaw gateways running in lan_https mode or behind TLS reverse proxies.
Auto-discovery now detects access_mode: lan_https and connects to the internal gateway port automatically (no certificate setup needed for local addons).
Added Verify SSL certificate option in manual config for self-signed certificate environments.
Added ssl_error config flow error with actionable guidance.
Added comprehensive remote connection documentation to README with setup table for all access modes.
Fixed
Fixed “400 Bad Request — plain HTTP request was sent to HTTPS port” when the addon uses lan_https access mode.
Event entities (event.openclaw_message_received, event.openclaw_tool_invoked) — native HA EventEntity entities that fire on each assistant reply and tool invocation result. Selectable in the automation UI without YAML.
Button entities — dashboard-friendly buttons for common actions:
Clear History — clears in-memory conversation history
Sync History — triggers a backend coordinator refresh
Run Diagnostics — fires a connectivity check against the gateway
Select entity (select.openclaw_active_model) — exposes the list of available models from the gateway’s /v1/models endpoint, allowing model switching from the HA dashboard. Selection is persisted in config entry options.
Coordinator now caches the full model list (not just the first model) and exposes it via coordinator.available_models.
The integration now provides much more user-friendly access to automations, without requiring raw YAML configuration.
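The model list behind the new select entity comes from the gateway's /v1/models endpoint. Assuming a standard OpenAI-style response shape (`{"object": "list", "data": [{"id": ...}, ...]}`), extracting the available model ids might look like this sketch; the function name is illustrative, not the integration's actual code.

```python
def extract_model_ids(models_payload: dict) -> list:
    """Pull model ids from an OpenAI-style /v1/models response."""
    return [m["id"] for m in models_payload.get("data", []) if "id" in m]

sample = {"object": "list", "data": [{"id": "model-a"}, {"id": "model-b"}]}
print(extract_model_ids(sample))  # -> ['model-a', 'model-b']
```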
Connection works great to an external OC. What’s the easiest way to get hass_service_call to work? I would rather not install a third-party skill at this point.
I’m amused by Gerri (OpenClaw) trying to get Jarvis (Assist) to turn on some lights. But I’m still unclear on whether I need to install one of the HA OpenClaw plugins if I want to do tool calling from OpenClaw. I haven’t decided yet how much access I want to give OC to HA, but I do want HA to keep OC up to date on some status info.
You don’t need the addon if you already have OpenClaw running elsewhere. Just enable tool calls in the integration’s settings; that unlocks the integration’s service-call ability.
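For a sense of what tool-call execution involves, an assistant reply in the OpenAI-style API can carry tool calls whose arguments arrive as a JSON string. This sketch maps one onto Home Assistant's usual `domain.service` convention; the tool-call shape follows the standard OpenAI chat API, and the mapping is an assumption about how such a bridge could work, not the integration's actual implementation.

```python
import json

def tool_call_to_service(tool_call: dict):
    """Split an OpenAI-style tool call into HA service-call fields:
    (domain, service, data). Arguments arrive JSON-encoded."""
    name = tool_call["function"]["name"]            # e.g. "light.turn_on"
    args = json.loads(tool_call["function"]["arguments"])
    domain, service = name.split(".", 1)
    return domain, service, args

call = {"function": {"name": "light.turn_on",
                     "arguments": '{"entity_id": "light.kitchen"}'}}
print(tool_call_to_service(call))
```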
Question: where did you store the token and the IP address of the Home Assistant installation? Every time I check off and set things up, it comes back and says please provide the token and the IP address. Any help would be greatly appreciated, thank you.