Companion app for iOS 2024.5: Let me Assist you šŸŽ

Itā€™s in progress. One downside is that, for now, I couldnā€™t find a way to stream audio samples to the HA server in real time, so it has to send the full audio clip, which makes it slower. Keep an eye on the next releases.


2024.05 is still broken on my iPad. Iā€™ve factory reset the iPad and restored it to iPadOS 17.5.1 (iPad 6, Wi-Fi). I have no other apps running on it and never had issues before this update.

When it has been sitting for a while and I tap a new dashboard page, it hangs and doesnā€™t fully render the page. I either have to wait 30ā€“45 seconds or force quit and re-open. When I force quit and re-open, everything works as expected and I can navigate through pages, but the same issue recurs after it has sat for a while.

Iā€™ve taken a video of what happens, which you can see here: PXL_20240530_000001942.mp4 - Google Drive

Instead of sending the audio from a watch to Home Assistant, could you use Siriā€™s speech-to-text, which works great for me on the Ultra 2 watch, and then just send the text to HA? For me, Whisperā€™s speech-to-text is a lot worse than Siriā€™s.

Hello, I am experiencing the following issue in the iOS app when editing the dashboard. Both on my home network and on the mobile network, I get this error (see video). I am using the Nabu Casa cloud link for the external connection. Hopefully you can resolve this?

I want to use an alternative ā€œpipelineā€ or ā€œconversation agentā€ that is powered by OpenAI. Currently this does not seem possible with iOS.

Using Shortcuts and Siri with ā€œHey Siri, Assistā€, it always seems to use the default HASS agent and not the new pipeline/agent I created in HASS that uses OpenAI instead of the default, even if I delete the original agent and pipeline and set the new one as default. There is no way to select which pipeline to run in the shortcut automationā€™s ā€œAssistā€ block. The shortcut for ā€œAssist in Appā€ has that option, but it has no option for direct input from Siri; that shortcut also throws an error when Assist in App is opened via automations or Siri without an input.

My iOS widget on my Lock Screen opens my HA test system and not my prod system. How can I tell the widget to use the prod system?

Did you end up solving this? I have the same problem and thought I was going insane, but it seems the Assist shortcut is forced to the built-in Home Assistant assistant. Maybe this is to push people towards the cloud subscription, which is fine if so, but something should notify us that this is the case. Iā€™ve wasted hours.

EDIT:

OK, I started a trial of HA Cloud and itā€™s the same. There is no way for me to just say ā€œHey Siri, Assistā€, then speak, and have OpenAI sort it out, controlling devices or telling me about the configuration of something I ask about. It just drops back to ā€œdumbā€ mode. For example, if I say ā€œWhat is the coldest room in the house?ā€, it rambles on about not knowing any device called ā€œis the coldest room in the houseā€. However, if I load up HA via Lovelace and type that in directly, it tells me.

If I use Assist to turn a light on and off, it can manage that, but thatā€™s it. It canā€™t be that hard to speak to an iOS device and have HA, via OpenAI, handle it. All the parts are there.

I have not played with this feature yet, so Iā€™m just shooting off the cuff, but if you long-press the widget and go to widget settings, you can choose a pipeline. I wonder if that would help your issue. Again, this is a complete guess and itā€™s probably not related at all.

Hi, thanks for the reply. What widget do you mean? I canā€™t see any pipeline setting on either the Assist shortcut or the Assist Button. Assist in App has it, but that opens the actual app and then you have to speak. The problem is that if you say ā€œHey Siri, Assist in Appā€, it opens the app all right but then errors out immediately.

If Assist and Assist Button just did what they do now but allowed you to specify a pipeline, thatā€™d be great. I have a feeling that Assist and Assist Button only work with sentence matching, i.e. ā€œdumbā€ speech.

I tried setting up a webhook to fire off a conversation, and while that part works, you canā€™t send the response back to the shortcut for Siri to say.
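For anyone poking at the same thing, hereā€™s a rough Python sketch of the round trip I mean, using HAā€™s REST conversation endpoint. The URL, token, and agent id are placeholders, and passing agent_id to pick a non-default agent is an assumption on my part:

```python
import requests

HA_URL = "https://example.ui.nabu.casa"  # placeholder: your HA / Nabu Casa URL
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"   # placeholder: create one in your HA profile

def ask_assist(text: str, agent_id: str | None = None) -> str:
    """Send a sentence to HA's conversation endpoint and return the reply text."""
    payload = {"text": text, "language": "en"}
    if agent_id is not None:
        # Assumption: this selects a specific (e.g. OpenAI-backed) agent;
        # leave it out to use whatever the default agent is.
        payload["agent_id"] = agent_id
    r = requests.post(
        f"{HA_URL}/api/conversation/process",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json=payload,
        timeout=30,
    )
    r.raise_for_status()
    # The spoken reply is nested under response.speech.plain.speech
    return r.json()["response"]["speech"]["plain"]["speech"]

print(ask_assist("What is the coldest room in the house?"))
```

In principle a Shortcut could do the same with a ā€œGet Contents of URLā€ action (POST with a JSON body and an Authorization header), pull the reply text out of the returned JSON, and hand it to a ā€œSpeak Textā€ action, though I havenā€™t verified that end to end.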

I am quite astonished that this type of feature is so badly supported. Everyone who talks about it online has a different ā€œsolutionā€, and to date none of them work, or they cover only one half: getting speech into the conversation but not getting anything back out.

Even the documented TTS notification doesnā€™t work. The HA app just pops up a notification with ā€œTTSā€ as the text, when the docs say it should parse that as a signal that another field (tts_text) is to be spoken aloud.
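For clarity, this is the shape of the call I mean, as a Python sketch against HAā€™s REST service API. The URL, token, and notify target are placeholders; the message: ā€œTTSā€ plus tts_text format is the one described in the companion app docs, and on my iPhone it arrives as a literal notification instead of being spoken:

```python
import requests

HA_URL = "http://homeassistant.local:8123"  # placeholder: your HA URL
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"      # placeholder

# The documented TTS-notification format: the special message "TTS" plus a
# tts_text field meant to be read aloud. On my iOS device this shows up as a
# plain notification that literally says "TTS".
resp = requests.post(
    f"{HA_URL}/api/services/notify/mobile_app_my_iphone",  # hypothetical notify target
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "message": "TTS",
        "data": {"tts_text": "The garage door has been left open."},
    },
    timeout=10,
)
resp.raise_for_status()
```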