We want Android Wear support! It would be awesome to interact with Home Assistant through an Android Wear device. What do you guys think?
+1 on it!
What would you expose to Android Wear, though? Wink has AW support, and the only thing they actually have on it is shortcuts. Really, anything else wouldn’t make sense; you’d be swiping all day just to read a sensor. I could see this for scenes and scripts, maybe.
Maybe shortcuts, scenes, notifications, and voice commands…
Voice commands on Wear are pretty limited: you can only start a specific activity, nothing more. So it would be at least two steps: “OK Google, start Home-Assistant”, followed by the actual instruction once HA is started on the watch. Not really convenient, sadly… Maybe check AutoVoice and Tasker?
I’m not really a fan of having 36 apps just to do a simple task… I just want it to work natively and transparently.
I’m attempting to accomplish this via HTML5 notifications. It lends itself well to the Android Wear form factor. Automation or script sends notification with 2 actions + received and dismiss callbacks.
Awesome! Can you make a short tutorial for us?
I’m in the same boat. Once we get something working we can share it here.
On my Android Wear device, I have the Bubble Launcher (https://play.google.com/store/apps/details?id=dyna.logix.bookmarkbubbles), which provides a convenient way to launch Tasker tasks, where the bubbles are just a swipe away.
Using the Minimal & Elegant Watch Face (https://play.google.com/store/apps/details?id=com.stmp.minimalface) you can also launch tasks directly from your watchface.
I just bought a watch, and all I wanna do is be able to open my garage from it so I don’t have to carry a garage door opener with me on my motorcycle
Well, you could do that with IFTTT’s Do button and a Maker call to HA’s API.
For that matter, if your opener is supported or has its own API, you could even go direct and bypass HA altogether.
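For anyone trying this: the Maker channel just fires an HTTP request, so the Do button can POST straight at HA’s REST API. A rough sketch of the fields (the host, password, and entity_id below are placeholders — substitute your own, and note that depending on your HA version the garage door may live under the `cover` or the older `garage_door` domain):

```yaml
# Hypothetical IFTTT Maker action fields for opening a garage door:
#
# URL:          https://YOUR_HA_HOST:8123/api/services/cover/open_cover
# Method:       POST
# Content-Type: application/json
# Body:         {"entity_id": "cover.garage_door"}
#
# Authentication goes in the x-ha-access header (or as an
# api_password query parameter), depending on your HA version.
```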
That was a great idea! It totally works with a Do button. I’ve got a LiftMaster MyQ, and they haven’t exactly embraced openness yet, but it does work with the MyQ component that was built, and I was able to make open/close Do buttons.
Glad to help!
Hmm…don’t know if I’d be any good at writing a tutorial, but here is an example of one of the scripts, abridged.
```yaml
alias: Door Knock
trigger:
  platform: mqtt
  topic: 'smartthings/Front Door/acceleration'
  payload: 'active'
condition:
  - condition: state
    entity_id: sensor.front_door
    state: 'closed'
    for:
      minutes: 3
action:
  - alias: push notify of front door knock
    service: notify.html5
    data_template:
      message: 'We believe there is someone at the front door.'
      title: 'Front Door Knock'
      data:
        icon: '/api/camera_proxy/camera.entryway?token=1808669136'
        actions:
          - action: quick_unlock
            icon: '/local/icons/lock-open.png'
            title: 'Unlock'
          - action: spooky_response
            icon: '/local/icons/bullhorn.png'
            title: 'Dogs'
```
The action quick_unlock pops:
```yaml
alias: Quick Unlock
trigger:
  platform: event
  event_type: html5_notification.clicked
  event_data:
    action: quick_unlock
action:
  service: script.turn_on
  entity_id: script.open_sesame
```
End result is that when there’s a knock on the door, I get a notification with a snapshot of the front door, and an option to either unlock it, or have speakers blare barking dogs.
The open_sesame script just does some security checks and verifications, then calls lock.unlock on the target entity_id.
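For reference, a minimal sketch of what such a script could look like — the presence check and entity names here are made up for illustration, not the poster’s actual checks:

```yaml
# Hypothetical open_sesame script: refuse to unlock unless a
# tracked device is home, then call lock.unlock.
open_sesame:
  alias: Open Sesame
  sequence:
    # Example safety check (device_tracker.owner_phone is a
    # placeholder entity -- use your own conditions here).
    - condition: state
      entity_id: device_tracker.owner_phone
      state: 'home'
    - service: lock.unlock
      data:
        entity_id: lock.front_door
```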
So there is the option to use MacroDroid and make HTTP requests, or maybe even MQTT toggles, to HASS. There is also a very, very awesome “Home Companion” app for Wear, which supports OpenHAB, FHEM, and pilight, but I’m sure some bright spark here could emulate those protocols to use it with HASS.
See here (it has toggles for buttons, toggles for thermostats and motorized shutters and stuff, and also “places” for sensor values): https://play.google.com/store/apps/details?id=de.stefanheintz.smarthome
I’m currently emulating it in Node-RED, but it’s very clunky, so I would need a proper programmer to do it :).
And for the ones using it for a garage door, there is the “Cancello” app on GitHub for Android Wear, which is intended to be a “button”.
I can post approximately how the JSON needs to look (though maybe some fields in there are redundant) for it to be displayed in the watch app, plus the Node-RED flow, so you guys can see how it works behind the scenes with sending it to the watch…
Here is the source code of the android + wear app, if someone can shoehorn home-assistant support into it: https://bitbucket.org/schdef/home-companion/overview
I will update this post later with the json.
I forgot to post the JSON and stuff; will post it today, hopefully :). Got an idea, though: is there an abstraction layer available between HASS and OpenHAB/FHEM? If that existed, we wouldn’t need to change stuff inside Node-RED or inside the Android app to support it. Any Android dev here who would be willing to add support for HASS in that Android app?
So basically the Node-RED flow looks like this: it uses an http in node (GET request) and an http response node, I think; that “original” node is a template node which injects into msg.payload, and the json node converts it to JSON, I believe:
Here is how the code inside the template node looks:
A lot of those fields are redundant, and some of them are not getting displayed on the watch. I’d need to review which of those do get displayed, so we know which points in the JSON we need to attach our sensor values and buttons/switches to…