Hi all,
Not sure where this question fits, but here goes. I'm a former user of Mycroft, the offline personal assistant that worked like Alexa/Google Home (I also built custom bits and enclosure models for it). I'm currently setting up an all-in-one HA device on an RPi5 with a 7" LCD, and the goal is similar hub functionality: timers, HA control, and a fallback that relays unknown commands to a selectable API (SearXNG, Google, GPT, etc.).
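For the fallback piece, the rough idea is: send the transcript to HA's /api/conversation/process REST endpoint first, and only relay the raw text to the selected backend when HA can't handle it. A minimal sketch of that, assuming a long-lived access token, that the conversation response exposes a "response_type" of "error" on a miss, and that the SearXNG instance has its JSON output format enabled (the hostnames and token are placeholders):

```python
"""Sketch of the 'relay unknown commands' fallback.

Assumptions: the conversation response JSON reports response_type == "error"
when no intent matches, and SearXNG is configured to allow format=json.
"""
import requests

HA_URL = "http://homeassistant.local:8123"   # placeholder host
HA_TOKEN = "LONG_LIVED_ACCESS_TOKEN"         # placeholder token
SEARX_URL = "https://searx.example/search"   # placeholder fallback API


def handle_utterance(text: str) -> str:
    """Try HA's conversation API first, fall back to search on a miss."""
    resp = requests.post(
        f"{HA_URL}/api/conversation/process",
        headers={"Authorization": f"Bearer {HA_TOKEN}"},
        json={"text": text, "language": "en"},
        timeout=10,
    ).json()

    response = resp.get("response", {})
    if response.get("response_type") != "error":
        # HA handled it (timer, light, etc.) -- return its spoken answer
        return response.get("speech", {}).get("plain", {}).get("speech", "")

    # Unknown command: relay the raw text to the selected backend
    search = requests.get(
        SEARX_URL,
        params={"q": text, "format": "json"},
        timeout=10,
    ).json()
    results = search.get("results", [])
    return results[0]["content"] if results else "No results found."


if __name__ == "__main__":
    print(handle_utterance("turn on the kitchen lights"))
    print(handle_utterance("who invented the telephone"))
```

Swapping the fallback would just mean replacing the SearXNG call with whichever API is selected (Google, GPT, etc.).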
Since I last used Mycroft, HA voice has expanded a TON. That said, the feature set is heavily focused on voice itself rather than on a full assistant interface (timers, visual weather, etc.). I'm considering two paths:
1. Use integrations with HA and something like OpenVoiceOS; or
2. Use HA voice and build a super-custom dashboard that handles timers and pops up controls for devices when their settings are changed via voice.
For the latter, I already have my HAOS dev environment compiling and a method to add a WebKit frontend with Buildroot for the LCD.
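For the pop-up part of the second path, my current thinking is to hang the dashboard off HA's WebSocket API and watch for state changes, then tell the WebKit frontend to show a card for whatever entity just moved. A minimal sketch, assuming a long-lived access token and the standard ws://host:8123/api/websocket endpoint (deciding whether a change actually came from voice isn't handled here):

```python
"""Sketch: subscribe to state_changed events over HA's WebSocket API so the
dashboard can pop up a control card for the entity that just changed.
Host and token are placeholders."""
import asyncio
import json

import websockets

HA_WS = "ws://homeassistant.local:8123/api/websocket"  # placeholder host
HA_TOKEN = "LONG_LIVED_ACCESS_TOKEN"                   # placeholder token


async def watch_state_changes() -> None:
    async with websockets.connect(HA_WS) as ws:
        await ws.recv()  # server sends auth_required
        await ws.send(json.dumps({"type": "auth", "access_token": HA_TOKEN}))
        await ws.recv()  # auth_ok (or auth_invalid)

        await ws.send(json.dumps({
            "id": 1,
            "type": "subscribe_events",
            "event_type": "state_changed",
        }))

        async for raw in ws:
            msg = json.loads(raw)
            if msg.get("type") != "event":
                continue
            data = msg["event"]["data"]
            entity_id = data["entity_id"]
            new_state = data["new_state"]["state"] if data["new_state"] else None
            # Here the WebKit frontend would be told to pop up a card for
            # entity_id -- for now just log it.
            print(f"{entity_id} -> {new_state}")


if __name__ == "__main__":
    asyncio.run(watch_state_changes())
```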
Thoughts on direction?