So Mycroft Core was released today: https://mycroft.ai/introducing-mycroft-core/. For anyone that doesn’t know, Mycroft is a speech-based AI assistant, similar to Amazon’s Alexa, only Open Source.
I downloaded and built the Alpha on Ubuntu and although a little rough around the edges, it works great. I can even talk to it from a couple of meters away, just with my rubbish laptop mic.
It would be great to have a Home Assistant skill for this. I’m going to have a deeper look into the SDK later today and see what I can come up with. If anyone wants to help out I’d be very grateful since my time is very limited.
Thanks for sharing this. I was thinking of doing the same using Jasper. Does Mycroft rely on a connection to an external service to work, or is everything self-contained in our own local network?
Currently the alpha uses the Google speech API, but keyword recognition is done locally, so only the remainder of the query is sent to the cloud. eSpeak is used for TTS. They are working on their own versions of these components with their Mimic TTS system and the OpenSTT initiative, but they are not ready yet. Eventually, you should be able to run an OpenSTT server locally.
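The split described above (local keyword spotting, with only the rest of the utterance going to the cloud) can be sketched in a few lines. All of the function names here are hypothetical stand-ins, not Mycroft APIs:

```python
# Sketch of the local-wake-word / cloud-STT split. Everything here is
# a hypothetical stand-in for illustration, not Mycroft's actual code.

WAKE_WORD = "hey mycroft"

def local_keyword_spotter(chunk):
    """Runs entirely on-device; only answers yes/no for each chunk."""
    return WAKE_WORD in chunk.lower()

def cloud_stt(chunks):
    """Placeholder for the cloud call (e.g. the Google speech API).
    Only audio captured *after* the wake word ever reaches it."""
    return " ".join(chunks)

def listen(audio_chunks):
    """Stay local until the wake word fires, then forward only the
    remainder of the utterance to the cloud for transcription."""
    awake = False
    query = []
    for chunk in audio_chunks:
        if not awake:
            awake = local_keyword_spotter(chunk)  # nothing leaves the device yet
        else:
            query.append(chunk)
    return cloud_stt(query) if awake else None

print(listen(["some chatter", "hey mycroft", "turn on", "the lights"]))
# -> turn on the lights
```

The nice property of this design is that everything before the wake word never leaves your machine, which is why only the query itself touches the cloud.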
They also have a login service, which seems to be there just to manage the API keys for the other services they use and make life easy for the user. You can use your own keys, but then you have to generate each one yourself.
Currently, this is able to turn my lights on and off, but it’s hard-coded to interface only with lights so far. To configure it you need to add a snippet to your ~/.mycroft/mycroft.ini file:
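Something along these lines, where the section name and keys are my assumptions rather than the skill’s documented settings — check the skill’s README for the exact ones your version expects:

```ini
; Example only - section and key names are assumptions,
; substitute your own HASS host and API password.
[home_assistant]
host = 192.168.1.10
password = YOUR_HASS_API_PASSWORD
```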
Thought I’d just revive this post, since I’ve done some more work on the Home Assistant Skill and released version 1.0.0. The new version has been updated to work with the latest version of Mycroft and features fuzzy matching to HASS entity names. Please check it out at: https://gitlab.com/robconnolly/mycroft-home-assistant - bug reports and contributions are welcome.
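The fuzzy matching to HASS entity names can be sketched with nothing but the standard library. `difflib` here is my stand-in for whatever matcher the skill actually uses, and the entity names are made up:

```python
from difflib import SequenceMatcher

def fuzzy_match_entity(spoken, entities, threshold=0.6):
    """Return the entity_id whose friendly name best matches the spoken
    phrase, or None if nothing clears the threshold. difflib is a
    stand-in; the real skill may use a different fuzzy matcher."""
    best, best_score = None, threshold
    for entity_id, friendly_name in entities.items():
        score = SequenceMatcher(None, spoken.lower(), friendly_name.lower()).ratio()
        if score >= best_score:
            best, best_score = entity_id, score
    return best

# Hypothetical entity map, as HASS would report it
entities = {
    "light.living_room": "Living Room Light",
    "light.bedroom": "Bedroom Light",
    "switch.kettle": "Kettle",
}
print(fuzzy_match_entity("living room lights", entities))
# -> light.living_room
```

The threshold is the important knob: too low and “lights” matches everything, too high and small speech-recognition slips (“light” vs. “lights”) stop matching at all.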
I was already thinking about this. I think an “Announcements Skill” which exposes a TTS API, along with a supporting TTS component in HA, would do the trick. I’ll look into it when I have time.
To be honest, Mycroft is my preferred choice of AI. However, I have some issues with my installation that need to be fixed before it is truly usable to me.
The default TTS engine, Mimic, is not as good as Google TTS. I was unable to change the TTS engine to Google.
The time it takes to respond to my query is a bit too long. I hope it can be reduced; perhaps installing the STT engine locally will help.
There is no indication that it has captured the wake word and is waiting for my query. A simple beep would help me know that it is listening. Right now, I have no idea whether it has captured my wake word and is waiting for my query or not.
I will try to find a solution on the Mycroft forum, but that forum seems to be much quieter than the one we have here.
I know you can change the TTS engine, but not sure if Google TTS is supported. The Mycroft site appears to be down right now so I can’t look at the docs.
A local STT engine would certainly help. OpenSTT is their effort to do this, but hasn’t progressed very far as yet. There is currently an open pull request to use KaldiSTT (https://github.com/MycroftAI/mycroft-core/pull/440), which I am going to try once merged. Also, using Google TTS as suggested above will slow things down more, since this is cloud based.
Yeah, unfortunately the Mycroft community is much smaller than the HA one. This makes sense since HA is very large in scope. I also get the feeling (as an outsider) that development is being hampered by the Mycroft team trying to ship the physical Mycroft devices to their crowdfunding backers. Perhaps once this is done we will see a lot more development on the software.
I just got my hands on an Echo Dot and I am truly impressed with it. I can see it is years ahead of Mycroft. The voice sounds so natural, the LED ring lights up and follows my voice direction when it catches the wake word and flashes when it is processing my queries, the response time is very fast, and it almost feels like I am talking to a real person.
It seems Mycroft has a lot of catching up to do.
However, apart from the privacy issue, I wish I could customize Alexa’s voice and the wake word. These are the appeals Mycroft has over Alexa.
The main appeal is that I can customize Mycroft skills the way I want them, and add, for example, new keywords to trigger stuff in HASS. I’m thinking that with Alexa that is harder to do, but I could be mistaken?
There is now a HASS skill for Mycroft. It doesn’t yet have thermostat and cover support, but the developer says he will add it soon.
There is the possibility of adding Snowboy for the wake word, which works very fast and is accurate. Speech-to-text is of course done via Google (but it’s anonymized through Mycroft’s servers), and for text-to-speech it’s nice that it’s local (it’s called Mimic). Overall it’s not bad; I think it would be more complicated for me to integrate Alexa with Mopidy, etc.