Mycroft AI

So Mycroft Core was released today: https://mycroft.ai/introducing-mycroft-core/. For anyone who doesn’t know, Mycroft is a speech-based intelligent AI, similar to Amazon’s Alexa, only open source.

I downloaded and built the Alpha on Ubuntu and although a little rough around the edges, it works great. I can even talk to it from a couple of meters away, just with my rubbish laptop mic.

It would be great to have a Home Assistant skill for this. I’m going to have a deeper look into the SDK later today and see what I can come up with. If anyone wants to help out I’d be very grateful since my time is very limited.

Thanks for sharing this. I was thinking of doing the same using Jasper. Does Mycroft rely on a connection to an external service to work, or is everything self-contained within our own local network?

Currently the alpha uses the Google Speech API, but keyword recognition is done locally, so only the remainder of the query is sent to the cloud. eSpeak is used for TTS. They are working on their own versions of these components with their Mimic TTS system and the OpenSTT initiative, but those are not ready yet. Eventually, you should be able to run an OpenSTT server locally.
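As a rough text-level analogy of that split (the real pipeline works on audio, and all the names here are mine, not Mycroft's API): the wake word is matched locally, and only what follows it would be shipped to the cloud STT backend.

```python
# Illustrative sketch only: Mycroft's real wake-word spotting runs on audio,
# but the privacy property is the same - nothing spoken before or without the
# wake word leaves the device. Names here are hypothetical.
from typing import Optional

WAKE_WORD = "hey mycroft"

def extract_query(transcript: str, wake_word: str = WAKE_WORD) -> Optional[str]:
    """Return the query following the wake word, or None if it wasn't heard."""
    lowered = transcript.lower()
    idx = lowered.find(wake_word)
    if idx == -1:
        return None  # no wake word: nothing is sent anywhere
    return transcript[idx + len(wake_word):].strip()
```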

They also have a login service, which seems to exist just to manage the API keys for the other services they use and make life easy for the user. You can use your own keys, but then you have to generate each one yourself.

If you want some more information the latest Linux Luddites has an interview with one of the co-founders: https://linuxluddites.com/shows/episode-78/

Well, here’s a start: https://gitlab.com/robconnolly/mycroft-home-assistant

Currently, this is able to turn my lights on and off, but it’s hard-coded to interface only with lights so far. To configure it, you need to add a snippet to your ~/.mycroft/mycroft.ini file:

[HomeAssistantSkill]
host = <HA hostname>
password = <API password>

Then start up the skill as per the directions here: https://docs.mycroft.ai/development/skills-framework
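For reference, the underlying call the skill has to make is just Home Assistant's REST service API. Here's a hedged sketch (not the skill's actual code) of building such a request, using the legacy `x-ha-access` password header that matches the config above:

```python
# Hypothetical sketch of the REST call behind "turn on the lights". The URL
# shape (/api/services/<domain>/<service>) and the legacy x-ha-access password
# header are Home Assistant's API of this era; the function name is mine.

def build_service_request(host, password, domain, service, entity_id):
    """Build (url, headers, json_payload) for a HASS service call."""
    url = "http://{}:8123/api/services/{}/{}".format(host, domain, service)
    headers = {"x-ha-access": password}      # legacy API-password auth
    payload = {"entity_id": entity_id}
    return url, headers, payload

# Sending it would then be e.g.:
#   url, headers, payload = build_service_request(
#       "hass.local", "secret", "light", "turn_on", "light.kitchen")
#   requests.post(url, headers=headers, json=payload)
```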

Thought I’d just revive this post, since I’ve done some more work on the Home Assistant Skill and released version 1.0.0. The new version has been updated to work with the latest version of Mycroft and features fuzzy matching to HASS entity names. Please check it out at: https://gitlab.com/robconnolly/mycroft-home-assistant - bug reports and contributions are welcome.
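For the curious, that kind of fuzzy matching of spoken names to entities can be sketched with the standard library's difflib (the skill itself may use a different matcher; this just shows the idea):

```python
# Sketch of fuzzy-matching a spoken friendly name to a HASS entity id using
# difflib from the standard library. The real skill's matcher may differ.
import difflib

def match_entity(spoken_name, entities):
    """entities: {friendly_name: entity_id}. Return best entity_id or None."""
    names = {name.lower(): entity_id for name, entity_id in entities.items()}
    hits = difflib.get_close_matches(spoken_name.lower(), names.keys(),
                                     n=1, cutoff=0.6)
    return names[hits[0]] if hits else None
```

This tolerates small mismatches like plurals ("kitchen lights" still resolves to "Kitchen Light") while rejecting unrelated phrases.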

Good job, and thanks for sharing this. I’m wondering: can HA send notifications to Mycroft so that it can read out the text using the same TTS engine?

I was already thinking about this. I think an “Announcements Skill” exposing a TTS API, along with a supporting TTS component in HA, would do the trick. I’ll look into it when I have time.
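One possible shape for it (purely hypothetical, since nothing like this exists yet): the skill listens for a small JSON payload, say {"message": "..."}, POSTed by a matching HA notify platform, and hands the text to Mycroft's TTS via the skill's speak() method.

```python
# Hypothetical sketch of the announcement path. The payload format and the
# helper below are my inventions; only MycroftSkill.speak() is real.
import json

def parse_announcement(body):
    """Extract the text to speak from an incoming notification payload."""
    data = json.loads(body)
    message = data.get("message", "").strip()
    if not message:
        raise ValueError("empty announcement")
    return message

# Inside the skill's request handler, roughly:
#   text = parse_announcement(request_body)
#   self.speak(text)  # drives whatever TTS engine Mycroft is configured with
```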

Thank you in advance. Now my dream of having Jarvis in my home is getting nearer.

You’re welcome.

Looking at the APIs, integration with Alexa flash briefing sources (of which HASS is one) looks pretty doable too.

You mean this one? Set Up News and Flash Briefings for Alexa - Amazon Customer Service

Will Alexa automatically speak out the input from HA without the need to say “What’s my Flash Briefing?”?

I am curious because I am getting an Echo Dot soon.

I was actually looking at this, which HASS implements for its flash briefing support. I just need to add support for it to Mycroft.
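Consuming it from Mycroft's side could look roughly like this, a sketch under the assumption that the feed is the usual flash-briefing JSON (a list of items carrying "titleText"/"mainText" fields); fetching the feed and speaking the result are left out:

```python
# Sketch: turn a flash-briefing-style JSON feed into a list of utterances
# that Mycroft could then speak one by one. Function name is mine.
import json

def briefing_to_utterances(feed_json):
    items = json.loads(feed_json)
    if isinstance(items, dict):  # a single-item feed may be a bare object
        items = [items]
    utterances = []
    for item in items:
        text = item.get("mainText") or item.get("titleText") or ""
        if text:
            utterances.append(text)
    return utterances
```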

As far as I am aware: no, you can’t currently do that with Alexa/Echo.

There is no reason why we can’t do both with Mycroft, it’s just a matter of implementing the required skills.

To be honest, Mycroft is my preferred choice of AI. However, I have some issues with my installation that need to be fixed before it is truly usable to me.

  1. The default TTS engine, Mimic, is not as good as Google TTS. I was unable to change the TTS engine to Google’s.
  2. The time it takes to respond to my query is a bit too long; I hope it can be reduced. Perhaps installing the STT engine locally would help.
  3. There is no indication that it has captured the wake word and is waiting for my query. A simple beep would help me know that it is listening. Right now, I have no idea whether it has captured my wake word or not.

I will try to find a solution on the Mycroft forum, but that forum seems to be quieter than the one we have here.

In answer to your questions:

  1. I know you can change the TTS engine, but not sure if Google TTS is supported. The Mycroft site appears to be down right now so I can’t look at the docs.
  2. A local STT engine would certainly help. OpenSTT is their effort to do this, but it hasn’t progressed very far yet. There is currently an open pull request to use KaldiSTT (https://github.com/MycroftAI/mycroft-core/pull/440), which I am going to try once it is merged. Also, using Google TTS as suggested above would slow things down further, since it is cloud based.
  3. Yes, I agree that is a problem. I found this pull request open for it: https://github.com/MycroftAI/mycroft-core/pull/472

Yeah, unfortunately the Mycroft community is much smaller than the HA one. This makes sense since HA is very large in scope. I also get the feeling (as an outsider) that development is being hampered by the Mycroft team trying to ship the physical Mycroft devices to their crowdfunding backers. Perhaps once this is done we will see a lot more development on the software.

I just got my hands on an Echo Dot and I am truly impressed with it. I can see it is years ahead of Mycroft. The voice sounds so natural, the LED ring lights up and follows my voice direction when it catches the wake word and flashes while it is processing my queries, the response time is very fast, and it almost feels like I am talking to a real person.

It seems Mycroft has a lot of catching up to do.

However, privacy issues aside, I wish I could customize Alexa’s voice and wake word. These are the appeals Mycroft has over Alexa.

The main appeal is that I can customize Mycroft skills the way I want them and add, for example, new keywords to trigger things in HASS. I’m thinking that would be harder to do with Alexa, but I could be mistaken?
There is now a HASS skill for Mycroft; it doesn’t yet have thermostat or cover support, but the developer says he will add it soon.

Here are the community skills (I also developed a few): https://github.com/MycroftAI/mycroft-skills

There is the possibility of adding Snowboy for the wake word; it works fast and accurately. Speech-to-text is of course done via Google (but it’s anonymized through Mycroft’s servers), and for text-to-speech it’s nice that it’s local (Mimic, it’s called). Overall it’s not bad. I think it would be more complicated for me to integrate Alexa with Mopidy, etc.