From A to Snips - HA on Hass.io + external Snips guide

Hi all,

I just implemented Snips successfully and want to share :slight_smile:

What is Snips?

Snips is a voice assistant that can run locally on a Raspberry Pi 3 (and maybe other systems). This way Google will never know when you switch your light on :wink:

Find out more here: http://snips.ai

Snips is also what moved me here from OpenHAB. I never knew about Home Assistant. Then I tried to integrate Snips with OpenHAB and found that most Snips packages in the German store are for a thing called “Home Assistant”. I thought if there are so many plugins here for this “Home Assistant” thing, it might be worth taking a look :smiley:

Why this guide?

There are a lot of good instructions on the net and even in this forum, but a beginner can easily get confused (at least that was the case for me): many guides cover only a part of the setup, or leave you with very basic intent scripts that are not very useful to newbies. Here, I want to show all the steps. I will still often link to other how-tos; the focus of this guide is on the integration between Snips and HA and on having them actually do stuff for you :slight_smile:

The main thing I learned is: once you understand how the individual components work together and how the intent scripts work, using Snips is really easy and the quality of the assistant is amazing!

Assumptions & requirements

I am using Home Assistant on Hass.io so some instructions will only work there (like MQTT installation). If you are running on a different installation, these steps might be different for you.

My Hass.io Home Assistant runs on a Raspberry Pi 4 with 4 GB (it doesn’t really matter which one you run it on here, I think).

I tried to install Snips on a Pi 4 before but failed: Debian Buster has some missing dependencies and Debian Stretch won’t run on a Pi 4. Then I tried to install Hass.io on that second Pi 4 and Snips as an add-on, but that brought me many errors, too. I then bought a Raspberry Pi 3 B+ to run Snips.

So far I have implemented the control of covers and Hue scenes in some rooms. That is quite incomplete regarding the variety of intents, but it is working and you will find everything you need to go on by yourself.

As a microphone I use a Playstation Eye USB camera. You can get them very cheap on eBay and they do a good job, out of the box, without additional drivers.

For sound feedback you will need a speaker on the 3.5 mm jack; I am using headphones right now.

Preparations

We will need an MQTT broker. HA has an integrated one and so does Snips, but I wanted to do it right and install a separate one on my Hass.io:

1) Create a user in Home Assistant, called mqtt_login.
2) Go to the Hass.io add-ons, then install and start the MQTT broker.
3) Add the following to your configuration.yaml:

mqtt:
  broker: localhost
  port: 1883
  username: <mqtt user>
  password: <mqtt password>

snips:

and restart Home Assistant.

(I don’t think configuration.yaml is the right place for plain-text passwords; moving them to secrets.yaml should fix that, see below.)
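For reference, a minimal sketch of how that could look with secrets.yaml (the file name is the Home Assistant default; the values below are placeholders):

# secrets.yaml (lives next to configuration.yaml)
mqtt_username: mqtt_login
mqtt_password: your_mqtt_password

# configuration.yaml then references the secrets instead of plain text
mqtt:
  broker: localhost
  port: 1883
  username: !secret mqtt_username
  password: !secret mqtt_password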

4) Install Stretch and Snips on the Raspberry Pi 3. There are good guides on the Snips website; one of them describes the manual installation, the other one the nice SAM way, but the latter is missing instructions for installing npm. I took that part from another how-to where somebody integrated Snips with openHAB 2. If you prefer a German guide, you can follow that guide until the installation of Snips is finished (stop before installing the assistant).

For English instructions: follow this guide until Stretch is installed, then install Node.js and npm with

sudo apt-get update
sudo apt-get upgrade
curl -sL https://deb.nodesource.com/setup_10.x | sudo -E bash -
sudo apt-get install -y nodejs

and then continue with Step 3 of the guide.

Note: I installed npm on the Raspberry Pi (as in the previous steps) and used it from there, and therefore ignored this sentence in the guide: “For the remainder of this guide, all commands are run from your computer, not on the Raspberry Pi.”

Don’t forget to plug your USB microphone and a speaker or headphones (3.5 mm jack) into the Pi 3 :slight_smile:

If you followed the English guide, you might find these commands from the German one useful for setting up your audio devices (but the Playstation Eye and a 3.5 mm speaker might work right away without these steps):

sam setup audio
sam test speaker
sam test microphone

5) Enable Snips to use an external MQTT broker (the one we installed on Hass.io):

Edit /etc/snips.toml on the Snips Raspberry Pi: uncomment the mqtt host and port lines as well as the username and password lines, point the host to your Hass.io machine, and enter your MQTT username and password.

Here is my MQTT config:

[snips-common]
bus = "mqtt"
mqtt = "<mqtt-host>:1883"
# audio = ["+@mqtt"]
# assistant = "/usr/share/snips/assistant"
# user_dir = "/var/lib/snips"

## MQTT authentication
mqtt_username = "<login>"
mqtt_password = "<password>"

6) Create an app and assistant

In most tutorials there comes a step where you install an assistant, and this is where my problems started. They mostly want you to install some package from the app store. But that package might not be available in your language (or not at all anymore), and then you are lost with the examples that follow.

I learned that this is absolutely unnecessary. The prepackaged stuff might be useful for more complex things, but to control Home Assistant you are better off creating your own app: you will know your intents without having to learn about someone else’s, and you will learn a lot about Snips. Plus, I found it somewhat odd that the existing apps asked me for the Hass host, a token, a dictionary etc. I don’t know why, because I get the intents perfectly through MQTT and can work with them without any of that. It is also better to create your own app because you can make the slots fit what your own scripts expect instead of fitting your scripts to someone else’s app.


So we are going to create our own app, intents and assistant now:

  • If not done yet, go to http://snips.ai and create yourself an account for the Console.

  • Create a new assistant, give it a name and select your language.

  • Click “Add an App” and “Create new App”.

  • Enter a name and click “Create”

  • As a start, I created intents to open covers (OeffneRolladen), close covers (SchliesseRolladen) and select a Hue scene (AktiviereSzene). We will later use very flexible intent scripts to deal with them.

I will later publish my very basic app to the store for reference. For now screenshots should work:

As you can see, I typed three sample sentences (more are better!) and created a slot whichCover of type default. I then marked the word I expect to end up in the “whichCover” slot and selected whichCover accordingly. This is very flexible: you can give Snips one sample sentence with the kitchen and one with the living room, and the first one will work with “living room”, too. Snips is somewhat intelligent here.

Don’t forget to click save!

NOTICE: I currently have a problem with German characters like ä, ö, ü. Kitchen is “Küche” in German and it seems that Home Assistant will not accept the ü in the slot value. When I change to a word without ü it instantly works again, but Snips always passes Küche as Küche and never as Kueche. I will work that out later.
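A possible workaround until I sort this out (untested): name the cover group without the umlaut, e.g. “Kueche”, and rewrite the slot value in the intent script with Jinja’s replace filter, for example

          entity_id: cover.{{ whichCover | replace('ü', 'ue') }}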


I did the same to open covers:

(It is possible to put open and close into one intent by altering the scripts we create later; this might happen in a later optimization step, but for now it’s sufficient.)

And I created one more to activate Hue scenes, this time with two slots:

All the intents:

The assistant:

Update on slot types: when adding your slots for rooms (or maybe scenes), just create a custom slot type (e.g. “rooms”). There you can enter all your rooms including synonyms. Then you don’t need to create example sentences for each room, because Snips can understand all the rooms in your custom slot. Plus, you can reuse this custom slot in the Hue scenes intent (where a group is mostly a room or area) or any other intent that requires room names; see the example below.
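To give you an idea, the values of such a custom slot type could look something like this (just example rooms from my setup; synonyms are optional):

rooms (custom slot type):
  Wohnzimmer        synonym: Stube
  Schlafzimmer
  Wohnzimmerfenster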

7) Install your assistant on the Snips device. This is very easy:

Click “Deploy Assistant” in the Snips console, copy the SAM command shown in the popup and run it on your Snips Raspberry Pi. It will do some ASR training and download the assistant.

You can now enter

sam watch

And then you can already say “Hey Snips”. You should hear a pling-plong sound and then you should be able to speak a command. It should be one of your example sentences; however, Snips is very good at detecting your intention, so you can speak quite naturally and it will most likely still match. If not, just add some more example sentences to the intent.

You should now see Snips analyzing your sentence, identifying the intent and populating the slots. Nothing will happen of course, as Home Assistant does not know what to do with it yet. We will change that in the next step.

8) Create intent scripts. I did this in a separate file. Here you can easily see how the Snips intents and slots are used in Home Assistant.

Credits: these scripts were inspired by this guy’s post on the Snips forum: https://forum.snips.ai/t/integrating-snips-with-home-assistant/211/3

configuration.yaml:

intent_script: !include intent_scripts.yaml

intent_scripts.yaml:

  SchliesseRolladen:
    speech:
      type: plain
      text: 'OK, {{ whichCover }} Rolladen werden geschlossen.'
    action:
      - service_template: cover.close_cover
        data_template:
          entity_id: cover.{{ whichCover }}
  OeffneRolladen:
    speech:
      type: plain
      text: 'OK, {{ whichCover }} Rolladen werden geoeffnet.'
    action:
      - service_template: cover.open_cover
        data_template:
          entity_id: cover.{{ whichCover }}
  AktiviereSzene:
    speech:
      type: plain
      text: 'OK Szene {{ whichScene }} wird aktiviert.'
    action:
      - service: hue.hue_activate_scene
        data_template:
          group_name: '{{ whichGroup }}'
          scene_name: '{{ whichScene }}'

I didn’t want all the ifs and elifs in that kind of code and wanted to make it easier. To make this work with the covers, we have to do a little trick with groups (attention: this goes into configuration.yaml, not groups.yaml, but of course you could move it to another file):

configuration.yaml:

cover:
  - platform: group
    name: 'Schlafzimmer'
    entities:
    - cover.oeq....
    - cover.oeq....
  - platform: group
    name: 'Wohnzimmer'
    entities:
      - cover.oeq.....
      - cover.oeq.....
      - cover.oeq....

As you can see, I simply give the group the name I expect to be passed in the slot from Snips. That way we can just append the slot in the intent script and there is no need to write a separate script for each cover. You can also create such a group for a single cover and call it e.g. “Wohnzimmerfenster” (“living room window”) and then add that word to the rooms custom slot (so in this case we misuse the cover group as an alias). Be aware of upper/lower case when working with intents and slots (see the note below)!
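If the casing keeps biting you, one option (an untested sketch; it assumes your group entity IDs are all lowercase, which is what Home Assistant generates by default) is to lowercase the slot value in the template, e.g. in the intent script:

      - service_template: cover.close_cover
        data_template:
          entity_id: cover.{{ whichCover | lower }}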

For the Hue scene activation, you can ask Snips to activate any scene that’s available in a room by passing the room name and scene name through a voice command. However, you can’t switch the lights off like this, as there is no “Off” or “Aus” scene. Such a scene can be created using the CLIP tool on your bridge. How to do that is described here:

https://community.home-assistant.io/t/hue-scenes-mimicked-in-ha-and-on-buttons-was-tiles/93570?u=horizonkane

Also scroll down in that thread and read my stupid questions, which were nicely answered by Mariusthvdb, to avoid making the same mistakes as me!
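Alternatively, if you don’t want to create an “Off” scene on the bridge, a separate intent could simply switch the lights off with light.turn_off. This is only a sketch, assuming a hypothetical LichtAus intent with a whichRoom slot and light groups named after your rooms (it is not part of my app yet):

  LichtAus:
    speech:
      type: plain
      text: 'OK, das Licht im {{ whichRoom }} wird ausgeschaltet.'
    action:
      - service: light.turn_off
        data_template:
          entity_id: light.{{ whichRoom | lower }}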

That’s it! After restarting Home Assistant, everything should work now. I hope that, with the scripts and the Snips app creation included, this guide covers some gaps I ran into when trying to understand how all this works together.

Of course I will add the missing code and configs later and publish the Snips app after I put some more stuff in there.

I am also currently installing Mimic TTS, as the default one is not that fancy. Will report back about that, too.

Also on my list: make myself familiar with slot types and custom slots; it seems they can reduce the number of necessary example sentences and words when used right.
Update: covered that topic in the Snips assistant part above.

Have fun and let me know for any problems, questions or improvements :slight_smile:

Edit: As this is currently growing very fast, I plan to publish my HA app and the corresponding intent_scripts soon, maybe in a week or two. German users will then only need to create their own groups that fit the names, or edit it slightly.


I must say that (without having time to read it all right now) this looks very thorough. Many thanks.


Yeah, I did it yesterday and thought I would write it down before I forget everything, to help others who have the same questions as me :slight_smile:

I was looking on the web for a good guide to set up Snips and integrate it with Hass.io, so I’m definitely going to follow your guide. Thanks for taking the time to write it down. :+1:


Love to hear that my effort is already helping somebody :slight_smile:

Hi!

Well first THANK YOU!

As proton999 said before, I was looking for a guide to move from Google Assistant to Snips (avoiding the cloud and keeping the WAF as high as possible).

Moreover, a few days ago Project Alice was released:

Have a look!


That looks really cool. I will definitely have a look at a later time, but for now I will continue to use my very slim and basic setup as creating things myself is the best way to learn a lot about Snips and Home Assistant :slight_smile:

Basically, the availability of predefined modules, and the fact that most how-tos make use of them, was one of the things that prevented me from understanding some important points about Snips and how it interacts with Home Assistant.

Updated the guide with Snips custom slots - very useful :slight_smile:

Hi HorizonKane,

first of all, thanks for that tutorial. To be honest, I got into the same trouble you were in and I am still trying to solve it. Maybe you have an idea what to do.

So my situation is:
0. Hardware: Snips Tool Kit (Pi 3 B+ with all the other parts)

  1. I installed Snips the way it is described in the Snips installation guide: using Etcher to burn the image to the SD card, installing npm on Windows, using SAM to install Snips on the Pi and so on.
    Everything is working, Snips is running and so on.
  2. Afterwards I installed Python 3.7 (I saw later that 3.7 was already installed on the Pi, but by then it was too late).
  3. I installed Home Assistant and it is running, except that I do not get the Hass.io dashboard and store running on the Pi. After that I configured MQTT, and from my perspective it seems to run.
    But currently it seems that only the standard Snips app (Temperature and Humidity) is running, not the Home Assistant app.
    My problem is that I do not know whether it is caused by the code in the Snips apps or by the connectivity between Snips and Home Assistant.

On a second SD card I tried it the other way round:

  1. Etcher with Hass.io
  2. Configuring Hass.io (Configurator, Terminal and SSH)
  3. Here I do not get SAM connectivity to the Pi. I know there is a problem with the key for the SAM connection, but to be honest I got stuck.
  4. Because of the missing SAM connectivity I have, up to now, not been able to install the Snips system.

Any tips from your side? Maybe we could also discuss it via PM in German.
Ciao
DJ

Hi Picasso,

You are trying something different from what I did. I installed Hass.io with Home Assistant on one Pi 4, and Debian Stretch Lite with Snips on another Pi 3. I don’t know the Snips Tool Kit, but I also had problems running the Snips add-on from the Hass.io store when testing: it recognized voice but threw errors. I assumed it was because I use a Pi 4, but never got to the bottom of it.

The Addon Store is only available on Hass.io btw.

The point is: do you really want to run Home Assistant and Snips on one Pi? I like my Home Assistant very stable, and installing extra software there, messing with config files, and running scripts to modify Snips or whatever is not exactly compatible with that need.

For those reasons I stopped trying and created the setup with two Pis; that way I can run Snips and HA both the way I like it most.

I will now order two or three Pi Zeros to act as satellites for my master Snips in other rooms.

If you would like to discuss individual questions in German, we can easily exchange PMs, I guess.

Well, having ONE Pi with HA + Snips will be cheaper. Moreover, my plan is to have the Snips server on the same machine that runs Hass.io, plus many Snips satellites (Pi Zeros, for example), so I can get rid of all the Minis around my apartment.

Okay, that’s up to you of course.

I also failed to connect to the Snips MQTT broker on Hass.io and got authentication errors.

To mitigate this you could try the following:

  • Install MQTT
  • Create a user for mqtt (in Home Assistant)
  • Update configuration.yaml and snips.toml (no idea how to get there in hass.io) as shown in my guide

I added the MQTT portion of my snips.toml to the guide.

Hi HorizonKane,

to be honest, I did not think about running Home Assistant on another Pi up to now, but you are right.
After struggling for days to get it running, it is more convenient to get another Pi and run the system stably on separate Pis. Then I can configure each individually. I think I will try that.
Thanks and greetings from the Ruhrpott
DJ


Yes, it’s worth the money; it saves a lot of your time by reducing stress :wink:

Greetings back from the Ruhrpott :smiley:

I had a big success: Snips gets my intent and delivers it to Hass.io.
The problem is that now I can’t hear any voice.
Which one is supposed to do the talking, Snips or HA?

When I test each individually, they do the talking very well.

I get my reply from HA as I put speech in the intent_scripts:

intent_script:  
  addItem:
    speech:
      type: plain
      text: 'Ist notiert!'
    action:
      - service: shopping_list.add_item
        data_template:
          name: '{{ howMany }} {{ whichUnit }} {{ whichItem }}'

Is that working for you? (If you want to use this example you need a shopping list and an intent with three slots, but you can try it in any intent_script.)

I plan to publish my HA assistant and all the intent_scripts as soon as I have finished implementing some more stuff (thermostats and some other features), but it will only be helpful for Germans, I think ^^

A BIG success

PlaySong:
  speech:
    type: plain
    text: 'OK, Playing song in {{ Location }}'
  action:
    - service_template: media_player.play_media
      data_template:
        entity_id: media_player.master_bedroom_player
        media_content_id: local:track:song.mp3
        media_content_type: music

I triggered a song with my voice.
The only thing that doesn’t work now is the speech part.
Any suggestions?

And, can I shorten the waiting time for my next voice command?
I have to wait until I get this line in the log:

[11:42:13] [Dialogue] session with id 'bf0448f5-5595-4ab8-9891-32e3d1dc52d9' was ended on site default. The session was ended because one of the component didn't respond in a timely manner

And lastly, how do I change the entity so it takes the location dynamically?
When I put {{ Location }} in the intent it doesn’t work.
I guess I don’t know how to define the media player correctly.

PlaySong:
  speech:
    type: plain
    text: 'OK, Playing song in {{ Location }}'
  action:
    - service_template: media_player.play_media
      data_template:
        entity_id: media_player.{{ Location }}
        media_content_id: local:track:song.mp3
        media_content_type: music

media_player:
  - platform: mpd
    host: 192.168.XXX.XXX
    name: master_bedroom_player

You can definitely change it, but be aware that you then also have to start talking sooner after saying “Hey Snips” :wink:

I have not changed that yet, but I think you should find the necessary parameter here: https://docs.snips.ai/articles/platform/platform-configuration

Does anyone have an idea why HA doesn’t do the speech part for me?

When I trigger a TTS test it works fine, so I know that the speaker is fine.

I think this means that Snips talked to Home Assistant but got no response saying that the requested action was done successfully.

Is Location a slot in your PlaySong intent in the Snips assistant?
Did you maintain correct upper / lower case everywhere?
Did you remember to deploy the new assistant / update it on the snips Pi after changing stuff?

Can you please show the whole thing “sam watch” shows when you use that intent?