M5Stack atom help please

Today I took delivery of an M5Stack Atom. The box was marked ESP FW pre-installed for Home Assistant. I’ve gone through the whole install process and everything appears to be working fine - the device itself, ESPHome, Whisper, the pipeline, openWakeWord etc. I had one problem in the early stages: as I don’t use Chrome as my browser, the active window closed during the ESPHome install, which meant I had to do the ESPHome connection settings manually (using the IP address atm, until I work out the right way) - but I don’t believe that is relevant.
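In case it helps anyone in the same boat: when mDNS discovery isn’t working and you’re falling back to the IP address, one common workaround is to pin a static IP in the device’s ESPHome config. This is just a sketch - the SSID secrets and all the addresses below are placeholders you’d need to adapt to your own network:

```yaml
# Hypothetical ESPHome fragment: give the Atom a fixed address so
# Home Assistant can always reach it by IP, even without mDNS.
# Replace the addresses with values valid for your own network.
wifi:
  ssid: !secret wifi_ssid
  password: !secret wifi_password
  manual_ip:
    static_ip: 192.168.1.50   # address the device will claim
    gateway: 192.168.1.1      # your router
    subnet: 255.255.255.0
```

After flashing, you can point the ESPHome integration in HA at that fixed IP and it should stop mattering whether discovery works.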

As per the install process I tried OK Nabu but nothing happened. Pressing the button does nothing either. I was expecting the LED to go blue. HA sees the button entity is pressed so it seems to be properly connected.

I’m on the latest HA releases.

I’m hesitant to factory reset until I’ve aired the problem here to see if anyone has a better idea!

Thanks in advance.

You can test the wake word under HA >> Settings >> Voice assistants.

Select your voice assistant and you can test there.

Also, did you set up and train the wake word?

Thanks. I left the wake word as the standard OK Nabu. Funnily enough, it has just responded to my voice (still not the button, but that isn’t necessary).

It doesn’t reliably understand my intent, but at least it responded, so problem resolved. I’m sure training will be required at some point, though other than setting up my own wake words I can’t at the moment see how.

Just tried it again: it recognises OK Nabu from me but not my partner. It is a bit hit and miss about understanding my request, even if I say it super slowly and clearly. Does that improve with training? Or age?

No, no, the training is meant for you - you’ll learn how your assistant understands you and speak that way. :rofl: Trust me, this gets funny next time in the supermarket. :rofl: :rofl: Sorry, couldn’t resist! :upside_down_face:

I had a similar experience with my first tries, but it got better over time. One thing really is, you’ll learn how it understands you and you’ll adapt your voice or phrasing to that. Keep in mind, the system is built to understand you best if you just speak normally. People tend to try to speak extra clearly or things like that - in my experience it’s best to talk to it like I’d talk to a person.

Another thing is where the Echo is located. In my environment, it has a lot of “noise” to filter if I turn it in the direction of the TV. 45° to the left or to the right does wonders.

I have another one in the office, located directly under my monitor; that one always reacts as I’d expect.

What I want to say is, try moving it around a little and see if that helps. Sometimes more distance is better. Sometimes the other edge of the table is good. And give yourself some time. Trial and error. :laughing:

I spent some time trying to improve it yesterday and it is a little better, but nowhere near as good as I want it to be. I’ll keep trying as I love the concept.

I think with Google Assistant you had to say some phrases to get it used to your voice, so it could distinguish between people. Is that required here?

The issue is the microphone in these, not the programming or how you speak.

The loudspeaker is naff too. I suppose as a first step into voice assistants on HA it is good though, and better kit would cost a lot more.