HA + MA no voice integration working, throws error

I’ve got Music Assistant working. I have Home Assistant with Voice Assistant working. Unfortunately, HA voice can’t trigger actions in MA.

Config notes: running Docker on Linux, fully updated for both HA and MA. The integration is installed in HA, but for some reason I don’t get the Music Assistant side-menu entry that people seem to talk about. Controlling MA devices using dashboard media cards works fine. MA finds my HA music players and lets me stream to them from the MA interface. I’ve followed the MA configuration options for integrating the voice assistant in HA by adding the two files from GitHub to the HA config/custom_sentences/en/ folder, as per the instructions.

First, here’s what the voice assistant does. You can see it talks to devices fine, just not to MA:

Screenshot from 2024-12-16 14-38-33

The HA log generates an “Unknown intent MassPlayMediaAssist” error:

2024-12-16 14:39:06.213 ERROR (MainThread) [homeassistant.components.assist_pipeline.pipeline] Unexpected error during intent recognition
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/assist_pipeline/pipeline.py", line 1078, in recognize_intent
    conversation_result = await conversation.async_converse(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<7 lines>...
    )
    ^
  File "/usr/src/homeassistant/homeassistant/components/conversation/agent_manager.py", line 110, in async_converse
    result = await method(conversation_input)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/conversation/entity.py", line 47, in internal_async_process
    return await self.async_process(user_input)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/conversation/default_agent.py", line 368, in async_process
    return await self._async_process_intent_result(intent_result, user_input)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/conversation/default_agent.py", line 441, in _async_process_intent_result
    intent_response = await intent.async_handle(
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<10 lines>...
    )
    ^
  File "/usr/src/homeassistant/homeassistant/helpers/intent.py", line 121, in async_handle
    raise UnknownIntent(f"Unknown intent {intent_type}")
homeassistant.helpers.intent.UnknownIntent: Unknown intent MassPlayMediaAssist


Have I missed a step somewhere?

Can you post your config files?

What config files? Other than the two voice-integration files for MA, everything for this is configured through the GUI. The only two files I added are the two from GitHub linked in the instructions here: Voice Control - Music Assistant

You added the custom_sentences.yaml from the MA GitHub, but you will also need to create a custom intent handler for triggering the music_assistant.play_media service. There are no built-in intents for music search, only media_player controls.
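For the built-in Assist agent, a custom intent handler is usually declared with `intent_script` in configuration.yaml. Here is a minimal sketch, assuming the intent name `MassPlayMediaAssist` from the error above and a wildcard `query` slot — the exact slot names depend on your copy of the MA sentences file, and the entity ID is just a placeholder:

```yaml
# configuration.yaml — hypothetical sketch, not the official MA setup.
# The key must exactly match the intent name in custom_sentences/en/.
intent_script:
  MassPlayMediaAssist:
    action:
      - service: music_assistant.play_media
        target:
          # Assumption: a hard-coded player for testing; resolving the
          # spoken player name to an entity_id is a separate step.
          entity_id: media_player.living_room
        data:
          media_id: "{{ query }}"
    speech:
      text: "Playing {{ query }}"
```

Slots matched by the sentence template are exposed as Jinja variables inside `intent_script`, so `{{ query }}` carries whatever the wildcard captured.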

There’s nothing in the Music Assistant setup instructions saying that you would have to do that, or exactly how to do it. It only talks about installing the intents file and the responses file, and even calls that setup “Easiest”. I got the impression that the additional intent handlers were provided by the integration. I’m getting the impression that Music Assistant isn’t quite ready for the average Joe with only 3 years of HA experience.

Music Assistant did build out the ability to search all your providers’ content from HA with the music_assistant.play_media service. It all looks great and should work in theory, and the automation seems like it would be straightforward, but as I posted earlier this morning, I have not been able to get this to work (I’m no pro HA user, though).

https://community.home-assistant.io/t/custom-intents-and-music-assistant/811943

I started the same way you did, by bringing in the Music Assistant-provided sentence file, and tried building out the intent based on the slots, but I could not get it to work. Now I’ve eliminated all the bells and whistles and I still can’t get this thing to play music from a voice command.

Well, at least I’m not the only one, lol. With all the hype around it and nothing but positive vibes, I was expecting it to work. Yes, I expect a bit of pain every time I do something new with HA, but oh well. This was the last thing I was working out in order to replace my Google Home infrastructure entirely with in-house hardware.

I’ll let you know if someone rains down some knowledge or if I stumble into a solution.

I will say, if you haven’t tried it yet, the M5Stack Atom is awesome. The speaker on it is unusable, but I’ve ordered a DAC board to give me a line out.

Then again, there are the rumors of voice hardware coming…

I made some progress. It turns out the native MA integration doesn’t include what’s needed for voice to work. Switching to the HACS version of it made the error stop, but now it gives me:

2024-12-16 17:10:07.091 WARNING (MainThread) [homeassistant.helpers.service] Referenced entities media_player.office_speaker are missing or not currently available
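Before ripping anything out, it may be worth checking what the entity is actually called under each integration — the Core and HACS versions can register different entity IDs for the same player. A quick check in Developer Tools → Template (the entity ID here is just the one from the warning):

```yaml
# Does the entity exist at all, and what state is it in?
{{ states('media_player.office_speaker') }}

# List every media_player entity ID to spot a renamed or suffixed variant:
{{ states.media_player | map(attribute='entity_id') | list }}
```

If the first template returns `unknown`, the integration likely registered the player under a different ID than the one your intent references.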

Tomorrow I might try ripping it all out and starting over with just the HACS integration to see where I get, but clearly there’s a major difference between the two.

I thought about going that route, but they put out a warning deprecating the HACS integration and moving everything over to Core after 2024.12. Let me know if it works well, but it’s probably a temporary solution.

“Yes, in about a month.”

I’m having the same issue, and I could not get it to work with either the HACS or the HA integration.

Now it’s working using the HACS component.

I’m also trying to play tracks on my new Voice Assistant Satellite using voice and MA.

First I tried it directly - to no avail.

Then at some point I moved back to the HACS integration. The best I could get with that was a response saying it would now start the song, but it never actually started.

Moved back to the new integration and tried playing around with setting up an intent (my first one actually, so I barely have an idea what I’m doing here):

config/custom_sentences/de/music_assistant_PlayMediaOnMediaPlayer.yaml

language: "de"
intents:
  MusicAssistantPlayMediaOnMediaPlayer:
    data:
      # TARGET AN AREA
      - sentences:
          - "<abspielen> {query};im [der ]<bereich> [((mit)|(unter Verwendung von)) {radio_modus}]"
        expansion_rules:
          abspielen: "((spiel(e)?)|(höre))"
          # <bereich> is referenced but was never defined; map it to the built-in {area} list
          bereich: "{area}"

      # TARGET A NAME
      - sentences:
          - "<abspielen> {query};<auf> [dem ]{name} [<wiedergabegeräte>] [((mit)|(unter Verwendung von)) {radio_modus}]"
        expansion_rules:
          abspielen: "((spiel(e)?)|(höre))"
          wiedergabegeräte: "((Lautsprecher)|([Medien-]Player))"
          "auf": "(auf|über)"
        requires_context:
          domain: "media_player"

      # TARGET AN AREA AND A NAME
      - sentences:
          - "<abspielen> {query};im [der ]<bereich> <auf> [dem ]{name} [<wiedergabegeräte>] [((mit)|(unter Verwendung von)) {radio_modus}]"
        expansion_rules:
          abspielen: "((spiele)|(höre))"
          # <bereich> is referenced but was never defined; map it to the built-in {area} list
          bereich: "{area}"
          wiedergabegeräte: "((Lautsprecher)|([Medien-]Player))"
          "auf": "(auf|über)"
        requires_context:
          domain: "media_player"

lists:
  query:
    wildcard: true
  radio_modus:
    values:
      - "Radiomodus"

config/intents.yaml

intents:
    - spec:
        name: play_track_on_media_player
        description: Plays any track (name or artist of song) on a given media player
        parameters:
          type: object
          properties:
            track:
              type: string
              description: The track to play
            entity_id:
              type: string
              description: The media_player entity_id retrieved from available devices. 
                It must start with the media_player domain, followed by dot character.
          required:
          - track
          - entity_id
      function:
        type: script
        sequence:
        - service: music_assistant.play_media
          data:
            media_id: '{{track}}'
            media_type: track
          target:
            entity_id: '{{entity_id}}'

But no matter what I do, when not using the HACS version, it always tells me that the player is not paused and can’t play anything. I assume that with the pre-existing intents it is only able to pause and resume. I have no idea why it is not picking up my custom intents.

I might add that when debugging the intent with Developer Tools → Assist, it states that it is using the HassTurnOn intent - which obviously won’t work. I’m telling it something like “Spiele XY von Z” (“Play XY by Z”), which should be covered by my custom intent?

I would have assumed that this somehow works out of the box, but it does not, and I’m puzzled about how to make it work. And why did it seem to work with the HACS integration but not with the new one?
Any help is greatly appreciated! 🙂