Forked-daapd and media player services

thx @uvjustin for explaining the integration setting! ( I overlooked that )

So the delay is before the actual text is spoken. I enter the text in the Media Player TTS field and press the Play button, then it takes about half a minute before the text is spoken. Afterwards the system is reset to normal, which is a good thing by the way :slight_smile:

Side note: my only timing reference is between the forked-daapd integration media player and the Chromecast integration media player. I do not have any other TTS-capable devices at home.
The Chromecast one is almost instant.
I would assume forked-daapd needs a bit more time, because it streams to my receiver, which could still be turned off or set to another channel. But the extra delay seems way off.

Is there any documentation for all this? I would like to set up simple TTS to a HomePod, so that I can send alerts when the alarm is pending and needs to be turned off. Glad to hear that all this work made it into official Home Assistant core, but there is no information on how to use it.
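
For what it's worth, I imagine the basic call would be something like the sketch below, assuming the google_translate TTS platform is configured and the HomePod is enabled as an output in forked-daapd (the entity name and message are just examples):

# Hypothetical TTS call: speak an alert through the forked-daapd
# media player, which streams to whichever outputs are enabled
# (e.g. an AirPlay HomePod). Swap in whatever TTS service you use.
service: tts.google_translate_say
data:
  entity_id: media_player.forked_daapd_server
  message: "The alarm is pending and needs to be turned off."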

@davidlb @uvjustin can you explain where library:track:25522 is coming from? How do we actually play a file from our library in forked-daapd via Home Assistant?

For anyone else, the way to play media is to look up the URI from the forked-daapd API. For example:

http://192.168.10.10:3689/api/library/albums/8029616960692122489/tracks, where the long number is the album id. That will return a list of tracks for that album. You can then use one of the returned track URIs in a service call like this:

entity_id: media_player.forked_daapd_server
media_content_id: 'library:track:3'
media_content_type: music
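
For completeness, a sketch of the same call wrapped in a script so it can be triggered from an automation (the script name is made up; the service is the standard media_player.play_media):

# Hypothetical script: play one track from the forked-daapd library.
# The URI comes from the /api/library/albums/<album_id>/tracks lookup above.
play_forked_daapd_track:
  sequence:
    - service: media_player.play_media
      data:
        entity_id: media_player.forked_daapd_server
        media_content_id: 'library:track:3'
        media_content_type: music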

@uvjustin I have the same problem where the volume is immediately set to 80% before the music plays.

Hi guys,

Not quite sure how to run the media player component. The examples I see are related to the media_content_id, but I don’t know where that is obtained.

Also, in the example above, there is reference to the library and the track, but not the album. How does the player know which track to play?

I’d like to queue up a radio stream. Anyone know where the media_content_id can be found for a radio stream? It will play when I click on one of the radio stations I’ve set up under Music > Radio. Can I play that using the media player?

Also, can entire albums be played, for example using something like :album: instead of :track:?

I’m at a bit of a loss to understand how the media player interacts with forked-daapd. I want to be able to script a track for playback.

From the API I can see that a call to api/library/tracks/62 gives me:

{ "id": 62, "title": "Not Over Yet", "title_sort": "Not Over Yet", "artist": "Grace", "artist_sort": "Grace", "album": "Savage Meltdown Vol. 1", "album_sort": "Savage Meltdown Vol. 00001", "album_id": "6123511396315858213", "album_artist": "Gavin Campbell", "album_artist_sort": "Gavin Campbell", "album_artist_id": "1702076477525592772", "genre": "Electronica", "year": 1995, "track_number": 1, "disc_number": 1, "length_ms": 696064, "rating": 0, "play_count": 0, "skip_count": 0, "time_added": "2021-01-31T08:35:59Z", "seek_ms": 0, "type": "m4a", "samplerate": 44100, "bitrate": 275, "channels": 2, "media_kind": "music", "data_kind": "file", "path": "\/music\/itunes music\/Savage Meltdown Vol. 1\/01 Not Over Yet.m4a", "uri": "library:track:62", "artwork_url": "\/artwork\/item\/62" }

I gather the “uri” is used for track selection. Within Developer Tools / Services I call media_player.play_media:

entity_id: media_player.forked_daapd_server
media_content_id: 'library:track:62'
media_content_type: music

I’ve tried this with and without “&” in the media_content_id field.

But the track does not play. Looking at the forked-daapd web UI I see the player goes briefly to “shair” (I assume that’s the shairport-sync pipe?). Then it resumes playing the previous track.

I am able to play playlists via the custom:mini-media-player Lovelace card, so the connection seems to be working.

What am I missing? Is there another way to play a track? Say with MPD?

Any help appreciated.

The current version of the integration does not handle content_id and supports only TTS. I created a fork for my own use by removing TTS support and adding content_id handling. If you have HACS, you can add this custom repo: https://github.com/davidlb/ha_forkeddaapd
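
If you prefer not to install a fork, another possible workaround (untested here; check the endpoints against your forked-daapd version) is to bypass the integration and call the forked-daapd JSON API directly with rest_command:

# Hypothetical configuration.yaml snippet: queue a track and start
# playback via the forked-daapd JSON API. Host, port and track id
# are examples.
rest_command:
  daapd_queue_track:
    url: "http://192.168.10.10:3689/api/queue/items/add?uris=library:track:62"
    method: post
  daapd_play:
    url: "http://192.168.10.10:3689/api/player/play"
    method: put

Calling rest_command.daapd_queue_track followed by rest_command.daapd_play from a script should queue and start the track without going through the integration.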

Great, I’ll check that out, thanks.

Can’t it be added to the official integration?

@uvjustin is/was working on it to support both TTS and content_id handling. I don’t know what the status is.

@uvjustin I definitely think this topic is worth reviving!

I ran into the same issue initially described by @davidlb: calling .play_media activates all zones and sets the volume to 80%. I realise this is not the original issue posted by @squirtbrnr, but in any case it has been discussed in this thread in detail.

Now, I realize that @davidlb published his “quick and dirty” (his words :slight_smile:) hack as a fork, for which I am grateful, since it solves the bug :+1:

However! Why isn’t your own branch, which offers a more comprehensive solution (and also browse_media functionality :exploding_head:), incorporated into core??

Personally, I’m not looking for browse-media functionality since my use of OwnTone is via pipe and stream URIs. So I guess if splitting this into two separate PRs would expedite its incorporation into core, it’s worth a thought. All in all, this would be very useful, and at least all the people involved in this thread would be interested in it!

Thank you for this great component :clap:

Sorry all, due to various reasons I was not able to keep up with this component. I’ve recently taken a look at it again and will try to update it over the coming month.

I’ve pushed a few changes to the following branch: uvjustin/home-assistant at update-forked-daapd (github.com). You can download the files from homeassistant/components/forked_daapd and place them in your local custom_components/forked_daapd folder. The initial changes include supporting the announce/enqueue service calls and updating the media browser functionality. Please try it out when you get a chance, and feel free to provide some feedback so I can further improve the component.
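
For anyone trying the branch, a sketch of what a call using the new options might look like (the values are examples; announce and enqueue are the standard Home Assistant play_media parameters referred to above):

# Hypothetical test call for the update-forked-daapd branch.
# enqueue controls how the item is added to the queue instead of
# replacing it; announce: true interrupts playback for a short
# clip (e.g. TTS) and then restores the previous state.
service: media_player.play_media
data:
  entity_id: media_player.forked_daapd_server
  media_content_id: 'library:track:62'
  media_content_type: music
  enqueue: add
  announce: false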

I’ve tested the update as a custom_component as suggested.
For TTS the experience is pretty much in line with this discussion thread; however, for the other Browse Media options (e.g. radio streams), overriding the current OwnTone output on/off and volume settings is probably not what the average user would like to see.

Thanks for testing it out and leaving a comment.
Are you sure you’re using the update-forked-daapd branch? It should not be doing anything with the volume for the other browse media options - it only does that for the options that use the ATTR_MEDIA_ANNOUNCE flag such as TTS.

Just verified it with an .mp3 file residing in ‘/media’. On pressing Play in the media browser, all six players defined in OwnTone are turned on and the volume is cranked up. The kids love it :smiley:

I see. I realized that for some reason putting the folder in custom_components doesn’t override the integration. I think this is probably what you are seeing.
If you are brave enough, can you replace the contents of the original /usr/src/homeassistant/homeassistant/components/forked_daapd folder with the files from my branch? That should work. You will be able to tell that it worked by opening up the media browser with your OwnTone instance selected as an output - in addition to the other media sources available, you should also see a bunch of folders with OwnTone’s own media listed there.

This one has been solved here.
Adding the version tag to manifest.json helps load the custom_component correctly. I can see the OwnTone folders in Browse Media. I’ll test the audio functionality in ~1h :sleeping_bed: :sleeping:

Oooh yes :slight_smile: