Trying to get camera.play_stream working on my LG Smart TV, using Google Chromecast Ultra. Original video source is a Dahua IP camera, connected to Blue Iris. Camera definition in Home Assistant points to Blue Iris computer. Home Assistant displays snapshot images and video streams perfectly. Unfortunately, using the Developer Tools / Services menu to test camera.play_stream does not work properly on the Chromecast.
It’s evident Home Assistant is communicating with the Chromecast device, because the Cast icon appears and the scroll bar across the bottom of the screen scrolls twice. Then it stops, and the Cast icon is all that remains. Here are the relevant bits from configuration.yaml:
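(The configuration snippet itself didn’t survive in this post. Purely as an illustration, and not the poster’s actual config, a Blue Iris-backed camera of this era together with the stream component might be defined roughly like this; the host, ports, and camera short name are placeholders, and exact Blue Iris URLs vary by setup:)

```yaml
# Hypothetical illustration only - host, ports, and camera short name
# ("cam1") are placeholders; exact Blue Iris paths depend on its settings.
stream:

camera:
  - platform: generic
    name: driveway
    still_image_url: http://192.168.1.50:81/image/cam1
    stream_source: rtsp://192.168.1.50:554/cam1
```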
The Chromecast Ultra has a 192.168.1.xxx LAN address. This is white-listed in the Blue Iris server, so it should provide HLS video streams with no need for authentication by the Chromecast. Should I also whitelist some external WAN address for Google?
The Chromecast Ultra was automagically detected by Home Assistant, and appears in Configuration / Integrations as a ‘Google Cast’ device. Do I still need to add anything to configuration.yaml?
One last thing. Using VLC 3.0.8 Vetinari from my Windows laptop, the stream_source listed above casts perfectly well to the Chromecast, so I know the device is detected, configured, and working properly. Just not from Home Assistant.
Thank you for any and all assistance.
I’d really like for this HA feature to start working.
Looks like I’ll need to plug my laptop’s LAN port into the switch with the Chromecast Ultra and run Wireshark to see what’s being sent back and forth between Home Assistant, Blue Iris, and Chromecast. Believe me, when I find the source of this difficulty, it will be documented and posted here.
That was with HA 0.98.4. Behavior has changed slightly for the better with HA 0.98.5, but still no joy. The scroll bar rolls four times, and the Cast icon no longer reappears. Unfortunately, the screen is still black/dark.
I have this setup working successfully, as a security-camera monitor on a Chromecast.
It isn’t the best solution, because the Chromecast stops playing every 20 minutes. I worked around this with automations.
If you have any better way to display this on a Chromecast, that would be really nice. I am willing to help with testing.
I found what may be the source of my particular problem. Whatever inside Home Assistant calls pychromecast is not including my HA server’s TCP/IP port number in the URL passed to the Chromecast device. I’m still waiting to see if any dev picks up the GitHub issue, or comments on it.
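(If the port really is being dropped from the generated URL, one thing worth checking — an assumption on my part, not a confirmed fix — is whether `base_url` under `http:` in configuration.yaml includes the port explicitly, since the stream URL handed to the Chromecast is built from whatever HA believes its reachable URL to be:)

```yaml
# Hypothetical example - the hostname is a placeholder.
# Including the port explicitly in base_url ensures it appears in
# URLs HA generates for external players.
http:
  base_url: https://hass.example.com:8123
```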
While I’m not using Blue Iris (using Hikvision NVR & cams w/ generic camera config), I had similar issues streaming to Chromecast devices (Nest Hubs) – both directly and via Google Assistant integration.
In the end I finally got it working by changing some video stream config parameters on the cameras themselves. The two that seemed to make the difference for me w/ Hikvision were using “high profile” vs “main profile” and setting SVC (Scalable Video Coding) to “auto” which was disabled. I also adjusted the iframe interval, but don’t think that had an effect.
Anyway, if you haven’t tried it yet, it may be worth messing w/ config on one of the camera streams. Before I did, the streams worked fine in VLC and HA via browser, but would never work via Chromecast. Now even my 1080p main streams work on the little Nest Hubs.
Adding the :8123 TCP/IP port number to my configuration.yaml file did not resolve the issue. I may have received a new update/fix this evening, but haven’t had time to try it yet. News will be posted if it suddenly starts working.
I’m amazed this topic is not getting more attention and that more people aren’t talking about this issue. Perhaps some people have their RTSP feeds streaming to Chromecast without issue?
I have used a docker container in the past to re-stream RTSP feeds as HLS but this no longer works. No problem I thought, as it appears the “Stream” component will handle this and on demand too, so as not to take any extra CPU when it’s not being used.
I can confirm this does not work though (I mean casting to the Chromecast, not the Stream component itself), and despite signing up for a Chromecast developer account I’ve only been able to debug from the pychromecast end. I’ve tried to use Wireshark, but all the conversations are encrypted. I have a working HASS.IO installation with Let’s Encrypt on port 443, and this creates the correct https://hassiourl/api/hls/<guid>/playlist.m3u8 endpoint. It can even be viewed fine with VLC on a Windows desktop after being initiated from the camera.play_stream service. The Chromecast, however, looks like it’s about to play and then the screen just goes blank. I’ve tried different camera feeds and settings.
The debug log shows the playerState as IDLE, LOADING, BUFFERING, and even briefly PLAYING, but then it goes back to IDLE with a detailedErrorCode: 100?? I believe this may point to the content type, which in the play_stream service is set as just “hls” (nothing else seems to be accepted; I’ll have to check this) and in the debug log seems to equate to “application/vnd.apple.mpegurl”. I know Chromecasts are picky about CORS, and I do have this set up to enable Lovelace streaming, but I have tried removing it altogether; adding a wildcard does not seem to work, nor does the local IP of the Chromecast, etc.
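(For reference, the CORS setup mentioned above is normally done with `cors_allowed_origins` under `http:`; a minimal sketch, with the origin shown being the one Lovelace casting uses:)

```yaml
# Allow the Home Assistant cast origin to fetch resources cross-origin.
http:
  cors_allowed_origins:
    - https://cast.home-assistant.io
```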
It’s also worth pointing out that I can send the RTSP streams (mainly Hikvision cameras) to the Chromecast using VLC or Android IP-camera apps, just not using Home Assistant or the HLS re-streamers that have worked before, as mentioned in the forums here. What is going on? I’m sure it’s something really simple, but without Wireshark or much of a clue from the pychromecast debug output, I’m stuck at the moment!
I’ve reached my limit on time to research this for the moment, so I thought someone else might have some ideas; if not, at least this might help quantify the problem for a later discussion or diagnosis.
I am also trying to cast my Hikvision cam with no success (from cast.home-assistant.io).
I am able to cast my other views, but not my “door” view, which is this picture-entity card:
aspect_ratio: 100%
camera_image: camera.front_door
camera_view: live
entity: camera.front_door
type: picture-entity
It shows a blank screen with a gray bar displaying the entity name and ‘idle’ (bad pics, sorry). When I set it to panel mode, it’s just a white blank screen.
I changed the settings to what worked for you, but I am curious whether I am missing something that let you be successful?
Thanks for your time.
Also, what was the data/command you used to cast from the dev tools?
I currently only use the cameras with cast devices in a few ways:
One is using the binary_sensor Hikvision feature for HA, where I cast a snapshot for 30 seconds when tripwire events occur. So that would technically be casting still images which are grabbed first.
Another way is that I have Google Assistant integrated with HA, so saying something like “show the front door” will stream that camera from HA.
Finally, I have a few scenarios where a camera is cast directly to a player using camera.play_stream.
So technically I’m not casting Lovelace views that would include a camera stream card, I’m casting the actual stream or a still taken from it. My gut feeling is that you might be running into something related to that.
So it may be worth trying to cast one of your cameras directly to a media player first to check that out. This would be an example service call:
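(The example service call was not included in the post; as a hedged sketch, with the camera and media player entity IDs being placeholders for your own, a direct call from Developer Tools might look like:)

```yaml
# Hypothetical entity IDs - substitute your own camera and cast target.
service: camera.play_stream
data:
  entity_id: camera.front_door
  media_player: media_player.living_room_tv
  format: hls
```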
I would like to know how you saved a still image based on an event (button press) and cast it to your media player… I think this is a better solution, because the still image has all the info I need (who’s at the door), and I am guessing it will cast faster.
My automations for that have evolved a lot, currently calling in my own custom Python scripts to abstract portions and allow for scripts to run simultaneously. So pasting everything I have currently would be a confusing mess.
But originally I had a pretty plain automation that basically would trigger based on a binary sensor like:
trigger:
  - entity_id: binary_sensor.driveway_line_crossing
    platform: state
    to: 'on'
Then it would have an action that would save the image, cast it, then stop casting after 30 seconds like:
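(The action block itself is missing from the post; a hedged reconstruction of the steps described — save a snapshot, cast it, wait 30 seconds, stop — might look like the following, with all entity IDs, hostnames, and file paths being placeholders:)

```yaml
# Hypothetical sketch - entity IDs, hostname, and paths are placeholders.
action:
  # Save a still from the camera somewhere web-accessible under /config/www
  - service: camera.snapshot
    data:
      entity_id: camera.driveway
      filename: /config/www/tmp/driveway.jpg
  # Cast the image; the ?ts= parameter defeats image caching on the device
  - service: media_player.play_media
    data_template:
      entity_id: media_player.kitchen_display
      media_content_type: image/jpeg
      media_content_id: "https://hass.example.com:8123/local/tmp/driveway.jpg?ts={{ now().timestamp() | int }}"
  # Show it for 30 seconds, then stop casting
  - delay: '00:00:30'
  - service: media_player.turn_off
    data:
      entity_id: media_player.kitchen_display
```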
In a nutshell it would save a temporary image locally somewhere inside /config/www so that it’s web accessible. However you must also whitelist that directory for web access in configuration.yaml like:
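(That whitelist entry — assuming the same placeholder path as above — would look something like this; /config/www is already served at /local/, but services like camera.snapshot also need the directory whitelisted to write there:)

```yaml
# Hypothetical path - match whatever directory your automation writes to.
homeassistant:
  whitelist_external_dirs:
    - /config/www/tmp
```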
Then that URL is used to cast the image to the device. I’m going off memory on the code above, so there could be quirks. But that’s the general idea. A couple other notes:
I added that fake timestamp parameter to media_content_id because (at least on my Nest devices) they were sometimes caching the image. It just makes them think the image is “new” each time.
I think the URL may need to be https, although I could be wrong. That may have been for the Google Assistant integration.
I would like to thank you very much for your help and time.
I have now managed to get my first DIY project working, thanks to you. I build my own smart doorbells from ESP chips and relays; I converted your example to Python (AppDaemon), and now they chime and cast the cameras to my TVs for 30 s before shutting off.
I’m really excited, if you can’t tell.
My next step will be to create these still images and send them to the companion app.
That’s awesome! I’m just wrapping up my first custom Home Assistant project also, adding support for my older lighting and security systems via HA’s MQTT option. It’s currently adding over 200 entities for me to play with, so I’m pretty excited about HA right now too (migrating my house from SmartThings for fast, reliable local control).
If you haven’t already done it, sending images to the companion app is very simple. I use Pushover currently, but this is a previous sample from when I used the companion app for iPhone:
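(The sample itself wasn’t included; a hedged sketch of an iOS companion-app notification carrying an image attachment — the notify service name depends on your registered device, and the URL is a placeholder:)

```yaml
# Hypothetical service name and URL - substitute your own device and host.
service: notify.mobile_app_iphone
data:
  title: Front door
  message: Motion at the front door
  data:
    attachment:
      url: https://hass.example.com:8123/local/tmp/driveway.jpg
      content-type: jpeg
```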