Since you’re going to stream to a media_player device, you can indeed simply turn off or stop that media_player.
I do use the turn off, but nothing happens.
The following code doesn’t seem to work.
- alias: '[Frontyard] Cube Shake - Stream Camera'
  trigger:
    - platform: event
      event_type: xiaomi_aqara.cube_action
      event_data:
        action_type: shake_air
        entity_id: binary_sensor.cube_xxxxxxxxxxxx
  action:
    - service: media_player.turn_on
      entity_id: media_player.chromecast_kitchen
    - service: camera.play_stream
      data:
        entity_id: camera.drivein
        media_player:
          - media_player.chromecast_kitchen
    - delay:
        seconds: 30
    - service: media_player.turn_off
      entity_id: media_player.chromecast_kitchen
It does turn on the Chromecast and then the screen shows the cast icon, but that’s about it.
No stream, and it certainly doesn’t run the turn_off command. Afterwards I can manually click turn off in my Lovelace UI and that command is followed.
I see this error though in my log:
Error while executing automation automation.frontyard_cube_shake_stream_camera. Unknown error for call_service at pos 2:
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/homeassistant/components/automation/__init__.py", line 380, in action
    await script_obj.async_run(variables, context)
  File "/usr/local/lib/python3.7/site-packages/homeassistant/helpers/script.py", line 131, in async_run
    await self._handle_action(action, variables, context)
  File "/usr/local/lib/python3.7/site-packages/homeassistant/helpers/script.py", line 210, in _handle_action
    action, variables, context)
  File "/usr/local/lib/python3.7/site-packages/homeassistant/helpers/script.py", line 299, in _async_call_service
    context=context
  File "/usr/local/lib/python3.7/site-packages/homeassistant/helpers/service.py", line 88, in async_call_from_config
    domain, service_name, service_data, blocking=blocking, context=context)
  File "/usr/local/lib/python3.7/site-packages/homeassistant/core.py", line 1138, in async_call
    self._execute_service(handler, service_call))
  File "/usr/local/lib/python3.7/site-packages/homeassistant/core.py", line 1160, in _execute_service
    await handler.func(service_call)
  File "/usr/local/lib/python3.7/site-packages/homeassistant/helpers/entity_component.py", line 188, in handle_service
    self._platforms.values(), func, call, service_name
  File "/usr/local/lib/python3.7/site-packages/homeassistant/helpers/service.py", line 314, in entity_service_call
    future.result()  # pop exception if have
  File "/usr/local/lib/python3.7/site-packages/homeassistant/helpers/service.py", line 330, in _handle_service_platform_call
    await func(entity, data)
  File "/usr/local/lib/python3.7/site-packages/homeassistant/components/camera/__init__.py", line 644, in async_handle_play_stream_service
    keepalive=camera_prefs.preload_stream)
  File "/usr/local/lib/python3.7/site-packages/homeassistant/components/stream/__init__.py", line 56, in request_stream
    raise HomeAssistantError("Stream component is not set up.")
homeassistant.exceptions.HomeAssistantError: Stream component is not set up.
But since I added the Ubiquiti Protect camera as a generic camera, as described in the guide, I shouldn’t need the stream component, as far as I could read?
What have I messed up?
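If I read that traceback right, camera.play_stream always goes through the stream integration regardless of how the camera itself is defined, so the error just means that integration isn’t loaded. A minimal sketch of enabling it in configuration.yaml (recent releases already pull it in via default_config:, so this only matters if you don’t use that):

# configuration.yaml - enable the stream integration used by camera.play_stream
stream: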
I get the same error as above - homeassistant.exceptions.HomeAssistantError: Stream component is not set up when trying to launch via automation.
I can display the still image in the web UI in a picture card and when I click it I get a pop up with the streaming video, so I know that the video itself does work.
Camera is a Hikvision set up as generic; tried H264 and mjpeg.
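For reference, a Hikvision defined as a generic camera with an RTSP stream_source typically looks something like the sketch below. The RTSP path (/Streaming/Channels/101), the ISAPI snapshot URL, the secret key names, and the USER/PASS/CAMERA_IP placeholders are assumptions based on the usual Hikvision defaults, not taken from an actual working config:

camera:
  - platform: generic
    name: Front Door
    # snapshot via the usual Hikvision ISAPI path (adjust to your firmware)
    still_image_url: 'http://CAMERA_IP/ISAPI/Streaming/channels/101/picture'
    # main stream; credentials go in the RTSP URL itself
    stream_source: 'rtsp://USER:PASS@CAMERA_IP:554/Streaming/Channels/101'
    authentication: digest
    username: !secret hikvision_user
    password: !secret hikvision_pass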
Has anyone had luck turning their TV off with the Chromecast? Currently, when I use media_player.turn_off it just stops whatever is playing on the Chromecast and goes back to the screensaver. But if I give the command directly to Google ("Hey Google, turn off Chromecast XXX") it turns off completely.
Is that not possible through HA?
Trying to send the stream to Kodi gets me this:
Error calling async_play_media on entity media_player.kodi: TransportError("Error calling method 'Player.Open': Transport Error", TimeoutError())
So when I ask Google to stream one of my cameras to my Android TV, this is what I’m getting. Any suggestions?
It’s probably something in your setup. Please post your YAML here so we can troubleshoot.
Did you get this automation completed using the Harmony Hub? If so could you post it or send a link?
Thanks.
I actually never did. The concept was good, but in real life it was more annoying than needed for my setup.
I use Kodi as the media player to watch movies. It will automatically pause the movie, and you can create an HA automation, triggered by a sensor, to play the camera stream on Kodi; see the sketch below.
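Something like this, as a rough sketch: the motion sensor entity is hypothetical, and the camera and Kodi entity IDs are the ones used earlier in the thread.

- alias: 'Play driveway camera on Kodi on motion'
  trigger:
    - platform: state
      entity_id: binary_sensor.front_door_motion
      to: 'on'
  action:
    # Kodi pauses whatever is playing and shows the camera stream
    - service: camera.play_stream
      data:
        entity_id: camera.drivein
        media_player: media_player.kodi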
Joining the conversation here, because I’d like to do this with my LG WebOS-based television. I’m not certain how to configure it to receive a network-supplied ‘cast’ stream.
Later edit: After researching the LG apps web site, I decided to use a Google Chromecast device on an HDMI input, instead. I have already confirmed I can automate the selection of a particular HDMI input, so the Chromecast should make it quite a bit easier.
Well, “should” was the key word there. Working with Developer Tools / Services, I am unable to get a stream cast to the new Chromecast Ultra 1517 device. I have several Dahua and Hikvision cameras around the house, but only four defined in Home Assistant. These devices do provide video streams through HA, so I’m frustrated at why HA doesn’t successfully direct the correct video stream to the Chromecast.
https://developers.google.com/cast/docs/media states these are the supported media types for the Chromecast Ultra device:
* H.264 High Profile up to level 4.1 (720p/60fps or 1080p/30fps)
* H.264 High Profile up to level 4.2 (1080p/60fps)
* H.264 High Profile up to level 5.2 (2160p/30fps max)
* VP8 (720p/60fps or 1080p/30fps)
* VP8 (2160p/30fps)
* HEVC / H.265 Main and Main10 Profiles up to level 5.1 (2160p/60fps)
* VP9 Profile 0 and Profile 2 up to level 5.1 (2160p/60fps)
If I use VLC 3.0.7 on my laptop, I can successfully cast RTSP video to the Chromecast Ultra. So I presume it’s either transcoding (doubtful) or re-wrapping (more likely?) the video.
Here’s the JSON passed to the Developer Tools / Services page:
service: camera.play_stream
{
  "entity_id": "camera.driveway",
  "media_player": "media_player.chromecastultra1517",
  "format": "hls"
}
and this is the camera definition from configuration.yaml:
camera:
  - platform: generic
    name: Driveway
    username: !secret
    password: !secret
    authentication: digest
    still_image_url: 'http://192.168.1.20:8899/image/Driveway'
    stream_source: 'http://192.168.1.20:8899/h264/Driveway/temp.m3u8'
The blue bar spins once or twice, then stops. No video is displayed, just the screen/cast icon. The lack of any UI, error messages, or diagnostic messages is rather disappointing.
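To squeeze at least some diagnostics out of it, it may be worth raising the log level for the pieces involved; a sketch for configuration.yaml, assuming these are the relevant component namespaces:

logger:
  default: warning
  logs:
    homeassistant.components.camera: debug
    homeassistant.components.stream: debug
    homeassistant.components.cast: debug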
The very last thing I want to do is use a command_line action to run ffmpeg in the background. It worked on the old RPi3 Raspbian system, so I presume it should also work on the current NUC i3 Debian system. So, after installing ffmpeg in the Debian host system, this is the output of ffprobe on one of the RTSP streams:
Input #0, rtsp, from 'rtsp://username:[email protected]/cam/realmonitor?channel=1&subtype=0':
  Metadata:
    title           : Media Server
  Duration: N/A, start: 0.040000, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuvj420p(pc, bt709, progressive), 1920x1080 [SAR 1:1 DAR 16:9], 15 fps, 25 tbr, 90k tbn, 30 tbc
    Stream #0:1: Audio: pcm_alaw, 16000 Hz, 1 channels, s16, 128 kb/s
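If it ever does come to the ffmpeg fallback, the idea would presumably be a shell_command that remuxes the RTSP feed into an HLS playlist without transcoding. A rough sketch only, with placeholder credentials, IP, and output path; audio is dropped with -an since the camera’s PCM audio doesn’t remux cleanly into HLS, and a long-running ffmpeg would need to be detached or run as its own service rather than from a blocking shell_command call:

shell_command:
  # copy the H.264 video into a rolling HLS playlist under /config/www
  driveway_to_hls: >-
    ffmpeg -rtsp_transport tcp
    -i 'rtsp://USER:PASS@CAMERA_IP/cam/realmonitor?channel=1&subtype=0'
    -c:v copy -an -f hls -hls_time 2 -hls_list_size 5
    -hls_flags delete_segments /config/www/driveway/stream.m3u8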
Discord user Skalavalas recommended the Docker package gihad/streamer and I set up three of the four cameras using this RTSP-to-HLS conversion tool. Still not working, even with a supposedly cast-ready stream.
Well, CATT was able to cast the m3u8 file created by streamer successfully to the Chromecast, but the video stream only lasted seconds, instead of refreshing and continuing. Success, even if somewhat limited.
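For completeness, the CATT side can also be wired into HA as a shell_command, roughly like this (the device name, host, port, and playlist path are placeholders):

shell_command:
  # cast the HLS playlist produced by streamer to the Chromecast Ultra
  cast_driveway: 'catt -d "ChromecastUltra1517" cast http://NAS_IP:8080/driveway/stream.m3u8'

An automation can then call it with service: shell_command.cast_driveway.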
Respectfully bumping for any and all assistance. I dropped the streamer angle and went directly to the Blue Iris web server for M3U8-compatible (HLS) streams. These work perfectly on the Home Assistant Overview page: I can click on the camera snapshot and view the live stream, and I can click on the stream header and get a larger version of the live stream.
So why doesn’t HA successfully pass this same stream information to the Chromecast?
/me scratches head and waits patiently…
I hadn’t even thought about casting from VLC, but will try that to confirm your results. I have four LaView (Hikvision rebranded) cameras that are hardwired to a NAS, and I am able to cast them to both my Sony and Vizio TVs without a problem. I have one wireless LaView camera that does not cast properly on either TV, but it displays just fine in VLC.
The wireless camera comes up but stutters every second or so: it appears on the screen, immediately buffers, plays for another second, and then goes back to buffering.
@ptdalen Thank you! If you or anyone else can tell me where in the HA source code this stream casting occurs, I may be able to provide more details and/or debugging, including Wireshark capture of the communications (if necessary.)
Later edit: On the drive home from work, I began thinking about Wireshark and realized I had not whitelisted the Chromecast Ultra’s IP address in my Blue Iris web server settings. I did so and restarted Blue Iris, but still no joy.
One last bump. If there are no responses, I’ll start a new thread. But since this was precisely about camera.play_stream, I thought it appropriate to post here.
Hello,
I’m getting the following error when trying to stream an Amcrest camera to a Roku TV:
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/websocket_api/commands.py", line 134, in handle_call_service
    connection.context(msg),
  File "/usr/src/homeassistant/homeassistant/core.py", line 1230, in async_call
    await asyncio.shield(self._execute_service(handler, service_call))
  File "/usr/src/homeassistant/homeassistant/core.py", line 1253, in _execute_service
    await handler.func(service_call)
  File "/usr/src/homeassistant/homeassistant/helpers/entity_component.py", line 198, in handle_service
    self._platforms.values(), func, call, required_features
  File "/usr/src/homeassistant/homeassistant/helpers/service.py", line 402, in entity_service_call
    future.result()  # pop exception if have
  File "/usr/src/homeassistant/homeassistant/helpers/entity.py", line 590, in async_request_call
    await coro
  File "/usr/src/homeassistant/homeassistant/helpers/service.py", line 433, in _handle_entity_call
    await result
  File "/usr/src/homeassistant/homeassistant/components/camera/__init__.py", line 676, in async_handle_play_stream_service
    DOMAIN_MP, SERVICE_PLAY_MEDIA, data, blocking=True, context=service_call.context
  File "/usr/src/homeassistant/homeassistant/core.py", line 1230, in async_call
    await asyncio.shield(self._execute_service(handler, service_call))
  File "/usr/src/homeassistant/homeassistant/core.py", line 1253, in _execute_service
    await handler.func(service_call)
  File "/usr/src/homeassistant/homeassistant/helpers/entity_component.py", line 198, in handle_service
    self._platforms.values(), func, call, required_features
  File "/usr/src/homeassistant/homeassistant/helpers/service.py", line 402, in entity_service_call
    future.result()  # pop exception if have
  File "/usr/src/homeassistant/homeassistant/helpers/entity.py", line 590, in async_request_call
    await coro
  File "/usr/src/homeassistant/homeassistant/helpers/service.py", line 433, in _handle_entity_call
    await result
  File "/usr/src/homeassistant/homeassistant/components/media_player/__init__.py", line 600, in async_play_media
    ft.partial(self.play_media, media_type, media_id, **kwargs)
  File "/usr/local/lib/python3.7/concurrent/futures/thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/src/homeassistant/homeassistant/components/media_player/__init__.py", line 595, in play_media
    raise NotImplementedError()
NotImplementedError
Any assistance will be highly appreciated.
Thanks
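In case it helps narrow things down: the NotImplementedError at the bottom of that trace means the target media_player platform doesn’t implement play_media, which is what camera.play_stream hands the stream to. A quick check is a template in Developer Tools; the entity ID is only an example, and 512 is my reading of the play_media bit in the current media_player feature flags:

{{ state_attr('media_player.roku_tv', 'supported_features') | int | bitwise_and(512) > 0 }}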
NAT Loopback? or configuration?
Hi,
Can you share a link to those cameras?
I bought a Xiaomi camera but it doesn’t work with HA.