RTSP stream support for camera

When we talk about camera RTSP support in hass, it never means real-time video but just a few pictures per minute, no? Because that is what I get when using platform: ffmpeg.

It is video if you click on your camera in HASS.

Hi, thanks for the direction. But I still cannot see a screenshot when the doorbell rings. What should the code look like in an automation script?
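
Something along these lines is what I am after (just a rough sketch, not tested; binary_sensor.doorbell, camera.front_door and notify.notify are placeholders, and it assumes a camera.snapshot service and a notify platform that can attach images):

automation:
  - alias: Doorbell snapshot notification
    trigger:
      - platform: state
        entity_id: binary_sensor.doorbell      # placeholder doorbell sensor
        to: 'on'
    action:
      # save the current camera picture into the www folder
      - service: camera.snapshot
        data:
          entity_id: camera.front_door         # placeholder camera entity
          filename: /config/www/doorbell.jpg
      # send a notification with that picture attached
      - service: notify.notify                 # whichever notify target you use
        data:
          message: Someone is at the door
          data:
            attachment: /config/www/doorbell.jpg   # the attachment key depends on the notify platform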

For me it is not a video. I have h264 video in an mp4 container delivered as an RTSP stream, and I only get still images when using ffmpeg. So far there is no way to play such a stream in HTML5 unless you use some JavaScript libraries, and that is likely not implemented in hass. So if this combination of codecs works for you, I would be really surprised.

It is good to be surprised.
Technically it is not a video stream, but you can watch your camera in real time and it is fit for purpose.
If you want real video there are other ways to implement it, but why bother?

I get about 2 pictures per minute with a 40 second delay. So not at all a video ;-). But I can cross-check with a high-performance PC, maybe it is just my RPi being slow. I actually started reading about HTML5 and RTSP to make real-time video possible in hass. One way at the moment is to use ffmpeg to change the container on the fly to an m3u8 (HLS) stream (no re-encoding, so even an RPi can do it) and use video.js to include the stream in an HTML page, then use the iframe feature of hass to link to that www/ HTML page. It even works with https.

The best way would be to fake the moov atom of the h264 stream on the fly to an infinite duration with ffmpeg, so it serves a fake file; then every HTML5-capable browser could show it out of the box, no JavaScript needed. Unfortunately ffmpeg does not have this feature and one would have to fork and patch it. Did somebody come up with a better way?
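
Roughly what I mean, as an untested sketch (camera URL, paths and host are placeholders): an ffmpeg process started outside of hass remuxes the RTSP stream into HLS segments under the www/ folder without re-encoding, a small www/cam.html page plays the .m3u8 with video.js, and hass just embeds that page via an iframe panel:

# The remux itself (no re-encoding) would run outside of hass, for example:
#   ffmpeg -rtsp_transport tcp -i rtsp://<camera-ip>/unicast -c copy -f hls \
#     -hls_time 2 -hls_list_size 5 -hls_flags delete_segments www/stream/cam.m3u8
# www/cam.html then plays /local/stream/cam.m3u8 with video.js.
panel_iframe:
  camera_live:
    title: Camera (HLS)
    url: https://your-hass-host:8123/local/cam.html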

Sorry, but it seems to me that you are complicating things a bit.
Even when you do not open the camera, the pictures update approximately every 10 seconds in the UI.
I have h264 video delivered as an RTSP stream like you, and I don't have any of the delays you describe. It's really a real-time picture on my RPi3; previously I ran the same configuration on a Windows system with a not very powerful CPU and the quality was the same.
I would suggest first checking for other issues you might have with your configuration, local network and ffmpeg itself.

Hmm, weird. Do you also have a full HD stream or only 720p?

Actually something is really wrong with the camera component, likely due to async. This is what I get when I try to open the video:

17-01-13 19:01:25 aiohttp.server: Error handling request
BrokenPipeError: [Errno 32] Broken pipe

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/pi/.homeassistant/deps/aiohttp/web_server.py", line 61, in handle_request
    resp = yield from self._handler(request)
  File "/home/pi/.homeassistant/deps/aiohttp/web.py", line 249, in _handle
    resp = yield from handler(request)
  File "/usr/local/lib/python3.5/asyncio/coroutines.py", line 209, in coro
    res = yield from res
  File "/usr/local/lib/python3.5/asyncio/coroutines.py", line 209, in coro
    res = yield from res
  File "/home/pi/git/home-assistant-fork/home-assistant/homeassistant/components/http/__init__.py", line 427, in handle
    result = yield from result
  File "/home/pi/git/home-assistant-fork/home-assistant/homeassistant/components/camera/__init__.py", line 184, in get
    response = yield from self.handle(request, camera)
  File "/home/pi/git/home-assistant-fork/home-assistant/homeassistant/components/camera/__init__.py", line 219, in handle
    yield from camera.handle_async_mjpeg_stream(request)
  File "/home/pi/git/home-assistant-fork/home-assistant/homeassistant/components/camera/ffmpeg.py", line 89, in handle_async_mjpeg_stream
    yield from response.write_eof()
  File "/home/pi/.homeassistant/deps/aiohttp/web_reqrep.py", line 897, in write_eof
    yield from self._resp_impl.write_eof()
  File "/usr/local/lib/python3.5/asyncio/streams.py", line 323, in drain
    raise exc
aiohttp.errors.ClientDisconnectedError

Opened bug report.

I have a square Xiaomi camera (aka Xiaofang) running this hack and using the ffmpeg component as indicated above.

Here is my configuration:

camera:
  - platform: ffmpeg
    name: Xiaofang
    input: -rtsp_transport tcp -i rtsp://192.168.1.105/unicast

But for me it is impossible to get a live stream from the camera. When I click on the camera entity in the frontend I cannot even see the last capture. From your comments I gather that this is not the expected behavior, isn't it?

Any idea? Thanks in advance


I also do not see anything when I click on the video. The camera component seems to be quite buggy.

Same issue here with live streaming. The camera (Xiaomi Ants) does show in HA, but when clicking on the video there is no streaming. Also I still cannot get a screenshot via notification when the doorbell rings. Any help is appreciated.

I have the same camera as you and am experiencing the same issue. Snapshots on the frontend are working, but when I click on the feed there is no stream (broken image icon).

Fang Hacks

Edit: broken image icon gone now, but still no stream.

I see no trouble on my test system. Is it possible that the camera only supports one stream? It needs to support multiple connections to work with hass.

You can send me your debug log on Gitter and I can try to help.

@DavidLP we have updated aiohttp and all new exceptions are handled in 0.36. I have a lot of cameras and they work without any issue. But hass copies all streams through the event loop, and that can cause some small lags when processing other stuff in the loop. If we change to multiprocessing with async this will be fixed, since every component will get its own process and loop.

camera:
  - platform: ffmpeg
    name: Xiaofang1
    input: -rtsp_transport tcp -i rtsp://192.168.1.225/unicast

This is the config I am using. Not sure about multiple streams but the HA frontend updates the snapshot while I view the stream through an android app. When I click on the camera image in the frontend, it brings up the window for the camera but does not produce the feed. Sometimes it brings up a static image of what I think is the first snapshot it takes upon restart and just keeps that one in memory even if the frontend snapshot has updated.

To shine some light on the live video feature: I guess right now it is bug free, but it will not work on a small form factor computer, because for the live stream ffmpeg transcodes the RTSP stream on the fly to MJPEG. Look here:

That uses 20 % CPU on a high-end machine with 8 cores @ 4.4 GHz ;-). No way that this works on an RPi, unless the original video stream is already MJPEG and no transcoding is needed…
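
If the camera cannot deliver MJPEG directly, the transcoding load can at least be reduced by scaling down and dropping the frame rate via extra_arguments (a guess at suitable flags, not tested on an RPi):

camera:
  - platform: ffmpeg
    name: Xiaofang
    input: -rtsp_transport tcp -i rtsp://192.168.1.105/unicast
    # output options for the on-the-fly MJPEG transcode: scale down to 640 px wide, limit to 5 fps
    extra_arguments: -vf scale=640:-1 -r 5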

I mentioned in a previous post how one could get a real live video feature working in hass without transcoding.

Now I'm experiencing the same bug as you described before. It takes time to get a picture in the UI and there is no live video now, just 1 frame every 30-40 seconds.
I've been using ffmpeg for several months on a Windows device and a few weeks on a Raspberry Pi 3 and didn't have this problem.
The only thing I have done is install the Bluetooth LE tracker today, which seems to have significantly affected the performance of the RPi3.

You have to change the resolution of the camera in order for the live view to work.

Managed to get it to work. You have to add extra_arguments: -pred 1 for the live feed to work.
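
For reference, the full camera entry then looks something like this (adapted from the config posted above):

camera:
  - platform: ffmpeg
    name: Xiaofang1
    input: -rtsp_transport tcp -i rtsp://192.168.1.225/unicast
    extra_arguments: -pred 1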

Two questions, though:

  • Live feed is very choppy and slow (it takes about 10 sec to update the image and can miss a person passing by entirely); is this normal or do I have something badly configured?
  • While this script is configured, I can't access the camera via the Mi Home app (my cam shows as offline in the app); is that normal or is something else going on?

In my case I had to disable the BLE tracker to make it work as smoothly as before.

The Fang hack disables the Mi Home functionality.