Camera support for HomeKit component

I’m using Blue Iris as my NVR, and I wanted to connect to the Blue Iris server since the video streams start much faster than connecting to the individual cameras. After much trial and error, this config sort of worked for me:

# Cameras
stream:

ffmpeg:

camera:

  - platform: mjpeg
    mjpeg_url: http://192.168.0.133:81/mjpg/uppfart
    name: Uppfart
    username: !secret blueiris_username
    password: !secret blueiris_password
    authentication: basic

  - platform: mjpeg
    mjpeg_url: http://192.168.0.133:81/mjpg/altan
    name: Altan
    username: !secret blueiris_username
    password: !secret blueiris_password
    authentication: basic

This config generated camera entities and allowed me to configure cards in Lovelace with live views in home assistant. I could also get a live feed when pasting the mjpeg_url in a browser.

As soon as I added the entities in the homekit section, the cameras were added to the Home app, but I got the same error as @alexmuntean: the thumbnail in the Home app on my iPhone would update every 10 seconds, but the live stream didn’t work.

So I changed the platform to ffmpeg, or rather, added two new cameras using ffmpeg:

# Cameras
stream:

ffmpeg:

camera:

  - platform: mjpeg
    mjpeg_url: http://192.168.0.133:81/mjpg/uppfart
    name: Uppfart
    username: !secret blueiris_username
    password: !secret blueiris_password
    authentication: basic

  - platform: mjpeg
    mjpeg_url: http://192.168.0.133:81/mjpg/altan
    name: Altan
    username: !secret blueiris_username
    password: !secret blueiris_password
    authentication: basic

  - platform: ffmpeg
    name: uppfart_ffmpeg
    input: http://<uname>:<passwd>@192.168.0.133:81/mjpg/uppfart

  - platform: ffmpeg
    name: altan_ffmpeg
    input: http://<uname>:<passwd>@192.168.0.133:81/mjpg/altan
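A note on the `<uname>:<passwd>` placeholders above: if the password contains URL-special characters (@, :, /, #), it has to be percent-encoded before being embedded in the input URL, or ffmpeg will mis-parse it. A small sketch (the helper name is mine, just for illustration):

```python
from urllib.parse import quote

# Hypothetical helper: percent-encode the credentials before embedding
# them in an ffmpeg input URL. safe='' also encodes '/' and ':'.
def with_credentials(host_path: str, user: str, password: str) -> str:
    return f"http://{quote(user, safe='')}:{quote(password, safe='')}@{host_path}"

print(with_credentials("192.168.0.133:81/mjpg/uppfart", "admin", "p@ss:word"))
# -> http://admin:p%40ss%3Aword@192.168.0.133:81/mjpg/uppfart
```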

I now included camera.uppfart_ffmpeg and camera.altan_ffmpeg in the homekit section and kept the original definitions for the HA cards (the ffmpeg cameras didn’t give live views in HA):

# Homekit integration
homekit:
  auto_start: false
#  safe_mode: true

  filter:
    include_entities:
      - camera.uppfart_ffmpeg
      - camera.altan_ffmpeg

This almost works: I now get thumbnails every 10 seconds, and I can view the camera “camera.uppfart_ffmpeg”, but I can’t get a live feed for the camera named “camera.altan_ffmpeg”. The log gives me this:

Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/homekit/type_cameras.py", line 321, in stop_stream
    await getattr(stream, shutdown_method)()
  File "/usr/local/lib/python3.7/site-packages/haffmpeg/core.py", line 158, in close
    await self._loop.run_in_executor(None, _close)
  File "/usr/local/lib/python3.7/concurrent/futures/thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/local/lib/python3.7/site-packages/haffmpeg/core.py", line 153, in _close
    self._proc.stdin.write(b"q")
BrokenPipeError: [Errno 32] Broken pipe
2020-05-22 12:02:32 ERROR (SyncWorker_2) [homeassistant.components.homekit.type_cameras] [0a56bbd7-6ff2-41f4-8a8d-b60af9cca47d] Failed to kill stream.
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/homekit/type_cameras.py", line 321, in stop_stream
    await getattr(stream, shutdown_method)()
  File "/usr/local/lib/python3.7/site-packages/haffmpeg/core.py", line 170, in kill
    self._proc.kill()
AttributeError: 'NoneType' object has no attribute 'kill'

Any ideas what might go wrong?

@bdraco is right. I seem to remember from using the homebridge-camera-ffmpeg plugin that you might have to use 1280x720 for the Apple Watch.

Video door bell support? That is now my only remaining requirement for Homebridge.

The doorbell adds a button and makes it the primary service. There isn’t a standard (or at least one that is generally followed) for how doorbell buttons get implemented so it would be helpful to know which video doorbell you have or what you have built to trigger the button press.

At the moment I have logic in Node-RED that POSTs the ding=dong&dong=ding message for various reasons. Outside of that, the camera is set up exactly the same. I’m using Blue Iris.

Finally got it working and thought I might share my solution. I’m using Hikvision cameras (I abandoned the idea of using Blue Iris for now, since I couldn’t figure out a working stream source), but I found a great page to figure out your device-specific RTSP stream URL here.

First I had to turn off H.264+ in my cameras and stick with H.264 encoding, since H.264+ was messing up the decoding in the Home app and making the streams take quite a while to render in Lovelace. I found this setting in my camera under “Configuration (top menu)->Video and sound (side menu)”. The camera rebooted, and the video stream already displayed faster in Lovelace.

Second, I turned off authentication for RTSP streams for testing, under “Configuration (top menu)->System (side menu)->Security (sub side menu)->Verification”.

This is my config:

# Cameras
stream:

camera:

  - platform: generic
    still_image_url: "http://username:password@192.168.0.220/Streaming/Channels/1/picture"
    stream_source: "rtsp://192.168.0.220/HighResolutionVideo"
    name: Uppfart

  - platform: generic
    still_image_url: "http://username:password@192.168.0.218/Streaming/Channels/1/picture"
    stream_source: "rtsp://192.168.0.218/HighResolutionVideo"
    name: Altan

Once that was done I simply had to add the camera entities to the homekit config and restart HA, and I was done. I could now see the cameras in Lovelace and in the Home app, with the thumbnail updating correctly every 10 seconds:

# Homekit integration
homekit:
  auto_start: false
#  safe_mode: true

######################################
# Section dedicated to camera config #
######################################
  entity_config:
    camera.uppfart:
#      video_codec: copy
#      support_audio: False
      stream_source: "-rtsp_transport tcp -re -i rtsp://192.168.0.220:554/Streaming/Channels/102"
      # Set maximums for negotiating resolutions
      max_fps: 15
      max_width: 1280
      max_height: 1024
    camera.altan:
#      video_codec: copy
#      support_audio: False
      stream_source: "-rtsp_transport tcp -re -i rtsp://192.168.0.218:554/Streaming/Channels/102"
      # Set maximums for negotiating resolutions
      max_fps: 15
      max_width: 1280
      max_height: 1024
#############################
# End camera config section #
#############################

Edit: I had to update the stream_source to use the low-resolution stream, since the high-resolution stream was having issues with live streaming.
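For anyone else mapping Hikvision URLs: the path is typically /Streaming/Channels/&lt;channel&gt;&lt;stream&gt;, where stream 01 is the main (high-resolution) stream and 02 the sub stream, which is what the /102 above selects. A tiny helper (hypothetical, assuming this common scheme):

```python
# Hypothetical helper assuming the usual Hikvision path scheme:
# channel N plus stream 01 (main/high-res) or 02 (sub/low-res)
# combine into /Streaming/Channels/N01 or /Streaming/Channels/N02.
def hikvision_rtsp(ip: str, channel: int = 1, substream: bool = False) -> str:
    stream = 2 if substream else 1
    return f"rtsp://{ip}:554/Streaming/Channels/{channel}0{stream}"

print(hikvision_rtsp("192.168.0.220", substream=True))
# -> rtsp://192.168.0.220:554/Streaming/Channels/102
```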

I was also having issues getting the live feed to work even though I was getting the still image. I managed to resolve it by using the settings I had been using in Homebridge:

stream_source: "-re -f mjpeg -i http://<username>:<password>@<ip>:<port>/video.cgi"

I have an old D-Link camera that I added to Home Assistant using the mjpeg platform, and with that configuration the stream works with just this:

mjpeg_url: http://<username>:<password>@<ip>:<port>/video.cgi

To the people who get this to work:

  • Are you able to stream outside your LAN?
  • Are you using Home Assistant Cloud?

Background
I think I’m beginning to understand why some people have problems with the camera integration.
I tried everything, from the generic camera, to ffmpeg, to the uvc integration for my Unifi G3.

Since I’m not at home and won’t be for several days, I looked into the firewall to see if something looked funny. (In my mind, all traffic through HomeKit should be routed through Apple’s servers; that’s how HomeKit works without a VPN or reverse proxy.)

Anyhow, in the packet capture I saw that my iPhone was trying to connect to my Apple TV’s local IP address, but also to the public address of my hass server (which does not make sense, since it’s running behind a Cloudflare reverse proxy).
I haven’t looked into the code; I may do that when I have the time. But I guess it fetches the public IP address of hass and tries to connect through that. Why it does that is beyond me; shouldn’t it be connecting to Apple’s servers?
And why is it trying to connect to my Apple TV’s internal IP?

Is Apple treating streaming devices differently from other devices? (Since those go through Apple’s servers, from what I understand.)
My firewall (both where I am now and at home run pfSense) blocks the requests to my Apple TV and to the public IP of hass.

I’m guessing the snapshot works because it’s served through the camera proxy?

My setup:

  • Hass running on a Raspberry Pi, behind a reverse proxy and Cloudflare
  • Homekit using AppleTV
  • iPhone running iOS 13.5.

@bdraco I’m happy to help any of you with debugging; just let me know.

HomeKit uses Apple TVs as the hub, which might be why you are seeing it try to connect directly. I have a fairly complex set of VLANs and firewall rules, deny by default, but Home Assistant should have the proper access to the Apple TVs, camera streams, and end-user devices. The fact that the still images work on mine makes me believe it is something with how it is trying to encode/re-encode the stream that isn’t working properly.

Yeah, I’m aware that the Apple TV is used as a hub. But the request should not go directly from my iPhone to the Apple TV, nor directly to the public IP of hass, which isn’t really “public” since it’s behind Cloudflare.

The traffic should look like:
iPhone -> Apple’s servers -> Apple TV through NAT -> Apple TV (local IP) -> HomeKit-compatible device (like hass)

Not:
iPhone -> Apple TV (local IP)

No?

The reason snapshot works is due to this part of the code:

    def get_snapshot(self, image_size):
        """Return a jpeg of a snapshot from the camera."""
        return scale_jpeg_camera_image(
            asyncio.run_coroutine_threadsafe(
                self.hass.components.camera.async_get_image(self.entity_id),
                self.hass.loop,
            ).result(),
            image_size["image-width"],
            image_size["image-height"],
        )
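The key piece there is `asyncio.run_coroutine_threadsafe`: the snapshot is requested from a sync worker thread, so the coroutine is handed over to the event loop and the calling thread blocks on `.result()`. The same pattern in isolation (`fetch_image` is a stand-in for the real `camera.async_get_image` call):

```python
import asyncio
import threading

async def fetch_image() -> bytes:
    """Stand-in for camera.async_get_image(); returns fake JPEG bytes."""
    await asyncio.sleep(0)
    return b"\xff\xd8fake-jpeg\xff\xd9"

def snapshot_from_worker(loop: asyncio.AbstractEventLoop) -> bytes:
    # Same pattern as get_snapshot(): schedule the coroutine on the event
    # loop from a sync worker thread and block until the result arrives.
    return asyncio.run_coroutine_threadsafe(fetch_image(), loop).result()

loop = asyncio.new_event_loop()
threading.Thread(target=loop.run_forever, daemon=True).start()
snap = snapshot_from_worker(loop)
loop.call_soon_threadsafe(loop.stop)
print(snap)  # b'\xff\xd8fake-jpeg\xff\xd9'
```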

Edit: I just confirmed, by disabling the camera in homekit, that there is no direct traffic to either the public IP of hass or the internal IP of the Apple TV. Even when interacting with the devices in HomeKit, everything goes through Apple’s servers.

I don’t have the advanced setup that you do, but it works when I turn off WiFi and connect from the mobile network. But the solution still seems a bit wonky; I intermittently get “No response - wait until someone else in this home stops viewing this camera and try again” but the same stream works in VLC.

I’m thinking about holding off until the next update to continue testing since there is no reason it should work one minute and not the next, and then start working again if everything is static.

So you get a proper stream when using the mobile connection? I wonder if it’s because you’re using mjpeg. Are you able to test RTSP? I was not able to get mjpeg to work.

I’m using rtsp, my config is a few posts above. But I’m not a developer so I can’t really troubleshoot the code…

My streaming works now. The thumbnail does not update as often as every 10 seconds, but it at least refreshes every time I open the Home app. I did not change a thing for my setup to start working; it just did.

My iPhone still tries to connect to my public IP and to the internal IP of hass, with no luck. But the streaming still works.

If you’re using RTSP, try passing this as the stream_source:

stream_source: "-re -rtsp_transport tcp -i rtsp://XXX.."

Ok, I finally got the stream working for my Unifi G3 Micro camera. Yay!
I had to set up a firewall rule on my Unifi USG router to make my iPhone’s local IP fully accessible from the Raspberry Pi where HA is installed.
I have my IoT devices (together with the Raspberry Pi) on a separate network that is not allowed to access my main network. I only allow responses (ESTABLISHED and RELATED) from IOT to MAIN, but allow full access from MAIN to IOT.
So if my iPhone makes a request to HA, and HA makes a request to the camera, the responses should already be allowed back to the iPhone. But it didn’t work… I guess HA was opening a new connection back to the iPhone and the firewall was blocking it. When I gave HA full access to the iPhone, it started to work.
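That matches how the streaming leg works: the media goes out as a brand-new UDP flow from HA to the phone, on ports negotiated inside the encrypted HomeKit session, so a stateful firewall never sees it as a reply to anything. A minimal localhost sketch (plain UDP standing in for SRTP):

```python
import socket

# "Phone": binds a UDP port, the way the phone opens a port for SRTP media
# negotiated inside the encrypted HomeKit session.
phone = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
phone.bind(("127.0.0.1", 0))
phone.settimeout(5)

# "HA bridge": sends media over a brand-new UDP flow toward the phone.
# A stateful firewall sees this as a NEW connection IOT -> MAIN, not a
# reply to anything, so an ESTABLISHED/RELATED-only rule drops it.
bridge = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
bridge.sendto(b"rtp-packet", phone.getsockname())

data, _ = phone.recvfrom(1024)
print(data)  # b'rtp-packet'
```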

It also works remotely, so you don’t need to be on the local WiFi to see the stream.
But for this to work you need a firewall rule allowing HA to access the Apple TV (or other HomeKit hub).

Is there any other way to make this work without giving HA firewall access to my iPhone/hub?

Here is my config:

camera:
  - platform: uvc
    nvr: 192.168.X.X
    key: AAAAAAAABBBBBBCCC
    password: PASSWORD

homekit:
  entity_config:
    camera.child_room:
      stream_source: "-rtsp_transport tcp -re -i rtsp://192.168.X.X:7447/YYYYYYYYYYYYYYYY_0"
      video_codec: copy
      support_audio: True
      max_width: 1920
      max_height: 1080
      max_fps: 30

My experience so far: I’m using a Hikvision camera configured via RTSP. The video_codec is configured as “copy”. The live feed plays a few seconds, freezes a bit, plays a few seconds, and so on. The same RTSP URL played directly in HA or in VLC has no freezes.

Is that an HA issue? With the “copy” setting I was under the impression that HomeKit connects directly to the camera’s RTSP stream. Or is HA still doing some processing?

Already happy to have the stills though 🙂

I also have Hikvision and am using the “copy” setting. It cuts CPU usage a lot, but ffmpeg is still involved. Something like:

ffmpeg -i rtsp://<YOUR_RTSP_URL> -map 0:v:0 -an -c:v copy -tune zerolatency -pix_fmt yuv420p -r 30 -b:v 299k -bufsize 1196k -maxrate 299k -payload_type 99 -ssrc 6775157 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params kxLwd+YVFAu1nP+SxOIrvZD3NF0TwlFRk5MG6GDR srtp://<CLIENT_IP_AND_PORT>?rtcpport=57730&localrtcpport=57730&pkt_size=1316

Quite a bit going on there beyond “copy” that can affect things. 🙂 Settings like maxrate seem interesting, but I don’t see a corresponding option in HA. I was not able to get smooth streaming with my 1080p streams, but moving to my lower-resolution 480p sub-streams works fine. I just haven’t had time to see whether things could be tweaked further, but it seems doubtful given the settings I saw for the integration.
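One more data point from that generated command: b:v and maxrate are equal, and bufsize is exactly 4x maxrate (1196k = 4 * 299k), which suggests everything is derived from a single negotiated bitrate. As arithmetic (an assumption based on this one sample, not on the integration’s source):

```python
# Observation from the single logged command above (an assumption, not
# taken from the integration's source): b:v and maxrate equal the
# negotiated bitrate, and bufsize is exactly 4x that.
def derived_flags(bitrate_kbit: int) -> dict:
    return {
        "b:v": f"{bitrate_kbit}k",
        "maxrate": f"{bitrate_kbit}k",
        "bufsize": f"{bitrate_kbit * 4}k",
    }

print(derived_flags(299))
# -> {'b:v': '299k', 'maxrate': '299k', 'bufsize': '1196k'}
```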

I’ve just opened a separate feature request for Secure Video support over here.

There seems to be some progress towards understanding the underlying protocol.

Hello everyone!

I have a camera entity receiving picture from MQTT.
It’s map of vacuum (Roborock on Valetudo firmware).


I’m not sure exactly how it works, but I think HA receives a static PNG roughly every 10 seconds.

When I tried to configure it in the homekit integration, it didn’t work: the cameras showed up, but with no picture on the iOS device, and there were errors like “This camera has no stream” in the HA logs.

Maybe someone has already had this problem?
Is there any way to integrate this camera entity with homekit?

The easiest way to stream any camera through HomeKit is by enabling MotionEye; under “Video Streaming” you’ll see “Useful URLs”. Copy the streaming URL and paste it like this:

entity_config:
    camera.cam:
      stream_source: "http://192.168.1.10:9090"