Camera support for HomeKit component

To the people who got this to work:

  • Are you able to stream outside your LAN?
  • Are you using Home Assistant Cloud?

I think I’m starting to understand why some people may have problems with the camera integration.
I tried everything, from the generic camera, to ffmpeg, to the uvc integration for my Unifi G3.

Since I’m not at home and won’t be for several days, I looked into the firewall to see if anything looked funny. (In my mind, all traffic through HomeKit should be routed through Apple’s servers; that’s how HomeKit works without a VPN or reverse proxy.)

Anyhow, in the packet capture I saw that my iPhone was trying to connect to my Apple TV’s local IP address, but also to the public address of my hass server (which doesn’t make sense, since it’s running behind a Cloudflare reverse proxy).
I haven’t looked into the code; I may do that when I have the time. But I guess it’s fetching the public IP address of hass and trying to connect through that. Why it does that is beyond me; shouldn’t it be connecting to Apple’s servers?
And why is it trying to connect to my Apple TV’s internal IP?

Is Apple treating streaming devices differently from other devices? (Since those go through Apple’s servers, from what I understand.)
My firewall (both where I am now and at home run pfSense) blocks the requests to my Apple TV and to the public IP of hass.

I’m guessing the snapshot works because it’s taken from the camera proxy?

My setup:

  • Hass running on a Raspberry Pi behind a Cloudflare reverse proxy
  • HomeKit using an Apple TV as the hub
  • iPhone running iOS 13.5

@bdraco I’m happy to help any of you with debugging, just let me know.

HomeKit uses Apple TVs as the hub, which might be why you are seeing it try to connect directly. I have a fairly complex set of VLANs and firewall rules (deny by default), but Home Assistant should have the proper access to the Apple TVs, camera streams, and end-user devices. The fact that still images work on mine makes me believe it is something with how the stream URL is being encoded / re-encoded.

Yeah, I’m aware that the Apple TV is used as a hub. But the request should not go directly from my iPhone to the Apple TV, nor directly to the public IP of hass, which is not really “public” since it’s behind Cloudflare.

The traffic should look like:
iPhone -> Apple’s servers -> Apple TV through NAT -> Apple TV (local IP) -> HomeKit-compatible device (like hass)

But what I’m actually seeing is:
iPhone -> Apple TV (local IP)


The reason snapshot works is due to this part of the code:

    def get_snapshot(self, image_size):
        """Return a jpeg of a snapshot from the camera."""
        return scale_jpeg_camera_image(
            ...
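That helper scales the latest camera still down to the size HomeKit requests, which works without any stream at all. As a rough illustration of the resizing arithmetic only (a hypothetical helper, not the integration’s actual code):

```python
def scaled_size(width: int, height: int, max_width: int) -> tuple:
    """Return a (width, height) capped at max_width, preserving aspect
    ratio. Hypothetical sketch of the kind of downscaling a snapshot
    helper performs before re-encoding the JPEG."""
    if width <= max_width:
        return (width, height)
    return (max_width, round(height * max_width / width))

# A 1080p still requested at 640 wide comes back as 640x360:
print(scaled_size(1920, 1080, 640))
```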

Edit: I just confirmed that there is no direct traffic to either the public IP of hass or the internal IP of the Apple TV, by disabling the camera in HomeKit. Even when interacting with the devices in HomeKit, everything goes through Apple’s servers.

I don’t have the advanced setup that you do, but it works when I turn off Wi-Fi and connect from the mobile network. The solution still seems a bit wonky, though; I intermittently get “No response - wait until someone else in this home stops viewing this camera and try again”, but the same stream works in VLC.

I’m thinking about holding off on further testing until the next update, since there is no reason it should work one minute, fail the next, and then start working again when everything is static.

So you have a proper stream when using a mobile connection? I wonder if it’s because you’re using MJPEG. Are you able to test RTSP? I was not able to get MJPEG to work.

I’m using RTSP; my config is a few posts above. But I’m not a developer, so I can’t really troubleshoot the code…

My streaming works now. The thumbnail does not update as often as every 10 s, but at least every time I open the Home app. I did not change a thing for my setup to work; it just did.

My iPhone still tries to connect to my public IP and to the internal IP of hass, with no luck. But the streaming works anyway.

If you’re using RTSP, try passing this into stream_source:

stream_source: "-re -rtsp_transport tcp -i rtsp://XXX.."
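Those extra tokens in front of the URL are plain ffmpeg input options; the option string gets split shell-style before being handed to ffmpeg. A minimal sketch of that tokenization (the function name is made up for illustration):

```python
import shlex

def parse_stream_source(stream_source: str) -> list:
    """Tokenize a stream_source option string into ffmpeg argument
    tokens, the way a shell would. Hypothetical helper for illustration;
    not the integration's actual code."""
    return shlex.split(stream_source)

# "-re" paces reading at native frame rate; "-rtsp_transport tcp" forces
# RTSP over TCP, which often fixes choppy UDP streams.
args = parse_stream_source('-re -rtsp_transport tcp -i rtsp://192.168.1.10:554/live')
print(args)
```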

Ok, I finally got the stream working for my Unifi G3 Micro camera. Yay!
I had to set up a firewall rule on my Unifi USG router to make the iPhone’s local IP fully accessible from the Raspberry Pi where HA is installed.
I have IoT devices (together with the Raspberry Pi) on a different network, with access to my main network disallowed. I only allow responses (ESTABLISHED and RELATED) from IOT to MAIN, but allow full access from MAIN to IOT.
So if my iPhone makes a request to HA, and HA makes a request to the camera, the reply path should already be allowed. But it didn’t work… I guess HA was opening a new connection back to the iPhone and the firewall was blocking it. When I gave HA full access to the iPhone, it started to work.

It also works remotely, so you don’t need to be on the local Wi-Fi to see the stream.
But for this to work you need a firewall rule allowing HA to access the Apple TV or other HomeKit hub.

Is there any other way to make this work without giving HA firewall access to my iPhone/hub?

Here is my config:

    camera:
      - platform: uvc
        nvr: 192.168.X.X
        password: PASSWORD

    homekit:
      entity_config:
        camera.XXXX:
          stream_source: "-rtsp_transport tcp -re -i rtsp://192.168.X.X:7447/YYYYYYYYYYYYYYYY_0"
          video_codec: copy
          support_audio: True
          max_width: 1920
          max_height: 1080
          max_fps: 30

My experience so far: I’m using a Hikvision camera configured as RTSP, with video_codec set to “copy”. The live feed plays a few seconds, freezes a bit, plays a few seconds, … The same RTSP URL directly in HA or in VLC plays without any freezes.

Is that an HA issue? With the “copy” setting I was under the impression that HomeKit connects directly to the camera’s RTSP stream. Or is HA still doing some processing?

Already happy to have the stills though :slight_smile:

I also have a Hikvision and am using the “copy” setting. It drops CPU a lot, but ffmpeg is still involved, with a command like:

    ffmpeg -i rtsp://<YOUR_RTSP_URL> -map 0:v:0 -an -c:v copy -tune zerolatency \
      -pix_fmt yuv420p -r 30 -b:v 299k -bufsize 1196k -maxrate 299k \
      -payload_type 99 -ssrc 6775157 -f rtp \
      -srtp_out_suite AES_CM_128_HMAC_SHA1_80 \
      -srtp_out_params kxLwd+YVFAu1nP+SxOIrvZD3NF0TwlFRk5MG6GDR \
      "srtp://<CLIENT_IP_AND_PORT>?rtcpport=57730&localrtcpport=57730&pkt_size=1316"

Quite a bit going on there beyond “copy” that can affect things. :slight_smile: Settings like maxrate seem interesting, but I don’t see a corresponding option in HA. I was not able to get smooth streaming with my 1080p streams, but moving to my lower-res 480p sub-streams works fine. I just haven’t had time to see if things could be tweaked further, but it seemed doubtful based on the settings I saw for the integration.
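To make the point concrete, here is a rough sketch (not the integration’s actual code, function and parameter names are invented) of how a command like the one above gets assembled: even with video_codec set to "copy", the stream is still remuxed into SRTP/RTP with rate-control options attached, so "copy" is not a direct camera-to-HomeKit connection.

```python
def build_stream_cmd(rtsp_url, client_addr, srtp_key,
                     video_codec="copy", fps=30, bitrate_kbps=299):
    """Assemble an ffmpeg argv list resembling the captured command above.
    Illustrative sketch only; the real integration computes these values
    from the HomeKit stream negotiation."""
    bufsize_kbps = bitrate_kbps * 4  # matches 299k -> 1196k in the capture
    return [
        "ffmpeg", "-i", rtsp_url,
        "-map", "0:v:0", "-an",          # first video stream, audio dropped
        "-c:v", video_codec,             # "copy" skips re-encoding only
        "-r", str(fps),
        "-b:v", f"{bitrate_kbps}k",
        "-bufsize", f"{bufsize_kbps}k",
        "-maxrate", f"{bitrate_kbps}k",
        "-payload_type", "99",
        "-f", "rtp",
        "-srtp_out_suite", "AES_CM_128_HMAC_SHA1_80",
        "-srtp_out_params", srtp_key,
        f"srtp://{client_addr}?rtcpport=57730&localrtcpport=57730&pkt_size=1316",
    ]

cmd = build_stream_cmd("rtsp://192.168.1.20/stream", "192.168.1.5:57730", "KEY")
print(" ".join(cmd))
```

Note that maxrate and bufsize are present even in copy mode, which could explain stuttering at 1080p if the negotiated bitrate is far below what the camera actually sends.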

I’ve just opened a separate feature request for Secure Video support over here.

There seems to be some progress towards understanding the underlying protocol.

Hello everyone!

I have a camera entity receiving its picture from MQTT.
It’s the map of a vacuum (Roborock on Valetudo firmware).

I’m not sure about the details of how it works, but I think HA receives static PNGs about every 10 seconds.

When I tried to configure it in the homekit integration, it didn’t work: the cameras showed up, but without a picture on the iOS device, and there were errors like “This camera has no stream” in the HA logs.

Has anyone already run into this problem?
Is there any way to integrate this camera entity into homekit?
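The “This camera has no stream” error suggests the entity only provides stills. A hedged sketch of the kind of check involved: camera entities advertise capabilities through a supported_features bitmask, and an MQTT/still-image camera typically does not set the stream bit. The SUPPORT_STREAM value below is an assumption based on the camera component at the time of writing.

```python
# Assumed constant; in Home Assistant this lives in the camera component.
SUPPORT_STREAM = 2

def camera_has_stream(supported_features: int) -> bool:
    """True if the camera advertises native stream support via its
    supported_features bitmask. Illustrative sketch, not HA's code."""
    return bool(supported_features & SUPPORT_STREAM)

print(camera_has_stream(0))  # a stills-only MQTT camera -> False
print(camera_has_stream(SUPPORT_STREAM))  # an RTSP camera -> True
```

If the entity reports no stream support, HomeKit can show the thumbnail but has nothing to play for the live view.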

The easiest way to stream any camera through HomeKit is by enabling motionEye; under “Video Streaming” you’ll see “Useful URLs”. Copy the streaming URL and paste it like this:

      stream_source: ""

Thank you for your answer!
I’ve installed the official MotionEye add-on, but I can’t add my camera to it.
I think my format is not supported: I can get the camera image from MQTT, or a PNG from a URL.
When I try to add the URL to MotionEye, it reports “not supported” :frowning:

Hi, I added ffmpeg to my configuration and I use the generic platform for the camera. I didn’t add anything to homekit:, and it seems to work.

Sorry team, I got a bit lost xD

Are we now able to see our HA cameras in HomeKit? Is it a custom component?

Thank you in advance!

Just to help out with Blue Iris, I thought I’d share my working setup. I have cameras of differing brands managed by Blue Iris, and I have them fed through Home Assistant for HomeKit connectivity. There are a couple of things that are important:

Given that I require my Blue Iris setup to use secure connections and a login page, one thing I’ve done, which you might decide to do (or not), is within Blue Iris, under “Settings -> Web Server -> Advanced”, add my Home Assistant IP as an exclusion by typing it into the box (as an example):


My Home Assistant config is below. It is a bit more involved because I am also using the Blue Iris triggers to set an MQTT flag, which Home Assistant in turn uses to pop up the video display (this was introduced in a recent Home Assistant version).

In the config below, “cam0”, “cam1”, etc. are the Blue Iris camera short names, and where an IP address appears it is a sample Blue Iris PC address.
The motion triggers are set up when the Blue Iris Alert occurs. That’s a bit more involved, so I won’t get into it here; it can be googled.
The Blue Iris m3u8 OR ts streams are the URLs to use. Try each to see which works best for you.

With this setup everything that I’ve tried works. The preview, the live view, audio, alert pop-ups, etc.

############### MQTT
mqtt:
  broker: MQTT

binary_sensor:

  - platform: mqtt
    name: "cam0_Motion"
    state_topic: blue_iris/binary_sensor/cam0_motion/state
    payload_on: "ON"
    payload_off: "OFF"
    device_class: motion

  - platform: mqtt
    name: "cam1_Motion"
    state_topic: blue_iris/binary_sensor/cam1_motion/state
    payload_on: "ON"
    payload_off: "OFF"
    device_class: motion

  - platform: mqtt
    name: "cam2_Motion"
    state_topic: blue_iris/binary_sensor/cam2_motion/state
    payload_on: "ON"
    payload_off: "OFF"
    device_class: motion


############### Blue Iris Cameras
camera:
  - platform: ffmpeg
    name: cam0
    input: http://192.168.X.X:81/h264/cam0/temp.m3u8  # m3u8 or ts stream URL
  - platform: ffmpeg
    name: cam1
    input: http://192.168.X.X:81/h264/cam1/temp.m3u8
  - platform: ffmpeg
    name: cam2
    input: http://192.168.X.X:81/h264/cam2/temp.m3u8

############### Forwarding to Apple Homekit
homekit:
  filter:
    include_entities:
      - binary_sensor.cam0_Motion
      - binary_sensor.cam1_Motion
      - binary_sensor.cam2_Motion
      - camera.cam0
      - camera.cam1
      - camera.cam2
  entity_config:
    binary_sensor.cam0_Motion:
      name: cam0 Motion
    binary_sensor.cam1_Motion:
      name: cam1 Motion
    binary_sensor.cam2_Motion:
      name: cam2 Motion
    camera.cam0:
      name: cam0
      support_audio: True
      linked_motion_sensor: binary_sensor.cam0_Motion
    camera.cam1:
      name: cam1
      support_audio: True
      linked_motion_sensor: binary_sensor.cam1_Motion
    camera.cam2:
      name: cam2
      support_audio: True
      linked_motion_sensor: binary_sensor.cam2_Motion

PS. here is a good link to some added info:

  • /h264/{cam-short-name}/temp.h264: pulls a raw H.264 stream (MIME type video/H264). This stream will play in a tool like VLC, and may be used in future versions of the ActiveX control.
  • /h264/{cam-short-name}/temp.ts: pulls an MPEG-2 transport stream (MIME type video/MP2T).
  • /h264/{cam-short-name}/temp.m or .m3u8: pulls a virtual M3U8 file (MIME type application/…). This will play in QuickTime, and on iPad and iPhone using the iPhone Live Streaming format.
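The URL patterns above can be turned into camera inputs mechanically. A small helper, hypothetical but mirroring those path patterns (host and port are placeholders; adjust to your Blue Iris web server settings):

```python
def blue_iris_stream_url(host: str, cam_short_name: str, kind: str = "m3u8") -> str:
    """Build a Blue Iris stream URL for a camera short name.
    kind is one of 'h264', 'ts', or 'm3u8', matching the paths listed above."""
    suffix = {"h264": "temp.h264", "ts": "temp.ts", "m3u8": "temp.m3u8"}[kind]
    return f"http://{host}/h264/{cam_short_name}/{suffix}"

print(blue_iris_stream_url("192.168.X.X:81", "cam0"))
print(blue_iris_stream_url("192.168.X.X:81", "cam0", "ts"))
```

Try the m3u8 and ts variants against the same camera to see which plays more smoothly, as suggested above.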

Closing this as the feature has been implemented! :tada: