Camera support for HomeKit component

Can’t get it working… now I get this in the logs
(I did the HomeKit reset accessory steps):

Source: util/thread.py:20
First occurred: 22:38:00 (5 occurrences)
Last logged: 22:39:53

Timeout while waiting of FFmpeg

@alexmuntean Check if you have FFmpeg installed.
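
To spell that out: besides the FFmpeg binary itself (which Hass.io ships), the ffmpeg integration has to be enabled in configuration.yaml. A minimal sketch — stream: is included only on the assumption that you also want live previews in Lovelace via the generic camera:

```yaml
# Minimal sketch — assumes the FFmpeg binary is already installed on the host
ffmpeg:
stream:
```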

For my problem, I have everything installed and tried with stream_source, but I still get the same logs, with no error and no stream.
I’ve run out of ideas…

I activated it with ffmpeg: in configuration.yaml because I’m using Hass.io. Same error. I’m still trying random things; probably I’ll get it working somehow :slight_smile:

I also tried activating stream: and ffmpeg: and I can’t get it working.
I had the camera set up as a generic camera and it still didn’t work.
Now I’ve changed to the Hass.io ONVIF integration and it still doesn’t work.
I’m running out of ideas :slight_smile:
I also tried the extra TCP params and transport options for ffmpeg, but then I got a different error (“can’t kill process”), so I removed the extra params.

worked for me with

stream_source: "-rtsp_transport tcp -i rtsp://user:password@cam-ip/live"
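
For context (my reading of where that line goes): stream_source here is the per-entity override under the HomeKit integration’s entity_config, and everything before -i is passed to FFmpeg as an input option (-rtsp_transport tcp forces RTSP over TCP). A sketch, with camera.front_door as a placeholder entity id:

```yaml
homekit:
  entity_config:
    camera.front_door:  # placeholder entity id
      # Everything before -i is an FFmpeg input option;
      # -rtsp_transport tcp forces RTSP over TCP
      stream_source: "-rtsp_transport tcp -i rtsp://user:password@cam-ip/live"
```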

Great integration! Set up a cheap Kingcam PoE cam via RTSP and it works like a charm (though the Apple Watch doesn’t play live streams; screenshots work).

My question is - is it possible to combine both a motion sensor and a camera together (via some sort of template) to enable iOS motion notifications with camera screenshots?

It is not possible to combine motion sensors with the camera. Apple removed support for this when HomeKit Secure Video shipped. Apple does not release the spec for HomeKit Secure Video, so it is not supported for open source projects. There has been some work to figure out how it all works, but support is unlikely, as there isn’t viable support in ffmpeg for the partial video captures it requires.

1 Like

Is something like the way the ffmpeg plugin in Homebridge does it possible? It sets up a dummy motion sensor that becomes part of the same device in HomeKit; we can then toggle it via a Shortcut that follows a Home Assistant entity, or ingest it via HomeKit Controller.

Probably doable with a similar workaround.

I have an issue with the audio. The video stream is working fine in HK. My config:

  entity_config:
    camera.baby:
      video_codec: copy
      audio_codec: copy
      support_audio: True

I get an “MPEG AAC Audio (mp4a)” stream from the camera. Is this an issue for HK? Can I change the audio codec above to e.g. mp4a?

Here is a branch that implements adding a linked_motion_sensor, and a PR: https://github.com/home-assistant/core/pull/35994

I have too many homekit PRs that are waiting for review right now to move it forward though.
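
For anyone reading this later: once that lands, linking a motion sensor would presumably be configured something like this (a sketch based on the option name in that PR; the entity ids are placeholders):

```yaml
homekit:
  entity_config:
    camera.front_door:  # placeholder entity id
      # Exposes the binary sensor as a motion sensor on the same
      # HomeKit accessory, enabling motion notifications in the Home app
      linked_motion_sensor: binary_sensor.front_door_motion  # placeholder
```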

1 Like

@bdraco What could be the reason for the HomeKit cameras working perfectly on iPhone, iPad, and Mac, but not on the watch? The stills are regularly updated on the watch, but whenever I want to see the livestream it does not work: it tries to connect and then shows “Not available”.

I haven’t had a chance to investigate why they don’t work on the watch. Right now there is a backlog of HomeKit PRs waiting to merge so it will likely take a while before it comes to the top of the queue.

I suspect we aren’t advertising a stream resolution or other setting that is compatible with the watch. We may also need to implement multiple stream support before it’s possible, which requires a bit of refactoring of the upstream library, which is also backed up on PRs.

1 Like

@bdraco Thanks bdraco! …and please keep up the great work!

I’m using Blue Iris as my NVR, and I wanted to connect to the Blue Iris server since the video streams start much faster than connecting to the individual cameras. After much trial and error, this config sort of worked for me:

# Cameras
stream:

ffmpeg:

camera:

  - platform: mjpeg
    mjpeg_url: http://192.168.0.133:81/mjpg/uppfart
    name: Uppfart
    username: !secret blueiris_username
    password: !secret blueiris_password
    authentication: basic

  - platform: mjpeg
    mjpeg_url: http://192.168.0.133:81/mjpg/altan
    name: Altan
    username: !secret blueiris_username
    password: !secret blueiris_password
    authentication: basic

This config generated camera entities and allowed me to configure cards in Lovelace with live views in Home Assistant. I could also get a live feed when pasting the mjpeg_url into a browser.

As soon as I added the entities in the homekit section, the cameras were added to the Home app, but I got the same error as @alexmuntean, where the thumbnail in the Home app on my iPhone would update every 10 seconds but the live stream didn’t work.

So I changed the platform to ffmpeg, or rather, added 2 new cameras using ffmpeg:

# Cameras
stream:

ffmpeg:

camera:

  - platform: mjpeg
    mjpeg_url: http://192.168.0.133:81/mjpg/uppfart
    name: Uppfart
    username: !secret blueiris_username
    password: !secret blueiris_password
    authentication: basic

  - platform: mjpeg
    mjpeg_url: http://192.168.0.133:81/mjpg/altan
    name: Altan
    username: !secret blueiris_username
    password: !secret blueiris_password
    authentication: basic

  - platform: ffmpeg
    name: uppfart_ffmpeg
    input: http://<uname>:<passwd>@192.168.0.133:81/mjpg/uppfart

  - platform: ffmpeg
    name: altan_ffmpeg
    input: http://<uname>:<passwd>@192.168.0.133:81/mjpg/altan

I now included camera.uppfart_ffmpeg and camera.altan_ffmpeg in the homekit section and kept the original definitions for the HA cards (the ffmpeg cameras didn’t give live views in HA):

# Homekit integration
homekit:
  auto_start: false
#  safe_mode: true

  filter:
    include_entities:
      - camera.uppfart_ffmpeg
      - camera.altan_ffmpeg

This almost works: I now get thumbnails every 10 seconds, and I can view camera.uppfart_ffmpeg, but I can’t get a live feed for camera.altan_ffmpeg. The log file gives me this:

Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/homekit/type_cameras.py", line 321, in stop_stream
    await getattr(stream, shutdown_method)()
  File "/usr/local/lib/python3.7/site-packages/haffmpeg/core.py", line 158, in close
    await self._loop.run_in_executor(None, _close)
  File "/usr/local/lib/python3.7/concurrent/futures/thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/local/lib/python3.7/site-packages/haffmpeg/core.py", line 153, in _close
    self._proc.stdin.write(b"q")
BrokenPipeError: [Errno 32] Broken pipe
2020-05-22 12:02:32 ERROR (SyncWorker_2) [homeassistant.components.homekit.type_cameras] [0a56bbd7-6ff2-41f4-8a8d-b60af9cca47d] Failed to kill stream.
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/homekit/type_cameras.py", line 321, in stop_stream
    await getattr(stream, shutdown_method)()
  File "/usr/local/lib/python3.7/site-packages/haffmpeg/core.py", line 170, in kill
    self._proc.kill()
AttributeError: 'NoneType' object has no attribute 'kill'

Any ideas what might go wrong?

@bdraco is right. I think I remember from using the homebridge-camera-ffmpeg plugin that you might have to use 1280x720 for the Apple Watch.
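
If the watch really does require 720p, capping the advertised resolution per camera might be worth a try (a sketch; I haven’t verified this fixes the watch, and the entity id is a placeholder):

```yaml
homekit:
  entity_config:
    camera.front_door:  # placeholder entity id
      max_width: 1280   # cap the negotiated resolution at 720p
      max_height: 720
      max_fps: 30
```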

Video doorbell support? That is now my only remaining requirement for Homebridge.

The doorbell adds a button and makes it the primary service. There isn’t a standard (or at least one that is generally followed) for how doorbell buttons get implemented so it would be helpful to know which video doorbell you have or what you have built to trigger the button press.

At the moment I have logic in Node-RED that POSTs the ding=dong&dong=ding message for various reasons. Outside of that, the camera is set up exactly the same. I’m using Blue Iris.

Finally got it working and thought I might share my solution. I’m using Hikvision cameras (I abandoned the idea of using Blue Iris for now, since I couldn’t figure out a working stream source), but I found a great page to figure out your device-specific RTSP stream URL here.

First, I had to turn off H.264+ in my cameras and stick with plain H.264 encoding, since H.264+ was messing up the decoding in the Home app and making the streams take quite a while to render in Lovelace. I found this setting in my camera under “Configuration (top menu) -> Video and sound (side menu)”. The camera rebooted, and already I could see the video stream display faster in Lovelace.

Second, I turned off authentication for RTSP streams for testing, under “Configuration (top menu) -> System (side menu) -> Security (sub side menu) -> Verification”.

This is my config:

# Cameras
stream:

camera:

  - platform: generic
    still_image_url: "http://username:password@192.168.0.220/Streaming/Channels/1/picture"
    stream_source: "rtsp://192.168.0.220/HighResolutionVideo"
    name: Uppfart

  - platform: generic
    still_image_url: "http://username:password@192.168.0.218/Streaming/Channels/1/picture"
    stream_source: "rtsp://192.168.0.218/HighResolutionVideo"
    name: Altan

Once that was done, I simply had to add the camera entities into the homekit config, restart HA, and I was done. I could now see the cameras in Lovelace and in the Home app, with the thumbnails updating correctly every 10 seconds:

# Homekit integration
homekit:
  auto_start: false
#  safe_mode: true

######################################
# Section dedicated to camera config #
######################################
  entity_config:
    camera.uppfart:
#      video_codec: copy
#      support_audio: False
      stream_source: "-rtsp_transport tcp -re -i rtsp://192.168.0.220:554/Streaming/Channels/102"
      # Set maximums for negotiating resolutions
      max_fps: 15
      max_width: 1280
      max_height: 1024
    camera.altan:
#      video_codec: copy
#      support_audio: False
      stream_source: "-rtsp_transport tcp -re -i rtsp://192.168.0.218:554/Streaming/Channels/102"
      # Set maximums for negotiating resolutions
      max_fps: 15
      max_width: 1280
      max_height: 1024
#############################
# End camera config section #
#############################

Edit: had to update the stream_source to use the low-resolution stream, since the high-resolution stream was having issues with live streaming.