Absolutely losing my mind trying to get cameras to work

Hey all

Gotta vent here, I’m about to give up on cameras in HA. I’ve spent the last 3 days trying to get something stable and reliable to work without success. I’ve read every forum post and tried every bit of code or hack that I can find.

Here is my setup:
6 Swann (Hikvision) cameras running on Surveillance Station. They all stream perfectly in the SS native app as well as through VLC using RTSP. Zero lag or buffering…they even stream great in DS cam over mobile data. My instance of HA is running via VMM on the same Synology NAS. All of the cameras are hardwired over PoE to a Ubiquiti 8-port switch, and my Synology, PC, and router are all on the same physical network (no wifi).

I have several RTSP urls that have been tested working thru VLC:

  1. rtsp://user:[email protected]/live/h264

  2. rtsp://user:[email protected]:554/Streaming/Channels/1

  3. rtsp://user:[email protected]:554/mpeg4

  4. rtsp://user:[email protected]:554/live/h264
    (All of these are listed as FFMPEG in the specs)

  5. Synology can also generate a share stream path like this:
    rtsp://syno:[email protected]:554/Sms=3.unicast
    (which works in VLC)

I’ve tried all cameras in HA using both of the methods below, which yield the same problems:

  1. Using the Synology integration exposes all cameras and I can easily add them to a Picture Glance card. The snapshot works, and clicking on the image brings up the live stream, which loads quickly and looks fine, but without fail it will start buffering after 13-15 seconds every time. It will catch up a bit, and then freeze entirely. Exiting the full-screen view and going back to the Glance cards kind of resets everything, but it will happen again.

  2. I’ve also added all cameras using the ONVIF integration, which also allows me to easily add the cameras to the Glance cards. Again, same behavior…buffering/lagging after almost exactly 13 seconds of streaming.

I’ve also tried all snippets of code I’ve found in the forums, such as the following in the configuration.yaml file:

camera:
   - platform: generic
     stream_source: rtsp://192.168.1.190:554/Streaming/Channels/1
     still_image_url:  # [I don't believe these cameras support a still image]
     name: COURTYARD
     verify_ssl: false
     authentication: basic
     username: U
     password: P

I can’t get this method to work at all. If I put ‘camera.courtyard’ as the entity on a Glance card, no video loads on the card at all.

I’ve also added stream: and ffmpeg: to the config, but these don’t seem to have an effect.
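For completeness, here’s roughly what the whole thing looks like in my configuration.yaml right now (the still-image path is a guess based on Hikvision’s usual snapshot endpoint, I’m not sure these Swann units support it):

```yaml
stream:    # enables HLS streaming in the frontend
ffmpeg:    # ffmpeg support used by camera platforms

camera:
  - platform: generic
    name: COURTYARD
    stream_source: rtsp://192.168.1.190:554/Streaming/Channels/1
    # Hikvision-style snapshot path -- not verified on these Swann cameras
    still_image_url: http://192.168.1.190/Streaming/channels/1/picture
    verify_ssl: false
    authentication: basic
    username: U
    password: P
```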

Can anyone point me in the right direction?? I’m at my wits end

Thanks

Change to authentication: digest

It’s complicated.
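Short version: with basic auth the camera receives your username and password (base64-encoded, effectively plaintext) on every request, while digest auth sends back an MD5 hash that mixes in a one-time nonce from the camera, so the password itself never crosses the wire. Hikvision firmware generally insists on digest. A sketch of the RFC 2617 digest computation (MD5, no qop; all the values here are made up for illustration):

```python
import hashlib

def md5_hex(s: str) -> str:
    return hashlib.md5(s.encode()).hexdigest()

def digest_response(user: str, realm: str, password: str,
                    method: str, uri: str, nonce: str) -> str:
    """RFC 2617 digest response (MD5 scheme, no qop)."""
    ha1 = md5_hex(f"{user}:{realm}:{password}")  # the secret half
    ha2 = md5_hex(f"{method}:{uri}")             # the request half
    return md5_hex(f"{ha1}:{nonce}:{ha2}")       # what the camera verifies

# The camera sends a fresh nonce in its 401 challenge; the client replies
# with this hash instead of the raw password.
resp = digest_response("user", "cam-realm", "pass",
                       "DESCRIBE", "rtsp://192.168.1.190:554/live/h264",
                       "abc123nonce")
print(resp)  # 32 hex chars; different every time the nonce changes
```

That nonce round-trip is why a client configured for basic auth just gets rejected by a digest-only camera.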

Can you explain the difference between basic and digest?

Is there any way for the generic camera entity to not require the still_image_url?

Do I need stream: and/or ffmpeg: to be in the config file as well?

After changing it to digest the Glance tile says “Error starting stream, see logs for details (Stream never started)”, but I’ve actually never seen that error before in the window, so I think it’s progress.

When I modify the configuration.yaml, do I need to restart HA or do anything else, or are the changes instant?

I’ve tried all four urls below, with and without the user/pass in the url but I get the same error in the card.

camera:
   - platform: generic
     #stream_source: rtsp://User:[email protected]/live/h264
     #stream_source: rtsp://User:[email protected]:554/Streaming/Channels/1
     #stream_source: rtsp://User:[email protected]:554/mpeg4
     #stream_source: rtsp://User:[email protected]:554/live/h264
     still_image_url: https://www.home-assistant.io/images/merchandise/shirt-frontpage.png
     name: COURTYARD
     verify_ssl: false
     authentication: digest
     #username: user
     #password: pass

Thanks!
Do you think this is the best option? I know my cameras are a bit older but they support H.264 and all I really want is a basic security camera display dashboard.

Do you have any idea why I can’t seem to get any of the RTSP methods to work?

I have hikvision cameras and have digest enabled, works fine!

Go with WebRTC, your experience will be upgraded

Personally, I use a pretty simple setup using RTSP and ffmpeg. Be careful with the ffmpeg parameters; there are websites detailing setups for various cameras. I can’t give an overview beyond that, because there are just too many parameters and options.

One of my examples:

  - platform: ffmpeg
    name: security_camera
    # force TCP transport; RTSP over UDP tends to drop packets and smear
    input: -rtsp_transport tcp -i rtsp://securitypi.local:8554/unicast
    extra_arguments: "-vf hue=s=0"  # zero saturation = grayscale output

Note that the 10s lag is only a frontend thing. If you record a clip, it typically starts almost immediately. I just live with the annoyance of the lag (I don’t look at the real-time feed that often, TBH). WebRTC is fantastic, but you will face one major challenge: making it work externally (outside your home network), depending on your setup.

Got WebRTC working last night and you’re right, the experience is definitely better.


Do you know if it’s possible to make these custom:webrtc-camera cards behave like the picture-glance cards - e.g., still image that updates every 10 seconds, and then the live stream loads on click? All 6 of them actually stream pretty smoothly, but it kind of makes the UI laggy and it needs to be refreshed more often. I don’t really need them streaming all the time, just when I tap on the card.

Thanks, I’ll play around a bit more. I would love to get it working in the normal Glance tile because it fits my needs more and would be less laggy to the UI, but as of yet none of my RTSP urls have worked.
With regard to the below URLs, why do you think Swann/Hikvision has 4 options for basically the same stream, and do you think one or the other would be better/faster to use?
#stream_source: rtsp://User:[email protected]/live/h264
#stream_source: rtsp://User:[email protected]:554/Streaming/Channels/1
#stream_source: rtsp://User:[email protected]:554/mpeg4
#stream_source: rtsp://User:[email protected]:554/live/h264

yes, check number 3 from here: GitHub - fuatakgun/eufy_security: Home Assistant integration to manage Eufy Security devices as cameras, home base stations, doorbells, motion and contact sensors.

in this example, I am using a binary sensor (streaming) to decide what to show between picture vs webrtc, you can use input helper and switch back and forth.

Did you use 6 webRTC cards to make that display or have they made an option to merge many streams into a single card?

Nice, I’ll check this out.

- type: conditional
  conditions:
    - entity: binary_sensor.entrance_streaming_sensor
      state: 'False'
  card:
    type: picture-entity
    entity: camera.entrance_camera
    tap_action:
      action: call-service
      service: camera.turn_on
      service_data: {}
      target:
        entity_id: camera.entrance_camera
- type: conditional
  conditions:
    - entity: binary_sensor.entrance_streaming_sensor
      state: 'True'
  card:
    type: custom:webrtc-camera
    entity: camera.entrance_camera

EDIT - I figured out this goes on two separate conditional cards. However, I receive the errors “No Image Source configured” and “Conditions need to be an array”.

This is how I have the camera defined in config file:

camera:
 - platform: generic
   name: Courtyard
   still_image_url: http://192.168.1.190/Streaming/channels/1/picture?snapShotImageType=JPEG
   stream_source: rtsp://User:[email protected]/Streaming/Channels/1

That’ll be because your entity should be camera.courtyard and not just Courtyard (please also note it’s all lowercase). You could also skip the entity and use camera_image: camera.courtyard
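So something along these lines should work for the still-image tile (camera_view: auto shows the refreshing snapshot rather than a constant stream; entity name assumed from your config):

```yaml
type: picture-glance
title: Courtyard
camera_image: camera.courtyard
camera_view: auto
entities: []
```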

I used 6 cards on a grid with a nested grid to achieve that effect

Ah thanks, now it’s sorta working


It looks like this in Edit mode, but when I close edit mode both cards disappear and never show up. Also the weirdest thing…that camera view isn’t the courtyard…it’s not even the same IP address. So I have no idea why that stream is showing up here.

OK, your next issue is the “conditions need to be an array” one - did you fix that yet?
And have you actually created the binary_sensor or the input_boolean or whatever you are planning to use for the entity filter?

The array error seemed to go away when I just re-indented the code.

I honestly don’t know how to create a binary_sensor or an input_boolean

Right, well, the way the entity filter cards work is that they only show up when the conditions match, so if the thing you are filtering on doesn’t exist, then the cards will never show up. You need to create the entity by going to Configuration in Home Assistant, clicking “Automations and Scenes”, and then the “Helpers” tab at the top of the screen. Click on Add and then choose Toggle. Call it what you want, and that will be your input_boolean created. Click on it and copy its full entity_id from the settings.

Now use that in your entity filter in your lovelace, that fixes the first problem.
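(If you prefer YAML, the same helper can also be defined in configuration.yaml; the name here is just an example:)

```yaml
input_boolean:
  camerastate:
    name: Camera streaming
    icon: mdi:cctv
```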

Next - your array issue, is because it’s horribly wrong:

- type: conditional
  conditions:
    - entity: binary_sensor.entrance_streaming_sensor
      state: 'False'

is what it should be, but what is in your screenshot is:

- type: conditional
  conditions:
      entity: binary_sensor.entrance_streaming_sensor
      state: 'False'

Because you are missing the - the condition is not an array.

Thanks for your continued help! I made input_boolean.camerastate as a toggle.

Here is the code for the two cards

type: conditional
conditions:
  - entity: input_boolean.camerastate
    state: 'False'
card:
  type: picture-entity
  entity: camera.courtyard
  tap_action:
    action: call-service
    service: camera.turn_on
    service_data: {}
    target:
      entity_id: camera.courtyard

And

type: conditional
conditions:
  - entity: input_boolean.camerastate
    state: 'True'
card:
  type: custom:webrtc-camera
  entity: camera.courtyard

Both cards display the correct picture/video in the Edit mode, but still disappear when out of Edit mode. Also, if I click on the first card (picture), I see an error in the bottom of the screen that says “Failed to call service camera/turn on”. Makes me wonder if these older Swann/Hik cameras even support sending the camera state?

Which I believe is this error in the log:

File "/usr/src/homeassistant/homeassistant/components/camera/__init__.py", line 551, in turn_on
    raise NotImplementedError()
NotImplementedError

I wonder if there is another command I could send to the camera besides turn_on. Even just enabling or disabling the stream itself would be fine. Example: set the tap_action to start the full stream, but the default state of this View would be the picture tiles.

I guess you probably want the call-service to be input_boolean.turn_on instead of camera.turn_on, if the point is to turn the input_boolean on so that it shows the webrtc card instead of the 10-second-refresh card. As for the state filters themselves, instead of True and False I would use on and off.
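So your first card would become something like this (entity names taken from your snippet above):

```yaml
type: conditional
conditions:
  - entity: input_boolean.camerastate
    state: 'off'
card:
  type: picture-entity
  entity: camera.courtyard
  tap_action:
    action: call-service
    service: input_boolean.turn_on
    target:
      entity_id: input_boolean.camerastate
```

The second card is the mirror image: condition on state: 'on' and show the custom:webrtc-camera card.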

Looking at one of my own dashboards with a conditional filter:

type: vertical-stack
cards:
  - type: conditional
    conditions:
      - entity: binary_sensor.obs_streaming
        state: 'on'
    card:
      type: picture-entity
      entity: camera.obs_output_virtual_camera
      camera_image: camera.obs_output_virtual_camera
      show_name: false
      show_state: false

That is how I show the output from OBS when it is actually broadcasting and hide it when the stream is ended.