Amcrest IP Camera Component Enhancements - PTZ control and audio streaming

There is a noticeable lag (delay) between real life and the live video stream that shows up in the Home Assistant Lovelace web interface, using the above configuration.

Just wondering if there is a way to reduce that. The app or web page doesn’t have this much lag, which makes me think that Home Assistant is processing the video stream, and I wonder if there is a better way to reduce the lag, i.e. RTSP vs. ONVIF vs. MJPEG vs. snapshot.

Hey Phil/@pnbruckner - I’ve made a few updates to this component to allow full PTZ control (right/left/up/down/zoom-in, etc.)

I’ve never worked on component development before and am brand new to Python, but it was a fun challenge, and seems to work pretty well. The only issue with it is the lag between arrow movement clicks and the image refresh that shows the movement. This is about 10-12 seconds for a still or live image.

I understand that you are working on incorporating subscription to Amcrest events (versus polling) so a new update may be imminent. Would you be interested in looking at my code changes to camera.py and services.yaml for possible inclusion in your PR? Or should I delve into the deep end and submit my own PR on the current prod code?
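
In case it helps the review, the moves themselves just drive the camera’s ptz.cgi endpoint (the same one python-amcrest’s ptz_control_command wraps). Here is a rough standalone sketch of that call using plain requests — the host, credentials, and the exact meaning of arg1/arg2/arg3 are placeholders to check against the Amcrest HTTP API guide for your model, not the code from my changes:

```python
import time

import requests
from requests.auth import HTTPDigestAuth

# Placeholder connection details for an Amcrest camera.
HOST = "192.168.1.108"
AUTH = HTTPDigestAuth("admin", "password")


def ptz_move(code: str, travel_time: float = 0.5, speed: int = 1) -> None:
    """Start a PTZ movement, wait, then stop it.

    `code` is one of the PTZ codes the firmware understands, e.g. "Up",
    "Down", "Left", "Right", "ZoomTele", "ZoomWide".  The meaning of
    arg1/arg2/arg3 varies by code (arg2 is commonly the speed for the
    directional moves); check the Amcrest HTTP API guide for your model.
    """
    base = f"http://{HOST}/cgi-bin/ptz.cgi"
    args = {"channel": 0, "code": code, "arg1": 0, "arg2": speed, "arg3": 0}
    requests.get(base, params={"action": "start", **args}, auth=AUTH, timeout=5)
    time.sleep(travel_time)
    requests.get(base, params={"action": "stop", **args}, auth=AUTH, timeout=5)


if __name__ == "__main__":
    ptz_move("Right", travel_time=0.5)
```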


I actually just did submit the PR before I saw this. They should probably be separate anyway. I’d be happy to look over your changes.

Ok, great! I had not yet forked homeassistant and just updated camera.py/services.yaml directly for personal testing. So how would you prefer to receive this source? If you’d rather, I’ll submit the PR myself.

P.S. I have also incorporated the 1 small change to camera.py in your most recent PR.

Probably best if you check them into a branch on a fork if you can do that. If not then just email them to me.

Done. PR has been submitted! Checks in progress.

FYI, I started looking over the changes. I hope to provide an official review of the PR soon.

Thanks Phil. Are any of your personal cameras PT/PTZ? One thing I tried to improve was the lag between a ptz_control move and the snapshot or live image refresh that captures the move. Ideally it should be in real time, or at least within a few seconds.

All my testing was done on a remote PTZ Wi-Fi camera, so some lag could just be my network there. But I think it has more to do with the built-in 10-second delay between snapshots or the caching of streamed images.

I tried various things to force an immediate image update but they didn’t pan out. So anyway, camera movement works fine, you just have to be patient to see it move.

Yes, I have one PTZ capable camera. I haven’t tried any of this (yet.) About to submit my review on the PR.

BTW, I believe the lag is due to the streaming system (assuming you’re using that.) Not sure there’s anything that can be done about that.

EDIT: By “streaming system”, I’m referring to the stream component.

Yeah, I didn’t think anything could be done about a live stream. In my testing it has been taking about 12 seconds for the live view to show the movement.

I mainly focused on using ptz_control with a “snapshot” image for faster updates (i.e. camera_view: auto instead of live). I tried adding statements like self._async_get_image() after the PTZ move, but that had no effect; the async nature of these threads is probably why, I guess.

It will be interesting to see what sort of lag you get with your local PTZ camera.

When you’re not using the stream component, the views update every 10 seconds, I believe. I’m not sure there is a good way to force the view(s) to update, or even if that’s an appropriate thing to do in a camera platform.

Ok, then that’s just the way it is. Moving to a PT preset behaves the same way, right? I’ve already noted in the updated Amcrest doc that users should expect a lag of “several seconds”… maybe that should be refined a bit more after you have tested it.

For what it’s worth, I just saw in the 0.107 release notes that the updated ONVIF integration has some advanced PTZ movement capability. See the ONVIF doc.

I’m not sure if this could co-exist with the Amcrest component when connected to the same cameras, but it would be interesting to see if it experiences the same kind of video refresh lag when moving the camera.

In general it’s not a good idea to share control of a camera between multiple integrations or systems, although some things are possible if you know what you’re doing.

Regarding the lag, it has nothing to do with the camera platform. The lag is due to the stream component. The camera platform only provides the appropriate RTSP link to the stream component.
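
To make that concrete (this is only a rough sketch, not the actual integration source, and the method names are approximate): the camera platform’s part of live view is essentially just returning the RTSP URL from stream_source(); the buffering, HLS segmenting, and the latency that comes with them all happen downstream in the stream component.

```python
from homeassistant.components.camera import Camera


class AmcrestStyleCamera(Camera):
    """Illustrative only: the camera entity's role in live streaming."""

    def __init__(self, api):
        super().__init__()
        self._api = api  # assumed python-amcrest client with an rtsp_url() helper

    async def stream_source(self):
        # Hand the stream component an RTSP URL; everything after this
        # (buffering, HLS segmenting, and the resulting lag) is the stream
        # component's doing, not the camera platform's.
        return await self.hass.async_add_executor_job(self._api.rtsp_url)
```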

Hi

Is there any chance we will be getting support for audio events in the Home Assistant Amcrest component?

You mean when the camera detects sound? That should be relatively easy (I think.) I can certainly look into it.

I experimented with one of my cameras that has a microphone, and sure enough, if “Intensity Change” is enabled under Audio Detection, the camera does send the AudioMutation event code when the threshold is crossed. So I could add something like audio_detected as another monitored binary sensor choice. It’s basically just adding a table entry and updating the docs. I’ll add this to my list of to-dos.
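
For anyone wondering what “adding a table entry” amounts to, it would be roughly along these lines — the names below are illustrative placeholders, not the integration’s actual constants:

```python
# Illustrative only: the shape of the mapping being described, tying a new
# binary sensor type to the Amcrest event code that drives it.
BINARY_SENSORS = {
    "motion_detected": "Motion Detected",
    "online": "Online",
    "audio_detected": "Audio Detected",  # the new user-facing sensor type
}

# Event codes reported on the camera's event channel for each sensor type.
EVENT_CODES = {
    "motion_detected": "VideoMotion",
    "audio_detected": "AudioMutation",  # sent when the "Intensity Change"
                                        # threshold under Audio Detection
                                        # is crossed
}
```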

Thanks.

My camera is the same: enabling Audio Detection alone doesn’t seem to work unless you also enable audio intensity change.

It sends both AudioAnomaly and AudioMutation events, but on my camera only the AudioMutation events seem to be reliable for audio detection, and AudioMutation requires intensity change to be enabled.

PR submitted:

Thank you!

(Currently I’m using a workaround: a Docker script that posts camera events to MQTT via the HTTP API.)
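
For anyone who wants that kind of bridge in the meantime, a minimal sketch might look like the following — it assumes paho-mqtt and the camera’s eventManager.cgi attach endpoint, and the host names, credentials, topic, and event codes are all placeholders:

```python
"""Minimal sketch of an event-to-MQTT bridge: attach to the camera's HTTP
event stream and republish each event line to an MQTT topic."""
import requests
from requests.auth import HTTPDigestAuth
import paho.mqtt.publish as publish

CAMERA_HOST = "192.168.1.108"
CAMERA_AUTH = HTTPDigestAuth("admin", "password")
MQTT_HOST = "192.168.1.2"
MQTT_TOPIC = "cameras/front_door/event"


def main() -> None:
    # Long-lived multipart stream of events from the camera; [All] would
    # subscribe to every event type instead of just these two codes.
    url = (
        f"http://{CAMERA_HOST}/cgi-bin/eventManager.cgi"
        "?action=attach&codes=[AudioMutation,VideoMotion]"
    )
    with requests.get(url, auth=CAMERA_AUTH, stream=True, timeout=(5, None)) as resp:
        for raw in resp.iter_lines():
            line = raw.decode(errors="ignore").strip()
            # Event lines look roughly like: Code=AudioMutation;action=Start;index=0
            if line.startswith("Code="):
                # publish.single reconnects per message, which is fine for a
                # sketch; a real bridge would keep one MQTT client connected.
                publish.single(MQTT_TOPIC, line, hostname=MQTT_HOST)


if __name__ == "__main__":
    main()
```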