The instructions for this HACS implementation are very good. It looks like a lot of work, but if you follow them step by step you'll be fine. But as you say, you are new to HA itself. Do you have any experience with HACS, and have you set it up already? If not, start there.
I myself made automations so that when the bell rings, speakers announce it (depending on the time, where in the house, etc.) and a message shows on my TV. I also have flashing Hue lights set up, but that's based on Eufy's contact sensors (to let me know when my kids come home from school while I'm upstairs). It's pretty straightforward with this integration. In my case, though, I use Node-RED instead of the native automations in HA, so unless you use Node-RED my examples would be useless.
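For anyone on native HA automations rather than Node-RED, a rough equivalent could look like this (a minimal sketch only; the doorbell sensor, TTS service, and media player entity names are assumptions you'd swap for your own):

```yaml
# Sketch of a native HA automation (entity and service names are assumptions)
alias: Announce doorbell on speakers
trigger:
  - platform: state
    entity_id: binary_sensor.doorbell     # assumed doorbell sensor
    to: "on"
condition:
  - condition: time                       # only announce during the day
    after: "08:00:00"
    before: "22:00:00"
action:
  - service: tts.google_translate_say     # assumed TTS service
    data:
      entity_id: media_player.living_room_speaker
      message: "Someone is at the door"
```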
where are you going to use this exactly? Inside home assistant, just using camera.name would work.
Moreover, to have thumbnails, you need to have started streaming at least once since the latest restart. Currently, the only way to generate thumbnails is to start streaming.
I’m using it as a thumbnail in the iOS notification. The snippet is from a Node-RED Function node. It now sends a thumbnail image attached to the notification, which seems to be the last image it stored.
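In case it helps others, a Function node along these lines can build such a payload (a sketch only; the notify target and camera entity are placeholders, and the iOS companion app attaches the camera's latest thumbnail from the entity you pass):

```javascript
// Node-RED Function node sketch for an iOS notification with a camera thumbnail.
// Assumptions: a "call service" node follows this one, and the notify target
// (mobile_app_my_iphone) and camera entity (camera.front_door) are placeholders.
function buildNotification(msg) {
    msg.payload = {
        domain: "notify",
        service: "mobile_app_my_iphone",
        data: {
            message: "Someone is at the front door",
            data: {
                // The iOS companion app attaches the latest thumbnail
                // of this camera entity to the notification.
                entity_id: "camera.front_door"
            }
        }
    };
    return msg;
}

// In an actual Function node the body would just be the assignment above
// followed by "return msg;".
```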
Regarding images: it works for cameras that stream all the time, but for cameras that stream only on motion detection there is no image. Is there a way to always fetch the last image, like the eufy mobile app does?
Works perfectly, thanks - and the thumbnails are back once the camera is activated. I use the WebRTC conditional cards; I noticed that when starting an RTSP stream the status moves from idle to preparing to streaming, whereas P2P moves from idle to streaming - which one do you use yourself @anon63427907? And what’s the purpose of preparing, seeing that it already streams? To solve it I use the condition “not equal to StreamingStatus.IDLE”.
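For anyone wanting the same workaround, the conditional card can look roughly like this (the entity names are assumptions, and `state_not` hides the card whenever the status sensor reports idle):

```yaml
# Sketch of a conditional card (entity names are assumptions)
type: conditional
conditions:
  - entity: sensor.front_door_streaming_status   # assumed status sensor
    state_not: idle
card:
  type: custom:webrtc-camera
  entity: camera.front_door                      # assumed camera entity
```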
OK, thanks, but slow WebRTC and no audio are a showstopper for me. The generic camera works without any issue, so I will wait for WebRTC v3; maybe it will be without issues.
There is no point in using this integration for streaming purposes if your camera can stream continuously. Stick with the generic camera and install the WebRTC 3.0 integration (using go2rtc under the hood) or the go2rtc add-on. You get the benefit of sensors and other features via this integration.
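For the go2rtc route, the stream configuration is just a mapping of names to source URLs (the camera address and credentials below are placeholders):

```yaml
# go2rtc.yaml sketch (the RTSP URL and credentials are placeholders)
streams:
  front_door:
    - rtsp://user:pass@192.168.1.50/live0
```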
If the camera is always on and able to stream with RTSP, it should instantly jump idle-preparing-streaming.
If the camera is battery operated, it takes a couple of seconds for the stream to start after sending the start RTSP command, so we spend that time in preparing.
If the camera only supports P2P, the integration starts the stream internally, so the jump between idle-preparing-streaming is faster.
For the conditional card, I am using the streaming and idle conditions; all seems to be working fine.
This integration just generates an RTSP URL for you (using either P2P streaming or existing RTSP capabilities). You can take the PTZ functions from here and use the generic camera for continuously streaming devices; for on-demand streaming cameras, get everything from here. These are not mutually exclusive; just mix and match.
Example:

type: custom:webrtc-camera
entity: camera.generic_camera_name
ptz:
  service: eufy_security.ptz
  data_left:
    entity_id: camera.eufy_security_camera_name
    direction: LEFT
  data_right:
    entity_id: camera.eufy_security_camera_name
    direction: RIGHT
  data_up:
    entity_id: camera.eufy_security_camera_name
    direction: UP
  data_down:
    entity_id: camera.eufy_security_camera_name
    direction: DOWN
Absolutely - my RTSP cams feed into Frigate, and Frigate also generates thumbnails. Do you still use the custom WebRTC integration, or did you move to a native HA one?