When I watch the stream through HA it just shows a delay of about 2 seconds… This is my automation, which starts with a delay to compensate for the delay of the stream:
# Send an iOS notification with a snapshot of the person who is in front of the door and pushed the doorbell button
- alias: ios snapshot doorbell
  trigger:
    - entity_id: binary_sensor.doorbell_button
      platform: state
      from: 'off'
      to: 'on'
  action:
    - delay: 00:00:06
    - service: camera.snapshot
      data_template:
        entity_id: camera.voordeur_camera_deurbel
        filename: '/config/www/snapshots/deurbel_{{ now().day }}_{{ now().month }}_{{ now().hour }}_{{ now().minute }}.jpg'
    - delay: 00:00:01
    - service: notify.ios_iphone_van_martijn
      data_template:
        title: Doorbell
        message: Someone was at the door at {{ now().strftime('%H:%M') }}!
        data:
          attachment:
            content-type: jpg
            url: 'https://myduckdnsaddres.duckdns.org/local/snapshots/deurbel_{{ now().day }}_{{ now().month }}_{{ now().hour }}_{{ now().minute }}.jpg'
            hide-thumbnail: false
    - delay: 00:00:01
    - service: camera.local_file_update_file_path
      data_template:
        entity_id: camera.local_file
        file_path: '/config/www/snapshots/deurbel_{{ now().day }}_{{ now().month }}_{{ now().hour }}_{{ now().minute }}.jpg'
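For the last step to work, a local_file camera has to exist already. A minimal sketch of that part of the configuration, assuming a placeholder path that the automation above then replaces with the latest snapshot (the path is an assumption, not from the original post):

# Hypothetical local_file camera so camera.local_file_update_file_path has a target.
camera:
  - platform: local_file
    file_path: /config/www/snapshots/placeholder.jpg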
Here is an example using a SONOFF 433 RF PIR, but you could also use the onboard PIR (if you have one installed). There is no option to use software motion detection like with motionEye, for example.
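A minimal sketch of what such a PIR-triggered snapshot automation could look like; the binary_sensor.pir_hallway and camera.esp32_cam entity IDs are assumptions, not from the original post:

# Hypothetical PIR-triggered snapshot; adjust entity IDs to your own setup.
- alias: pir snapshot
  trigger:
    - platform: state
      entity_id: binary_sensor.pir_hallway   # SONOFF 433 RF PIR (or the onboard PIR)
      to: 'on'
  action:
    - service: camera.snapshot
      data_template:
        entity_id: camera.esp32_cam
        filename: '/config/www/snapshots/pir_{{ now().strftime("%Y%m%d_%H%M%S") }}.jpg'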
I’m guessing motionEye is not possible given the lack of an RTSP stream and external access. Using the stream should be possible, though the token is a concern. As nice as it would be to have all the cameras tied together under motionEye…
Given the slow framerate, how do people handle capturing? FWIW, I suppose Node-RED or something similar dumping frames during times of activity would suffice, though again the motionEye add-on could probably do the grunt work for you without a PIR.
I’ve discussed this with Ludeus and it’s probably not meant to be exposed in that manner, as you need to be logged in to view the stream. It also does not work standalone (when logged out), nor with HTTPS; HTTP does work (for adding it to motionEye, that is) for now.
Using the default Arduino camera webserver sketch also doesn’t work with MotionEye; it times out when trying to add it.
I really want to get this set up in MotionEye but don’t have another option left to try; I’d be glad if anyone has an idea!
Guys, the ESP32-CAM does work with MotionEye and as a standalone IP cam; I tested it. There are different codes to program it with. If you flash it through ESPHome, i.e. inside Home Assistant, there will be no IP access; it will just work in a picture card in HA. If you flash it with the default Arduino sketch, it will work as an IP cam but not with MotionEye. If you follow the instructions in the following link, it will work as an IP cam and can also work with MotionEye inside HA. A few points to note: the cam resolution should be chosen carefully. At maximum resolution it won’t run long before overheating, so I am thinking of using max resolution only for snapshots, i.e. take a picture or stream on movement and then go to deep sleep, though I have not tested this part yet.
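For the ESPHome route mentioned above, a minimal sketch of a config for an AI-Thinker ESP32-CAM; the pin mapping follows the ESPHome esp32_camera documentation, and the node name and WiFi credentials are placeholders:

# Minimal ESPHome sketch for an AI-Thinker ESP32-CAM (camera only shows up in HA, no IP access).
esphome:
  name: esp32cam
  platform: ESP32
  board: esp-wrover-kit

wifi:
  ssid: "your_ssid"        # placeholder
  password: "your_password" # placeholder

api:
logger:
ota:

esp32_camera:
  name: ESP32 Cam
  external_clock:
    pin: GPIO0
    frequency: 20MHz
  i2c_pins:
    sda: GPIO26
    scl: GPIO27
  data_pins: [GPIO5, GPIO18, GPIO19, GPIO21, GPIO36, GPIO39, GPIO34, GPIO35]
  vsync_pin: GPIO25
  href_pin: GPIO23
  pixel_clock_pin: GPIO22
  power_down_pin: GPIO32
  resolution: 800x600       # keep moderate; max resolution runs hot
  jpeg_quality: 10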
Does anyone know why my image is rotated 90 degrees?
At least I assume it is, and that the board is supposed to be oriented with the long side vertical.
This was helpful, but I’m curious why using the below doesn’t allow control of the brightness. Does anyone have a working example of controlling the brightness of the LED? It’s crazy bright as just an on/off switch.
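A minimal sketch of one way to get brightness control in ESPHome, assuming the flash LED is on GPIO4 (the usual pin on AI-Thinker boards): use a ledc output with a monochromatic light instead of a plain GPIO switch.

# Hypothetical brightness-controllable flash LED (AI-Thinker ESP32-CAM, LED on GPIO4).
output:
  - platform: ledc
    pin: GPIO4
    id: flash_led_output

light:
  - platform: monochromatic
    output: flash_led_output
    name: "ESP32 Cam Flash LED"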
So I’ve noticed two other people have asked a similar question in this thread, but let me get to it.
I’d like to use my ESP32-CAM with the NVR software “iSpy”, and have HA simply display a live view of the ESP cam (which I already have set up).
I’ve been looking into how to enable an HTTP web server in the ESP firmware that simply serves the stream, similar to the example camera code in the Arduino IDE. If I can get the stream showing on a web server, I am confident I can get the camera working in iSpy. I am aware of the stream link generated by Home Assistant, but I would rather not convolute the stream by passing it through HA first; that seems like a waste of CPU.
Like @Oleksii_Zelivianskyi stated, I use the code in the link he provided and it works great with motionEye (running as a Hass.io add-on), which in turn feeds the stream into Home Assistant. It gives me motion detection and a time lag of about 5 seconds.
As @nickrout stated, this is not (yet) possible with ESPHome.
Cam 1 is ESPHome-ified and has a PIR sensor added; besides that, the onboard LED is integrated as a switch. Cam 2 runs the sketch from Random Nerd Tutorials and is fed into motionEye. There I have motion detection running over the stream, and from there I feed the stream back into HA.
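For the cam 1 setup described above, a minimal sketch of the PIR sensor and onboard-LED switch in ESPHome; GPIO13 for the PIR is an assumption (use whichever free pin the sensor is wired to), while GPIO4 is the usual flash LED pin on AI-Thinker boards:

# Hypothetical PIR input and flash-LED switch for an ESPHome ESP32-CAM node.
binary_sensor:
  - platform: gpio
    pin: GPIO13            # assumed PIR data pin
    name: "Cam 1 PIR"
    device_class: motion

switch:
  - platform: gpio
    pin: GPIO4             # onboard flash LED on AI-Thinker boards
    name: "Cam 1 LED"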
I have switched over to that, but unfortunately my NVR and HA cannot use the stream at the same time. I think I will let iSpy access the stream directly, and then import the stream from iSpy’s re-stream feature, outlined and described here:
My computer running iSpy is way overpowered, so using CPU on it is a non-issue.
Pretty sure this will work; I’ll find out later tonight.
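If the feed re-served by iSpy is exposed as an MJPEG URL, a minimal sketch of pulling it back into HA with the mjpeg camera platform; the host, port, and path are assumptions:

# Hypothetical MJPEG camera pointing at a stream re-served by iSpy on the NVR machine.
camera:
  - platform: mjpeg
    name: "ESP32 Cam via iSpy"
    mjpeg_url: http://192.168.1.50:8080/mjpeg/1   # placeholder host/port/path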
And set the board to this:
esphome:
  name: ${devicename}
  platform: ESP32
  board: esp-wrover-kit
I did find out that powering this from the FTDI adapter or through a breadboard doesn’t work and gives a brownout error.
Powering it with 5 V works like a charm.
And it works better: with ESPHome, my ESP32-CAM stopped reporting after a day or two and was laggier. With this firmware and motionEye it has been stable for two weeks now, and the Raspberry Pi 3 processor load is about 15-20%. MotionEye records video to a separate NAS.
The only problem is that I don’t know yet how to add a button trigger to this firmware.
And I am wondering: maybe it would be better to have a separate installation of MotionEye on another Raspberry Pi? Or on the same Raspberry Pi but separate from HA (in Docker)?
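For the Docker route, a minimal docker-compose sketch using the ccrisan/motioneye image; the armhf tag and the host paths are assumptions for a Raspberry Pi setup:

# Hypothetical standalone motionEye in Docker, separate from HA.
version: "3"
services:
  motioneye:
    image: ccrisan/motioneye:master-armhf      # armhf tag for Raspberry Pi; use master-amd64 on x86
    ports:
      - "8765:8765"                            # motionEye web UI
    volumes:
      - /etc/localtime:/etc/localtime:ro
      - /etc/motioneye:/etc/motioneye          # config
      - /var/lib/motioneye:/var/lib/motioneye  # recordings (could point at the NAS mount instead)
    restart: unless-stopped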