Person or object detection for Synology Surveillance Station and Home Assistant

Hi all, I thought I’d introduce my new project SynoAI.

SynoAI acts as a broker between Synology Surveillance Station and Deepstack (and potentially others in the future) and sends notifications when the AI's confidence for a defined object type exceeds a threshold, configured on a camera-by-camera basis.

Surveillance Station was causing too many false alerts, so I really wanted to limit it to just person and car detection. Other integrations I tried never seemed to work very well on my Pi; doing it this way meant I could utilise the processing power of my vastly underutilised NAS.

As one of the notification types I wrote is a webhook, it was therefore really easy to integrate with Home Assistant's Push camera integration. This means I can now turn on my front lights when a person or car is detected on the driveway, and trigger alarms when people (and not cats) are detected in the back garden.
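For reference, the Home Assistant side of this can be as simple as a Push camera entity that receives SynoAI's snapshots via webhook. This is only a sketch: the camera name and `webhook_id` below are illustrative choices, not values from the project, and you would point SynoAI's webhook notification at `http://<ha-host>:8123/api/webhook/<webhook_id>`.

```yaml
# configuration.yaml (sketch) - a Push camera that accepts posted images.
# "synoai_driveway" is a hypothetical webhook_id; pick your own and use the
# matching /api/webhook/synoai_driveway URL in SynoAI's webhook notifier.
camera:
  - platform: push
    name: SynoAI Driveway
    webhook_id: synoai_driveway
```

An automation can then trigger on state changes of `camera.synoai_driveway` to switch on lights or fire an alarm.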

I put this up on the Synology Reddit page and had some good feedback, so I thought I’d post it here too in case it helps someone!



Just wanted to say that I have integrated SynoAI into my Home Assistant setup and it works great!

I am actually running two SynoAI Docker containers on my Synology: one monitoring all motion and one monitoring line crossing on the driveway. The second one triggers different alarms and only lights up the driveway for people, not cats etc :slight_smile:

The only thing I still need to figure out is how to capture the identified object in Home Assistant so I can refine my automations even further.

Thanks a lot for your efforts!

I just wanna thank you so much for this. :wink:

While there is the very popular robmarkcole/HASS-Deepstack-object integration, SynoAI is so much faster.

Instead of using a Synology Surveillance Station action rule as the trigger for motion, I call the SynoAI webhook from an HTTP request node in Node-RED/Home Assistant.
This way I can decide myself, from HA, when I want to detect objects.
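Triggering SynoAI from Home Assistant instead of from an action rule can be done with a simple `rest_command`. This is a sketch under assumptions: the host, port (8080), and the per-camera endpoint path `/Camera/<name>` are illustrative, so check your own SynoAI deployment for the actual URL.

```yaml
# configuration.yaml (sketch) - call SynoAI's per-camera detection endpoint.
# Host, port, and camera name below are hypothetical examples.
rest_command:
  synoai_check_driveway:
    url: "http://192.168.1.10:8080/Camera/Driveway"
```

Calling `rest_command.synoai_check_driveway` from an automation (or an HTTP request node in Node-RED with the same URL) then kicks off a detection on demand.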

To receive the results I added a Node-Red-Companion webhook node in Node-Red and a webhook notification section to the SynoAI config.

This has a few advantages over the HASS-Deepstack-object from robmarkcole:

  • It’s faster (running the Docker containers in a VM on my Proxmox NUC): it takes about 5s to get a result from HASS-Deepstack-object and about 1.5s from SynoAI.
    This is a huge difference when checking moving objects like people walking around.
    Both are using the same hardware and the same Deepstack Docker instance, so there seems to be quite a bit of overhead in the HA solution.

  • With SynoAI you get the detailed coordinates for all detected objects in one message, while with HASS-Deepstack-object you have to listen for multiple events.
    Not everybody needs the object coordinates, but with them you can automate even more: watch for cars in a specific parking space,
    or check whether a person is close to your house or in a specific part of your front yard.
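The parking-space idea above boils down to a point-in-rectangle test on the detection's bounding box. Here is a minimal sketch; the payload shape (a label plus a pixel bounding box) and the zone coordinates are assumptions for illustration, so adapt the keys to whatever your webhook actually delivers.

```python
# Sketch: decide whether a detection's bounding box falls inside a named zone.
# Boxes and zones are (min_x, min_y, max_x, max_y) rectangles in pixels.

def box_center(box):
    """Return the (x, y) center point of a bounding box."""
    min_x, min_y, max_x, max_y = box
    return ((min_x + max_x) / 2, (min_y + max_y) / 2)

def in_zone(box, zone):
    """True if the box's center lies inside the zone rectangle."""
    x, y = box_center(box)
    zx1, zy1, zx2, zy2 = zone
    return zx1 <= x <= zx2 and zy1 <= y <= zy2

# Hypothetical zone and detections, shaped like a SynoAI-style result.
PARKING_SPACE = (400, 300, 700, 550)

detections = [
    {"label": "car", "box": (450, 320, 650, 500)},
    {"label": "person", "box": (50, 60, 120, 280)},
]

cars_parked = [d for d in detections
               if d["label"] == "car" and in_zone(d["box"], PARKING_SPACE)]
print(len(cars_parked))  # prints 1
```

The same `in_zone` check works for the "person near the house" case by defining a second rectangle for that area.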

Just bought you a few coffees on your GitHub link as a thank you. :slight_smile:


Just got this set up and it's running fantastically; the webhook to Node-RED is incredibly fast.

One question though: I set this up with Deepstack, but I could have used CodeProject.AI since you have included that functionality, and Blue Iris has migrated to CodeProject. Which one, in your experience, is the better option?

Edit: I installed it, and its performance is about the same as Deepstack on my DS918+, but it uses way more CPU all the time and double to triple the RAM. I have upgraded to 16GB of RAM, so that part doesn't bother me so much, but the constant CPU load is a little annoying as it doesn't seem to perform any faster. Is there a reason things seem to be moving towards CodeProject.AI vs Deepstack? Maybe if we can get a TPU working on a Syno NAS with CodeProject? One can only dream…

I have successfully integrated SynoAI into HASS via a Push camera, and the camera snapshot is shown in my media.
Where is the snapshot stored?
I programmed an alarm system in HASS and would like a rule that forwards the snapshot.
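One way to get a snapshot at a known path for forwarding is to save the camera's latest frame with Home Assistant's `camera.snapshot` service whenever the Push camera updates. This is a sketch under assumptions: the entity id `camera.synoai_push` and the target filename are hypothetical, and the target directory may need to be listed under `allowlist_external_dirs`.

```yaml
# Sketch: persist the latest SynoAI frame to a fixed path on each update.
# "camera.synoai_push" and the filename are illustrative placeholders.
automation:
  - alias: "Save SynoAI snapshot to disk"
    trigger:
      - platform: state
        entity_id: camera.synoai_push
    action:
      - service: camera.snapshot
        target:
          entity_id: camera.synoai_push
        data:
          filename: /config/www/synoai_last.jpg
```

A notification action in your alarm rule can then attach or link that file.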