Custom Component: Unifi Protect

Hi @VinistoisR
Unfortunately I cannot help here. I think we need someone with mobile skills to help out.

I’m familiar with mobile; whatever @VinistoisR is trying to do is what I’d like to do as well.

How are you jumping directly to the specific camera view in HA? I can only figure out how to go to a “dashboard” in HA, not the actual live view. I think this is a limitation.

However, for jumping directly into Unifi Protect, I think most of their activities are protected (not exported). Launching a camera activity directly via adb results in a permission denial / security exception:

./adb.exe shell am start -n com.ubnt.unifi.protect/com.ubnt.activities.timelapse.CameraActivity


Starting: Intent { cmp=com.ubnt.unifi.protect/com.ubnt.activities.timelapse.CameraActivity }

Exception occurred while executing 'start':
java.lang.SecurityException: Permission Denial: starting Intent { flg=0x10000000 cmp=com.ubnt.unifi.protect/com.ubnt.activities.timelapse.CameraActivity } from null (pid=15243, uid=2000) not exported from uid 10340
        at com.android.server.wm.ActivityStackSupervisor.checkStartAnyActivityPermission(ActivityStackSupervisor.java:1043)
        at com.android.server.wm.ActivityStarter.executeRequest(ActivityStarter.java:999)
        at com.android.server.wm.ActivityStarter.execute(ActivityStarter.java:669)
        at com.android.server.wm.ActivityTaskManagerService.startActivityAsUser(ActivityTaskManagerService.java:1096)
        at com.android.server.wm.ActivityTaskManagerService.startActivityAsUser(ActivityTaskManagerService.java:1068)
        at com.android.server.am.ActivityManagerService.startActivityAsUserWithFeature(ActivityManagerService.java:3662)
        at com.android.server.am.ActivityManagerShellCommand.runStartActivity(ActivityManagerShellCommand.java:544)
        at com.android.server.am.ActivityManagerShellCommand.onCommand(ActivityManagerShellCommand.java:186)
        at android.os.BasicShellCommandHandler.exec(BasicShellCommandHandler.java:98)
        at android.os.ShellCommand.exec(ShellCommand.java:44)
        at com.android.server.am.ActivityManagerService.onShellCommand(ActivityManagerService.java:10505)
        at android.os.Binder.shellCommand(Binder.java:929)
        at android.os.Binder.onTransact(Binder.java:813)
        at android.app.IActivityManager$Stub.onTransact(IActivityManager.java:5053)
        at com.android.server.am.ActivityManagerService.onTransact(ActivityManagerService.java:2867)
        at android.os.Binder.execTransactInternal(Binder.java:1159)
        at android.os.Binder.execTransact(Binder.java:1123)

However, when issuing

./adb.exe shell am start -n com.ubnt.unifi.protect/com.ubnt.sections.splash.SplashActivity

Starting: Intent { cmp=com.ubnt.unifi.protect/com.ubnt.sections.splash.SplashActivity }

This opens the Unifi Protect app at whatever its last activity was; if the last thing you were on was the main activity or a live camera view, that is where you land.

Sending this as an HA notification would look like the following, although it appears you can drop the specific activity and just use the app package as the URI (https://companion.home-assistant.io/docs/notifications/actionable-notifications#android-example):

service: notify.mobile_app_pixel_3
data:
  message: 'test'
  data:
    actions:
      - action: "URI"
        title: "Open Unifi"
        uri: 'app://com.ubnt.unifi.protect'

The HA docs for actionable notifications only show two example action types: URI and URL.

However, it appears that by using the broadcast-intent notification command instead, you can send a specific intent to launch into an app: https://companion.home-assistant.io/docs/notifications/notification-commands/#broadcast-intent

automation:
  - alias: Send broadcast intent
    trigger:
      ...
    action:
      service: notify.mobile_app_<your_device_id_here>
      data:
        message: "command_broadcast_intent"
        title: "action"
        data:
          channel: "com.ubnt.unifi.protect"

I’ve tried executing the above action FROM the notification shade via the Android app, but I’m afraid the app doesn’t support this kind of response. If you wanted a specific intent, you’d probably have to send an action back to your HA instance, which would then reply via an automation (a rough sketch of that round trip is below the example). However, I couldn’t get this to work; I think the action needs to belong to the correct “category”, which the HA Android app doesn’t currently allow setting via actions.

service: notify.mobile_app_pixel_3
data:
  message: command_broadcast_intent
  title: android.intent.action.MAIN
  data:
    channel: "com.ubnt.unifi.protect/com.ubnt.sections.splash.SplashActivity"

Spent a few hours looking at this for fun this morning. Hope this is helpful.

Hi @VinistoisR - can you post your automation for doing this? Sounds useful! Thanks!

FWIW, here’s what you’d be looking for:

- alias: ALERT (AWAY) - Motion at Front Door, notify Mobile App
  mode: single # Only run once at a time  
  max_exceeded: silent # Hide warnings when triggered while in delay.
  trigger:
    platform: state
    entity_id:
     - binary_sensor.front_outdoor_motion_sensor
    to: 'on'
  condition:
    - condition: state
      entity_id: variable.family_status
      state: 'Away'
  action:
    - service: notify.mobile_app_pixel_3
      data:
        title: "Front Door Motion"
        message: ""
        data:
          image: "/api/camera_proxy/camera.front_door" #Captures image immediately, sends via notification
          ttl: 0
          priority: high
          clickAction: ""
          actions:
          - action: "URI"
            title: "Open Unifi"
            uri: 'app://com.ubnt.unifi.protect'
          - action: "URI"
            title: "Open HASS"
            uri: '/lovelace/camera_view'

@briis Do you mind explaining what exactly event_score and event_length are? I assume event_length is the length of time motion was detected, but what is event_score?

I have SO many false positives, and it seems like these might help out. It’s so annoying. Not sure if it would be worth adding an explanation to the GitHub repo; I looked around there but couldn’t find one.

The Event Score is a number Protect assigns to a Motion or Smart event. It is a number between 0 and 100, and without knowing the algorithm behind it, it seems that the higher the number, the more certain you can be that a ‘real’ motion event occurred. So if you have false positives, you could try testing the event_score attribute and, as a starting point, only act on scores over 70. Please note that the score is normally only set once the event has ended.
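
As a rough illustration, a condition template testing that attribute could look something like this (the sensor name and threshold are just examples):

condition:
  - condition: template
    value_template: >
      # treat a missing score as 0 until Protect has set it
      {{ (state_attr('binary_sensor.motion_front_door', 'event_score') | int(0)) >= 70 }}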

And event_length is the length, in seconds, of the most recent motion event.

I would love to be able to set the “custom text” of the doorbell, but I think that’s still not part of the API, or is it?

Hi Greg,
Oh yes, this is possible and has been for a while. You can use the following service:

service: unifiprotect.set_doorbell_lcd_message
data:
  entity_id: camera.doorbell
  message: Ring the bell
  duration: 15

duration is optional; if you leave it out, the message will stay.

OMG, SO COOL! Directly set actions for my notifications!

Has anyone found a way to deep link from Home Assistant to a specific camera in the Unifi Protect app?

I read that both @marcelod and @dts asked about it before, did you ever manage to solve this?

I had no success, sadly. I tried looking into the iOS app bundle, but the only URL scheme in CFBundleURLSchemes is unifi-protect://, which just launches the app. By the looks of it, Ubiquiti haven’t implemented deep linking to specific screens of the app.

What I don’t understand, though, is that if you enable push notifications for motion events, the notification banners do link straight to the motion event in the timeline view, so there must be a URI scheme that makes this work.

Maybe someone with more iOS app expertise can chime in, but I couldn’t get it to work by trial and error.

@briis, this is an amazing piece of engineering that makes Unifi even more useful, so firstly I would like to say thanks :pray:

I am new to Home Assistant and struggling to work out how to create an automation that would let me say to my Alexa “Stream kids room camera” and have it load that camera on that display. I know this is probably a noob question, but I hope someone can point me in the right direction :grin:

Thank you @anixan, appreciate it.

I am not an Alexa user, so I have no experience with that, but I do believe there is a custom integration for an Alexa Media Player that might get you some of the way. Hopefully someone else in this fantastic community has already done what you want to do and can chip in.

So I have been enjoying this massively for the two months that I have used it; I had no clue when I got all the Unifi gear that any of this would be possible. Big thanks!

Two questions that someone might have the answer to:

  1. Sometimes when I get a notification, it says the object was unidentified even though Unifi Protect states it was a person/vehicle. Any ideas on this?

  2. I would like actionable notifications (clickable text at the bottom of the notification, below the image) with the snapshot from the camera, plus options in the notification to perform different tasks (pause the automation/notification, pause the camera and so on). But when I add actions to the notification, the snapshot is cropped so I can’t see the full image. Has anyone solved this in a clever way?

Can you post your automation?

Sure! It is a bit random, but looking under Developer Tools I can see that the motion detection event contains the motion type (Person/Vehicle), yet it sometimes isn’t passed through to the notification.

- id: '1613832203445'
  alias: KAMERA - G4 Altanen
  description: Notis med bild
  trigger:
  - entity_id: binary_sensor.motion_g4_altanen
    platform: state
    to: 'on'
    from: 'off'
  condition:
  - condition: state
    entity_id: binary_sensor.g4anotification_template
    state: 'on'
  action:
  - variables:
      time_stamp: G4A-{{ now().strftime('%Y-%m-%d-%H%M') }}
  - delay:
      hours: 0
      minutes: 0
      seconds: 1
      milliseconds: 0
  - service: camera.snapshot
    data:
      entity_id: camera.g4_altanen
      filename: /media/cameras/G4-Altanen/{{ time_stamp }}.png
    entity_id: camera.g4_altanen
  - service: notify.johans_mobil
    data:
      message: '{{ state_attr("binary_sensor.motion_g4_altanen", "event_object")|replace("vehicle",
        "Bil")|replace("None Identified", "Oidentifierad rörelse")  | capitalize}}
        vid altanen kl. {{ now().strftime("%H:%M") }}. '
      data:
        image: /media/local/cameras/G4-Altanen/{{ time_stamp }}.png
        clickAction: app://com.ubnt.unifi.protect
        ttl: 0
        priority: high
  mode: single
  max: 10

I suspect that you may find that the difference in object identification is down to post-processing performed by Protect after the initial detection.

My automations are triggered after the motion_detection sensor has reverted to ‘off’, as I use the event_score and event_length attributes to filter events more granularly than Protect can currently provide. It also lets me grab the thumbnail created by Protect for the notification.
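
A stripped-down sketch of that pattern looks something like this (entity names, notify target and thresholds are examples only, and the thumbnail handling is left out):

- alias: Notify on completed motion events above a score threshold
  trigger:
    - platform: state
      entity_id: binary_sensor.motion_driveway
      to: 'off'  # act once Protect has finished the event and set the score
  condition:
    - condition: template
      value_template: >
        {{ (state_attr('binary_sensor.motion_driveway', 'event_score') | int(0)) >= 70
           and (state_attr('binary_sensor.motion_driveway', 'event_length') | float(0)) >= 2 }}
  action:
    - service: notify.mobile_app_pixel_3
      data:
        message: "Motion at the driveway"
  mode: single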

This obviously isn’t to everyone’s taste, as events are reported after they have occurred; however, my reported object detections always match those of Protect, presumably because of the delay.

Linking back to the last time I posted the automation to avoid cluttering the thread.

I should note that I don’t currently get vehicle smart detections with this camera as vehicles only tend to appear end-on to the camera (usually as they’re damaging the property :grimacing:)

Thank you for your answer. Sounds like that definitely could be the issue. I remember your post and your automation; it was just as massive as it was impressive. I’ll dig into it and try to break it down and see if I can find a few bits and pieces to use in my own.

I can recommend using variables to simplify future changes to the automation; amongst other things, they make building a notification much easier, e.g. by breaking out the detection type:

    - variables:
        motion_type: "{{ 'Motion' if is_state_attr('binary_sensor.motion_driveway', 'event_object', 'None Identified') else (state_attr('binary_sensor.motion_driveway', 'event_object') | capitalize) }}"

Thanks, @briis :slight_smile:

Can anyone else in the community here offer any advice on how to integrate this component with Alexa?