Blue Iris Integration Tutorial

Anyone having issues with BI lately? My core log is filled with this crap

2021-03-07 16:28:16 ERROR (stream_worker) [libav.tcp] Connection to tcp://192.xxx.xxx.xxx failed: Operation timed out

Pretty much every couple of seconds. Going by the log timestamps, I get this error roughly every 10 to 25 seconds, so technically speaking I shouldn't be getting any stream at all. However, my live feed still seems to show up fine in Lovelace. What's going on???

Yes, I second zeeshany's comment. My BI PC has also been running overtime these past weeks. It's usually at 25-35% CPU; now it pretty much stays at 80-100%. I'm thinking of killing the BI integration for now and just getting my feeds straight from my cameras via ONVIF. The delay on ONVIF is bad, but man, BI is just not playing nice with HA lately.

The original MQTT method that I posted is still rock solid. No need to worry about unofficial integrations either. I wrote it a long while ago and still haven't had any issues.

This is not really an integration in the sense of what a Home Assistant Integration is, is it? It's just a method to get a binary sensor from Blue Iris via MQTT, and a camera feed using the MJPEG camera option pointed at the Blue Iris streams.
Yes, I can see it all works solidly. A bit of a misleading title. That said, it's handy if you don't have outdoor motion sensors, or if you are using an AI to limit Blue Iris to only detecting certain objects; for example, using the AI Tool tutorial by GentlePumpkin to limit it to person detection only.
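
For anyone trying to picture the two pieces, here is a minimal sketch of what that looks like in configuration.yaml. The host, port, camera short name ("drv") and secrets are placeholders, and the /mjpg and /image URL paths are my understanding of the Blue Iris web server endpoints, so double-check them against your own install.

camera:
  - platform: mjpeg
    name: Driveway
    # MJPEG stream served by the Blue Iris web server for camera short name "drv"
    mjpeg_url: "http://192.168.1.10:81/mjpg/drv/video.mjpg"
    # single JPEG frame, used for thumbnails
    still_image_url: "http://192.168.1.10:81/image/drv"
    username: !secret bi_user
    password: !secret bi_pass

binary_sensor:
  - platform: mqtt
    name: Driveway Motion
    # topic published by the Blue Iris alert MQTT action (same naming scheme used later in this thread)
    state_topic: blue_iris/binary_sensor/drv_motion/state
    payload_on: "ON"
    payload_off: "OFF"
    device_class: motion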

I will still scour the web to see if anyone may have started a genuine Integration for Blue Iris, but setting up MQTT Binary Sensors is a good start. I prefer ONVIF to grab events though and that is an official integration :slight_smile:

If you find that having your cameras in Home Assistant pegs the CPU on your BI machine, make sure you set up SUBSTREAMS and get your JPEG images from the substreams, only grabbing the main stream when you go to a single camera view. If you grab the main hi-res feed for the JPEGs and HA is always pulling your hi-res (main) streams, your BI machine will get hammered.
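
If most of what HA needs are dashboard thumbnails, one option is a low-res camera entity for overview pages. This is only a sketch: the host, port and short name are placeholders again, and the q/s (quality/scale) parameters on the /image endpoint are an assumption on my part, so verify them against the BI web server documentation.

camera:
  - platform: generic
    name: Driveway (low res)
    # scaled-down JPEG so the dashboard is not constantly pulling the full-res main stream
    still_image_url: "http://192.168.1.10:81/image/drv?q=50&s=50"
    username: !secret bi_user
    password: !secret bi_pass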

You mean like this?

I saw that after I posted :). Missed it earlier.
It works quite well. Would be nice to see it made into an official integration.

Had an issue with Blue Iris not sending the MQTT command when a camera signal was restored in BI 5. It's a bug in Blue Iris and Ken (BI support) is going to work on it.

Edit 26/03/2021: as of the latest BI 5 update (5.3.9.16), the MQTT issue for camera signal restored is fixed.

Guys, I need to lower the stream frame rate in HA, because right now every time I open the camera in the HA app it streams and my mobile data usage goes up.

How can I do that, and what should I change? I still want to see the camera at the full frame rate via the web browser; the change should only apply to the HA Lovelace view.

Frame rate will be what is streamed out. You’d have to change that in Blue Iris or on your camera.
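
If the goal is mainly to keep the phone from pulling a live stream, the card settings also matter: picture cards only open a live stream when camera_view is set to live, and otherwise show periodically refreshed stills. A minimal sketch of the card, where camera.drv is just a placeholder entity name:

type: picture-entity
entity: camera.drv
# "auto" (the default) shows refreshed still images; "live" streams continuously
camera_view: auto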

Hi All,

Without any idea of what might have changed, all of a sudden I can't make a connection to Blue Iris. I'm able to connect to the web server via a browser, but if I try to do the same in the Home Assistant integrations configuration I keep getting an "Invalid server details" error.

details
host: 192.168.0.1
port: 81
use ssl: false

Tried uninstalling and reinstalling, didn’t help

I am not able to get audio to work. Is there a limitation in Home Assistant?

Yes, no audio in HASS.

Anyone here having issues with their MQTT sensors? None of my motion sensors are updating. I tried manually adding the sensors in Hass.io and also using the BI integration from the Hass.io third-party store. Neither works. My BI version is 5.4.3.8.

I don't think it's on the HA MQTT server end, because I can see BI successfully signing in in my MQTT server log.

I need some assistance. I have no idea what I’m supposed to put into the Web Request or MQTT. If there are pictures in the original post to assist with that, I’m not seeing them. I haven’t used MQTT before, so I’m not sure what I’m supposed to put in there.

Alright, I am very confused. I have added the camera code to my configuration.yaml but have no idea where to put the Lovelace card code… I tried creating a 'Manual' card and copy-pasting the code in, but that didn't work.

Also, why is this using MJPEG over FFmpeg or RTSP?

Thanks

How can I get the last clip recorded via motion detection into HA? I then need to play that last recorded clip via a button in HA.

Are the original images at the top of the thread gone? I was hoping to see the mqtt configuration in BI.

Weird, I haven’t edited this post in quite a while. No idea why they disappeared. Let me take some new ones and put them up.

I started using DeepStack. How can I go about sending via MQTT what triggered the alert, i.e. person, car, dog…?

I guess I can do several alerts in Blue Iris, one for each item:

blue_iris/binary_sensor/drv_motion/state/person
blue_iris/binary_sensor/drv_motion/state/car
blue_iris/binary_sensor/drv_motion/state/dog

Ok got it to work…

How did you?

In Hass.io you need to create an MQTT "on alert" entry for each item you want to track: person, car, cat…

Then in Node-RED you need to store the snapshot in Hass.io:
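
(The flow screenshot that used to be here seems to be gone. As a rough HA-side sketch only: assuming the Node-RED flow writes each snapshot to a fixed file under /config/www/snapshots/, a local_file camera per object class gives the camera.drv_*_image entities used in the Lovelace card further down. The file paths are hypothetical.)

camera:
  - platform: local_file
    name: drv_person_image
    # overwritten by the Node-RED flow on every person detection (hypothetical path)
    file_path: /config/www/snapshots/drv_person.jpg
  - platform: local_file
    name: drv_car_image
    file_path: /config/www/snapshots/drv_car.jpg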

In Hass.io you’ll create an entity for each of these:

binary_sensor:
  - platform: mqtt
    name: "Kun Person"
    state_topic: blue_iris/binary_sensor/kun_motion/state/person
    payload_on: "ON"
    payload_off: "OFF"
    device_class: motion

and these automations (you need to reset the entity state to OFF after trigger):

automation:
  - alias: Reset Binary Sensor Person KUN
    initial_state: true
    trigger:
      - platform: state
        entity_id: binary_sensor.kun_person
        to: "on"
    action:
      - delay: "00:01:00"  # one minute; quote the time so YAML doesn't read 1:00 as a bare number
      - service: mqtt.publish
        data_template:
          topic: "blue_iris/binary_sensor/kun_motion/state/person"
          payload_template: "OFF"
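
As a side note, instead of a separate reset automation you can let the sensor reset itself: the MQTT binary sensor platform has an off_delay option that flips the sensor back to off a fixed number of seconds after the last ON payload. A sketch using the same topic:

  - platform: mqtt
    name: "Kun Person"
    state_topic: blue_iris/binary_sensor/kun_motion/state/person
    payload_on: "ON"
    device_class: motion
    # automatically return to "off" 60 seconds after the last ON message
    off_delay: 60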

An example to make Alexa say there is a person when someone is detected:

Automation:

- alias: Kun Person Detection
  initial_state: true
  trigger:
    - platform: state
      entity_id: binary_sensor.kun_person
      from: "off"
      to: "on"
  action:
    - service: homeassistant.turn_on
      data:
        entity_id: script.kun_tts_person

Script:

kun_tts_person:
  alias: Kun tts Person
  sequence:
    - service: media_player.volume_set
      data:
        entity_id: media_player.echo_living_room_2
        volume_level: "0.8"
    - service: notify.alexa_media
      data:
        target: media_player.echo_living_room_2
        message: A person was detected at the door
        data:
          type: tts

Lovelace view:

cards:
  - type: picture-glance
    entities:
      - entity: camera.drv_person_image
        icon: mdi:walk
      - entity: camera.drv_cat_image
        icon: mdi:cat
      - entity: camera.drv_dog_image
        icon: mdi:dog
      - entity: camera.drv_car_image
        icon: mdi:car
    camera_image: camera.drv

This will show a live feed of the camera along with icons where you can see the latest trigger image for each object class (car, person, dog, etc.).
