I am not a big fan of looping scripts that stay in memory all the time and, if they hang for some reason, can't recover by themselves. In its early stages my script was also a looping one, but then I found it is much more stable and controllable if I just run it from a cronjob when I need it.
I also tried to use as few dependencies as possible - most of my checks are done with bash commands and the only additional Python module I use is paho.mqtt.client.
This is just a personal opinion based on my experience - maybe there are advantages to using a looping script instead of cron, or Python modules instead of bash commands. I'll be happy if somebody with more experience and a deeper Python understanding can comment on this.
In my (limited) experience I noticed that multiple accesses to the MQTT server were taking more time and resources than just waiting for the next publish time…
Me too
I was also thinking about this and plan to test it - my MQTT server, which is hosted on an RPi Zero, is getting around 40-45 messages per minute and is getting overloaded. I plan to add an option to send all data as one grouped message and see if this reduces the load on the MQTT server.
I have updated the script to send one JSON message but can't read the separate values. The payload example is in the post description, and I believe it is what causes the error below - though my templates are different, and I don't have a sensor with a template matching the one in the error:
Log Details (ERROR)
Logger: homeassistant.helpers.template
Source: helpers/template.py:284
First occurred: 4:33:43 PM (78 occurrences)
Last logged: 4:53:21 PM
Error parsing value: 'value_json' is undefined (value: None, template: {{ value_json.status == 'enabled' }})
If someone could help with this or suggest a payload format that is easier to parse in Hass,
thanks
UPDATE: I made it with CSV - will publish it shortly - this way the message is even shorter.
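Roughly, the grouped publish will look something like this (just a sketch - the field order, topic and broker address here are placeholders, the real ones are in the script):

import paho.mqtt.publish as publish

# sample values standing in for the results of the bash checks
cpu_load, cpu_temp, used_space = 1.25, 43.0, 25
voltage, sys_clock_speed, memory, swap = 0.85, 1500, False, False

# one short CSV payload instead of seven separate publishes
payload = ",".join(str(v) for v in (cpu_load, cpu_temp, used_space, voltage, sys_clock_speed, memory, swap))

# placeholder topic and broker address
publish.single("masoko/rpi4", payload, hostname="192.168.0.10")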
If you publish the payload like this:
{"used_space": 25, "sys_clock_speed": 1500, "cpu_temp": 43.0, "voltage": 0.8500, "cpu_load": 1.25, "memory": "False", "swap": "False"}
then an MQTT Sensor configured like this:
- platform: mqtt
  name: "pi monitor"
  state_topic: "test"
  value_template: "{{ value_json.cpu_load }}"
  json_attributes_topic: "test"
  json_attributes_template: >
    { "used_space": {{value_json.used_space}},
      "sys_clock_speed": {{value_json.sys_clock_speed}},
      "cpu_temp": {{value_json.cpu_temp}},
      "voltage": {{value_json.voltage}},
      "memory": "{{value_json.memory}}",
      "swap": "{{value_json.swap}}" }
will result in this:
Thanks @123, I already made it with a CSV message - this way the message looks less readable but is smaller, and the main goal for me was to reduce the load on the MQTT broker, so I'm happy with the result.
I’ll refer to your comment next time I have problems with parsing JSON in Hass.
To be honest I am still not sure what was wrong with my setup, as my template looks like yours - only using single brackets instead of double. Can this be the issue?
The new version of my script is published on GitHub for anybody interested. I still don't know if this will reduce the load on my MQTT broker. I also plan to put a random delay before sending the message so I can de-synchronize the messages - right now all 8 of my Raspberries hit the MQTT broker at the exact same second, which is not great for it; see the sketch below.
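A minimal sketch of the delay I have in mind (the 30-second window is just a guess I'll tune later):

import random
import time

# sleep a random 0-30 seconds so all 8 Pis don't publish in the same second
time.sleep(random.uniform(0, 30))
# ...then run the checks and publish as usual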
What kind of sensor are you using in Home Assistant to read and extract data from the CSV?
The Mosquitto MQTT broker is efficient. Check the CPU load of the machine hosting Mosquitto to see if the "loading" is as excessive as you think.
I am using mqtt sensors like this:
- platform: mqtt
  name: 'rpi4 cpu load'
  state_topic: 'masoko/rpi4'
  value_template: '{{ value.split(",")[0] }}'
  unit_of_measurement: "%"
I have checked the load on the machine hosting the MQTT broker and it reaches 100% when all 8 RPis start sending 7 messages each at the same moment - that is why I grouped the messages into one CSV message.
My MQTT broker is an RPi Zero W - it's not a very powerful board, and those 56 messages sent at once were giving it a hard time, on top of the other messages from temperature, humidity, and motion sensors.
My biggest problem was that the automations triggered by motion were delayed. The screenshot above is not with all 56 messages - I have disabled some that I don't need. Maybe my MQTT broker is not configured or optimized properly - I am using a default installation, but it works.
The JSON payload's few extra bytes versus CSV aren't the issue. The RPi Zero simply isn't the optimal platform for the level of 'burst traffic' you're generating.
Plus, the CSV format eliminates the possibility of using Home Assistant’s native ability to parse JSON. Your original payload wasn’t valid JSON because it employed single-quotes.
Invalid (single-quotes):
{'used_space': 25, 'sys_clock_speed': 1500, 'cpu_temp': 43.0, 'voltage': 0.8500, 'cpu_load': 1.25, 'memory': 'False', 'swap': 'False'}
Valid (double-quotes):
{"used_space": 25, "sys_clock_speed": 1500, "cpu_temp": 43.0, "voltage": 0.8500, "cpu_load": 1.25, "memory": "False", "swap": "False"}
Thank you for quick response and explanation.
I composed it with the Python json module and validated it with http://jsonviewer.stack.hu/ - all seemed OK and I had no doubts about the JSON formatting.
I know the RPi Zero is not the best hardware, but I think I'll still make it work by combining the messages into one and de-synchronizing the traffic so it's not a burst.
I'll go with the CSV for now, as I have already reconfigured all my sensors in Hass. I have to try the auto-discovery method you described above - it looks faster than updating my sensors.yaml manually.
Thanks again.
Home Assistant did.
Change the single-quotes to double-quotes and you’ll see Home Assistant has no trouble parsing the payload.
NOTE:
If I use this online validator (https://jsonformatter.curiousconcept.com/) it even auto-corrects the supplied JSON string by replacing the single-quotes with double-quotes:
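For what it's worth, one common way the single-quotes sneak in is publishing str() of a Python dict instead of json.dumps() - just a guess, I don't know if that's what your script did:

import json

data = {"cpu_load": 1.25, "cpu_temp": 43.0}

print(str(data))         # {'cpu_load': 1.25, 'cpu_temp': 43.0}  -> single-quotes, not valid JSON
print(json.dumps(data))  # {"cpu_load": 1.25, "cpu_temp": 43.0}  -> double-quotes, valid JSON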
Home Assistant knows more than me about JSON formatting for sure - and other things.
I'll bookmark the JSON formatter and use it in the future.
If people complain about the CSV message I'll switch back to JSON, as it really looks better and is more readable.
Thank you again.
I use this script on my Ubuntu system and I can confirm it is working (except for temp, volt and speed of course, but I don’t care).
Thanks!
temp, volt and speed are RPi-specific. I now have Ubuntu on my laptop, and when I have time I'll see if I can make these work on Ubuntu.
I found some code on this page which is working for temperature.
import subprocess

temp = subprocess.check_output("cat /sys/class/thermal/thermal_zone3/temp", shell=True)
temp = int(temp) / 1000  # the file reports millidegrees Celsius
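Reading the file directly also works and avoids the shell call (assuming the same thermal_zone3 path exists on your machine):

# same value without spawning a shell; the thermal_zone path is machine-specific
with open("/sys/class/thermal/thermal_zone3/temp") as f:
    temp = int(f.read()) / 1000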
Is it possible to run this on a Pi running HassOS, since that system is a locked-down version of Linux, or is a full Linux OS required? I ask because that is my assumption with the RPi-Reporter-MQTT2HA-Daemon add-on in HACS. I haven't explored this part of HA much yet, but I understand that you can't really install anything on HassOS yourself.
I'm using the script, but when my Raspberry Pi tries to connect to the Home Assistant MQTT server it generates a socket error. Could someone help me?
1600865687: New connection from 192.168.0.30 on port 1883.
1600865687: New client connected from 192.168.0.30 as auto-DD906B4A-285B-7E2D-426F-03AF1CA76C09 (p2, c1, k60, u’diegolz83’).
1600865687: Socket error on client auto-DD906B4A-285B-7E2D-426F-03AF1CA76C09, disconnecting.
I have the same error as diegolz83…
I see all 5 values in mqtt explorer, but I don’t see anything in my HA. No entities.
For those getting errors on the broker end and messages not reaching Hass: it's due to the broker configuration. Yesterday I reinstalled my MQTT broker and had the same issue - it turned out to be the broker configuration.
Here is a configuration that works fine:
user mosquitto
max_queued_messages 200
message_size_limit 0
allow_zero_length_clientid true
allow_duplicate_messages false
listener 1883
autosave_interval 900
autosave_on_changes false
persistence true
persistence_file mosquitto.db
allow_anonymous false
password_file /etc/mosquitto/passwordfile
Do you think it's feasible to run this in Docker?