I have used the excellent @ReneTode examples to make my first Python app. It handled just six pieces of data, so it was no problem to pass them through as attributes in the self.set_state() method.
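For those six values, a call along these lines was all it took (a minimal sketch with made-up sensor and attribute names, not the actual app):

import appdaemon.plugins.hass.hassapi as hass

class SixValuesExample(hass.Hass):
    def initialize(self):
        # six made-up values, passed straight through as attributes
        self.set_state("sensor.office_climate",
                       state="ok",
                       attributes={"temperature": 21.5,
                                   "humidity": 40,
                                   "pressure": 1013,
                                   "co2": 450,
                                   "battery": 87,
                                   "last_seen": "12:00"})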
Now I have made an HTML parser for the Hard Disk Sentinel XML report, which has about 50 parameters. I have formatted them nicely with json.dumps(). The list is of variable length, give or take six entries for each additional hard disk, and I would like to keep them connected and organized during transport.
Is there a way to send them as JSON to HA and group them visually in Lovelace, or is sending each one individually as an attribute really the only way? I wouldn't want 50 sensors for all that data either, but it would be nice to have a sensor with attributes for each disk and another with attributes for the system data.
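Something along these lines is what I have in mind, with one sensor for the system data and one per disk (just a sketch with made-up values and entity names; the real data would come from the scraper below):

import appdaemon.plugins.hass.hassapi as hass

class HDSGroupingSketch(hass.Hass):
    def initialize(self):
        # made-up values, only to show the kind of grouping I am after
        system = {"Computer name": "DESKTOP-01",
                  "Uptime": "43 days",
                  "Memory": "16310 MB"}
        disks = {"Disk 0": {"Model ID": "WDC WD40EFRX",
                            "Health": "100 %",
                            "Current temperature": "34 C"},
                 "Disk 1": {"Model ID": "ST2000DM001",
                            "Health": "89 %",
                            "Current temperature": "41 C"}}

        # one sensor carrying the system data as attributes
        self.set_state("sensor.hds_192_168_0_3_system",
                       state="ok", attributes=system)

        # and one sensor per disk, each with its own attributes
        for name, values in disks.items():
            entity = "sensor.hds_192_168_0_3_" + name.lower().replace(" ", "_")
            self.set_state(entity, state=values["Health"], attributes=values)

Using Health as the state is just an example; the point is that each disk keeps its values together.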
HD Sentinel is an excellent little tool for Windows that predicts the failure of spinning drives, and I use it on all of my machines. It publishes a report on its web server, and I have made a scraper that collects the vital data from that report. The idea is to run one instance of the app for each computer on the network.
Currently, it looks like this:
import appdaemon.plugins.hass.hassapi as hass
import requests
import re
import json
from datetime import datetime, timedelta
from bs4 import BeautifulSoup
class HDS_192_168_0_3(hass.Hass):

    def initialize(self):
        # run the first poll immediately; get_values reschedules itself afterwards
        self.get_values({})

    def get_values(self, kwargs):
        self.url = 'http://192.168.0.3:61220/xml'
        self.sensorname = "sensor.hds_192_168_0_3"
        self.friendly_name = "HD Sentinel IP: 192.168.0.3"

        try:
            response = requests.get(self.url, timeout=10)
        except requests.exceptions.RequestException:
            self.log("Service is not available")
            return

        soup = BeautifulSoup(response.text, 'html.parser')

        # application and computer information
        sentinel_ver = soup.hard_disk_sentinel.application_information.installed_version.text
        comp_info = soup.hard_disk_sentinel.computer_information
        comp_name = comp_info.computer_name.text
        uptime = comp_info.system_uptime.text
        up_since = comp_info.system_up_since.text

        # system information
        sys_info = soup.hard_disk_sentinel.system_information
        win_version = sys_info.windows_version.text
        memory = sys_info.physical_memory_size.text
        graphics = sys_info.display_adapter.text

        # one nested dict per physical disk
        disks = soup.hard_disk_sentinel
        diskinfo = {}
        diskcount = 0
        for disk in disks.find_all(re.compile("^physical_disk_information_disk..")):
            diskinfo["Disk " + str(diskcount)] = {
                "Model ID": disk.hard_disk_model_id.text,
                "Total size": disk.total_size.text,
                "Logical drives": disk.logical_drive_s.text,
                "Current temperature": disk.current_temperature.text,
                "Power-on time": disk.power_on_time.text,
                "Health": disk.health.text,
                "Performance": disk.performance.text}
            diskcount += 1

        full_info = {"Sentinel version": sentinel_ver,
                     "Computer name": comp_name,
                     "Uptime": uptime,
                     "Up since": up_since,
                     "Windows version": win_version,
                     "Memory": memory,
                     "Graphics": graphics,
                     "Disks": diskinfo}

        # pretty-printed version, currently only used for debugging
        jsonout = json.dumps(full_info, indent=2, default=str)

        # publish everything as a single sensor; the state is the update time,
        # the whole report goes into one nested attribute
        timenow = datetime.now()
        self.set_state(self.sensorname,
                       state=timenow.strftime('%d.%m.%Y %H:%M:%S'),
                       attributes={"HD Sentinel Report": full_info})

        # reschedule the next poll in one minute
        update_time = timenow + timedelta(minutes=1)
        self.run_at(self.get_values, update_time)
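For reference, reading the nested attribute back inside AppDaemon should be possible with get_state and its attribute argument, so I can check that the structure survives the round trip (a minimal sketch of an extra method for the same class, not wired into the app yet):

    def check_report(self, kwargs):
        # sketch: read the nested report back from the sensor's attributes
        report = self.get_state(self.sensorname, attribute="HD Sentinel Report")
        if report:
            self.log("Disk 0 health: {}".format(report["Disks"]["Disk 0"]["Health"]))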