I am new to HA. I have installed Hass.io in a VM on Windows Server and everything is running. I have SSH access to the Home Assistant shell. I'm a little confused by the distinction between Hass.io, the Supervisor, and HA itself, and this is the first time I've seen YAML.
I would like to bring some air quality information into HA as a sensor. There is a local station that publishes raw pollutant data, and I have written a short Python script to scrape it and calculate some indices (each is a piecewise linear function of the concentration, with the overall index taken as a maximum over some parameters). I can output the result to the shell, a text file, or whatever is best. One thing to note: each scrape yields multiple values (levels of different pollutants), and I would like to feed them into different sensors (or attributes?) in HA.
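For context, here's a trimmed sketch of what the script does. The breakpoints and field names below are illustrative placeholders, not the real station's values, and the scraping part is stubbed out with hard-coded readings so the sketch stands alone:

```python
import json

# Illustrative breakpoints only, NOT a real scale: each entry maps a
# concentration range [c_lo, c_hi] onto an index range [i_lo, i_hi].
PM25_BREAKPOINTS = [
    (0.0, 12.0, 0, 50),
    (12.1, 35.4, 51, 100),
    (35.5, 55.4, 101, 150),
]

def sub_index(conc, breakpoints):
    """Piecewise-linear interpolation of one pollutant's sub-index."""
    for c_lo, c_hi, i_lo, i_hi in breakpoints:
        if c_lo <= conc <= c_hi:
            return round(i_lo + (i_hi - i_lo) * (conc - c_lo) / (c_hi - c_lo))
    raise ValueError(f"concentration {conc} outside breakpoint table")

# In the real script these concentrations come from scraping the
# station's page; hard-coded here so the sketch runs stand-alone.
readings = {"pm25": 18.0}
indices = {"pm25": sub_index(readings["pm25"], PM25_BREAKPOINTS)}
# Other pollutants are handled the same way with their own tables;
# the overall index is the maximum of the per-pollutant sub-indices.
indices["aqi"] = max(indices.values())
print(json.dumps(indices))
```

Printing a single JSON object like this seemed like the easiest way to carry all the pollutant values in one output.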
I would like to understand the idiomatic ("hassionic"?) way of bringing this in as a sensor. I can see a few scenarios:
Scrape directly in HA through the scrape sensor and replicate the mathematical operations in HA. I'm not sure whether that is possible or appropriate in YAML templates. Is there some better method?
Use the command_line sensor to run the Python script and read its output. I don't see how to get multiple sensors out of this without running the scrape several times and filtering the output differently for each sensor, which might get annoying for the data server.
Run the Python script as a cron job every 15 minutes, outputting to a text file, and use the file as a data source for the sensor. Where would I set up cron? It doesn't seem to be available in the HA shell. Maybe it needs to be set up at a lower level in the VM? How would I access that?
Then there are some things in HACS called AppDaemon and NetDaemon: maybe one of these is a better way of achieving what I want? I haven't investigated that part of HA yet.
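For the command_line scenario, I'm imagining something like the following, where one run of the script feeds several values via JSON attributes, and template sensors split them out. The paths and entity names are just my guesses, and I'm not sure this is the right pattern:

```yaml
sensor:
  - platform: command_line
    name: air_quality_raw
    command: "python3 /config/scripts/air_quality.py"  # path is a guess
    scan_interval: 900  # every 15 minutes, in place of cron?
    value_template: "{{ value_json.aqi }}"
    json_attributes:
      - pm25
      - pm10
  - platform: template
    sensors:
      pm25_index:
        friendly_name: "PM2.5 index"
        value_template: "{{ state_attr('sensor.air_quality_raw', 'pm25') }}"
```

If scan_interval works the way I think it does, this would also make the cron option unnecessary. Is this close?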
But I may be barking up the wrong tree completely. Advice much appreciated!