Run a Python script unrelated to HA with HA

I’ve made a python script that scrapes some data and exports an xml file. It’s not related to HA at all, but I was thinking of using my server running HassOS to run it anyway. The generated xml file is to be used by another add-on running on that same server. What’s the way to go with this? I guess I need to run it like every 15 mins to update, and the output xml file needs to be easy to access by the other addon using it.

Your script may require some refactoring before it fits, but it’s a good investment

I don’t know much about the HassOS build, but if you can get to the core OS, then crontab?

crontab -e

Add line like:

*/15 * * * * /path/to/script

Will run it every fifteen minutes.

You can call any script with the shell_command integration and a Time pattern trigger in an automation.
This depends on the python modules available in the HA container.
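A minimal sketch of that combination (the script path, command name and interval are placeholders):

```yaml
# configuration.yaml — script path and command name are placeholders
shell_command:
  run_scraper: "python3 /config/scripts/scraper.py"

# automations.yaml
- alias: "Run scraper every 15 minutes"
  trigger:
    - platform: time_pattern
      minutes: "/15"
  action:
    - service: shell_command.run_scraper
```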

Sounds a bit like shooting a bird with a cannon, but maybe not? I’ve seen it and Pyscript mentioned; they both seem suited, but a bit big and heavy.

EDIT: it also takes some workarounds/adaptations, but I guess it’s not that much…

This is probably a version of the way to go I guess, but I think the ideal way is to do this in a custom Add-on. I’ve spent far more time than I’m willing to admit trying to get a simple Add-on to work, but I’m not able to locate the destination for my addon contents, nor able to map a workfolder to use something like /share/myaddon. Just logging in as root and running the commands you suggest seems a bit hacky, and risks messing something up in the OS.

Yeah, that sounds like a nice way of doing it, but I have a feeling that way won’t allow me to import xml.etree.cElementTree, requests and/or BeautifulSoup

EDIT: Holy shit that worked. Only question is how to specify output folder for the file generated. It’s currently saved in the config folder, not the same folder as the script…

EDIT2: It was as easy as specifying it in the script: instead of tree.write('output.xml', encoding='utf-8', xml_declaration=True), I added /share/ in front of the filename, like so: tree.write('/share/output.xml', encoding='utf-8', xml_declaration=True)
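For reference, a minimal self-contained version of that pattern (the element names are illustrative, and /tmp is used here only so the snippet runs anywhere; on the HA host the path would be /share/output.xml):

```python
import xml.etree.ElementTree as ET

# Build a trivial document standing in for the scraped data
root = ET.Element("export")
ET.SubElement(root, "item", name="example").text = "value"
tree = ET.ElementTree(root)

# On the HA host this would be "/share/output.xml" so other
# add-ons can read it; /tmp is a neutral stand-in for this sketch
output_path = "/tmp/output.xml"
tree.write(output_path, encoding="utf-8", xml_declaration=True)
```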


The only thing to keep in mind with this is your script is actually running in the HA container. That means you are only able to use the version of python HA is using, only able to import the packages HA depends on and must use the versions of those packages HA depends on. Granted that is a really long list so that may not be an issue but something to keep in mind.

If you do need more packages than HA provides, or different versions of things, then you can ssh from a shell command to somewhere else which has the dependencies you need and run the script there. The SSH addon itself usually works pretty well for this since it lets you specify additional packages to install as part of its config. You can also specify a startup script to add more if required.
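A sketch of that, assuming key-based authentication is already set up (the host, user, key and script paths are all placeholders):

```yaml
# configuration.yaml — host, key and script paths are placeholders
shell_command:
  run_scraper_remote: >-
    ssh -i /config/.ssh/id_rsa -o StrictHostKeyChecking=no
    user@otherhost 'python3 /home/user/scraper.py'
```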

Or just make an addon. Then you can bake the dependencies into the image and have the addon just run the script when started. You can start addons from automations with hassio.addon_start
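For example (the add-on slug is a placeholder; the real slug appears in the add-on’s URL in the UI):

```yaml
# automations.yaml — "local_my_scraper" is a placeholder slug
- alias: "Run scraper add-on every 15 minutes"
  trigger:
    - platform: time_pattern
      minutes: "/15"
  action:
    - service: hassio.addon_start
      data:
        addon: local_my_scraper
```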

Yes, as I said myself earlier in this thread, I think that an Add-on is the ideal way to do it. But as I also mentioned, I can’t seem to figure out how to make an Add-on as I want to. I’ve looked through the documentation several times and followed along. I can create the add-on, but I have no idea what content ends up where, nor am I able to dictate a location such as /share/myaddon/ or whatever. I’ve looked at other Add-ons at GitHub, but they all seem to be doing things differently and more complex than the tutorials for Hello World. I guess my lack of knowledge with bash and containers in general is holding me back.

I mean tbh if all your addon wants to do is run a python script and stop you could pretty much just copy the example addon. You’re basically just changing this line to something like this:

exec /usr/bin/python3 /location/of/python/

Also, the example addon keeps restarting by design, to show how a service works, since most addons are services. If you want it to simply run a script and stop after the script completes, then replace the finish script of that addon with the one from the Let’s Encrypt addon.

Another tip - change the build.yaml to use the python base image instead of the normal addon base image, since you need python. One less dependency to add manually (plus it includes pip for you, so you can use pip install in your Dockerfile).
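The Dockerfile side typically follows the standard add-on pattern, where BUILD_FROM is injected from build.yaml; the package names here are just the ones mentioned earlier in the thread:

```dockerfile
ARG BUILD_FROM
FROM $BUILD_FROM

# pip ships with the python base image
RUN pip install --no-cache-dir requests beautifulsoup4

# Copy the run/finish scripts and your python script into the image
COPY rootfs /
```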

Yea I mean just copy the example addon. Its dumb script is here. Replace that with yours, or put it somewhere else within the rootfs folder so it gets copied in. Then reference it like I showed above.

Don’t actually ask for a file in /share btw. I mean if it’s for personal use only then do whatever, but that’s bad practice. The whole point of this exercise is to bake an image that “just works”. You can’t do that if the script isn’t part of the image. That script could try to import whatever, and it won’t work if the dependencies aren’t included in the image, or the wrong python version is in use, or whatever.

That’s basically just a different variant of the problems I said above where you run the script in the HA container. You’re running a script in a box that wasn’t built for it, it might work, it might not.

Thanks, I will definitely look into that. I’ve never even considered it, but I guess rootfs is short for “root file system” for the container? From just reading your post, the first thing that comes to mind is:

  • How to configure the output file location. Either host it on a port within the container (I know that Python has a web server), or export it to e.g. /share/myexport. It would be bad practice to hardcode a location into the python file(s).

  • How to make it run on a schedule.
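On the first point, one common approach: add-ons receive their user configuration as JSON in /data/options.json, so the script can read the output location from there instead of hardcoding it. A minimal sketch (the output_path option name and the /share default are assumptions, not something from the thread):

```python
import json

def get_output_path(options_file="/data/options.json",
                    default="/share/output.xml"):
    """Read the configured output path from the add-on options.

    The "output_path" option name is hypothetical; it would need to
    be declared in the add-on's config schema so users can set it.
    """
    try:
        with open(options_file) as f:
            return json.load(f).get("output_path", default)
    except FileNotFoundError:
        # Running outside an add-on (e.g. local testing)
        return default
```

The same file holds whatever other options the add-on declares, so an update interval could be read the same way.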

This is gold, thanks again, as I have several other ideas for Add-ons that I’ve given up on earlier :slight_smile:

Yea. It’s not a special name in docker or anything though, it’s just copied into / in the image here:

So you can call it whatever you like or use a different technique if you prefer. The point is just to copy your script into the image when it is being built so it can be referenced from the run script or used as CMD.