Did you try listing it in allowlist_external_dirs? I haven’t really tried sending files with a shell_command, so I don’t know whether that option affects it, but the docs say it whitelists folders you intend to send files from via integrations.
Also, not sure what your script/setup looks like but other checklist items to be aware of that create challenges with shell_commands:
shell_command can’t expand ~, so make sure you list the full path to any files you access
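A quick way to see what this means (nothing HA-specific here, just shell semantics):

```shell
# A literal '~' is never expanded for you -- this is effectively what HA
# passes through, since shell_command doesn't run in an interactive shell.
literal='~/backup.tar'
printf '%s\n' "$literal"              # still contains a literal '~'
# so always spell the path out in full instead (path is an example):
full=/config/shell_scripts/backup.tar
printf '%s\n' "$full"
```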
You can’t sequence multiple commands with pipes or other shell operators; they won’t work in a shell_command
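The usual workaround is to put the whole pipeline inside a script and point shell_command at the script, since pipes work fine inside a script’s own shell. A sketch (the file name and directory are just examples):

```shell
# write a wrapper script that does the piping internally
cat > /tmp/newest_backup.sh <<'EOF'
#!/bin/bash
# print the newest .tar in the given directory -- pipes are fine in here
ls -1 "$1"/*.tar 2>/dev/null | sort | tail -n 1
EOF
chmod 755 /tmp/newest_backup.sh

# quick check against a throwaway directory
mkdir -p /tmp/fake_backup
touch /tmp/fake_backup/a.tar /tmp/fake_backup/b.tar
/tmp/newest_backup.sh /tmp/fake_backup
```

Your shell_command then just calls `/tmp/newest_backup.sh /backup` (or wherever your script lives) as a single command with no operators in it.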
Also, specifically for shell commands that use ssh or scp: don’t store any files you need in /root/.ssh, since the home directory does not persist across updates and anything in there will be wiped out. That includes your generated key files and the known_hosts file.
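The fix is to keep copies somewhere persistent like /config and pass them to ssh with -i and -o UserKnownHostsFile. A sketch of the copy step with stand-in paths (on HA the source would really be /root/.ssh), showing why cp -p matters:

```shell
# stand-in for /root/.ssh and /config/shell_scripts/ssh_keys
mkdir -p /tmp/fake_home/.ssh /tmp/fake_config/ssh_keys
touch /tmp/fake_home/.ssh/id_rsa
chmod 600 /tmp/fake_home/.ssh/id_rsa    # private keys must stay 600 or ssh refuses them

# -p preserves the 600 mode on the persistent copy
cp -p /tmp/fake_home/.ssh/id_rsa /tmp/fake_config/ssh_keys/
ls -l /tmp/fake_config/ssh_keys/id_rsa
```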
Thank you for your answer! Yes, I got a lot of issues while trying to make it run. I found out that I had to move my keys away from my root dir as you mentioned also.
Ok so key thing to note here: you are running your script from the SSH add-on. That’s actually a completely different Docker container than the one HA runs in. Which means when HA runs your shell command, it sees a totally different filesystem than what you see.
Now obviously there’s a lot of overlap, since the same files are remapped into a lot of the containers. Clearly both of them can see /config, and the share and ssl folders are available to all add-ons that need them. But which folders each add-on maps, and where, can vary, since each one has its own setup for its Docker container.
In this case that’s where you’re getting stuck. I’m poking around in the filesystem that HA sees now and there’s no /backup folder. In fact I can’t find it anywhere so I’m not sure HA has any access to it.
I’d recommend testing your scripts while running in HA’s docker container. I actually wrote a guide on how to get a command line sensor or shell command that uses ssh in HA working here. If you scroll down that to the section for ‘Testing your command’ it walks through how to do that.
Actually after I said that I remembered there is an option here. So what you can do is actually ssh into the ssh add-on from your shell_command and then execute the script from there since that can see the backup folder. It’s really wonky I know but it works, I have a shell_command which does exactly that to backup my Nginx Proxy Manager config since it stores its data in the MariaDB addon.
In your case you would basically just modify your script to do this:
ssh -o UserKnownHostsFile=<your known hosts file> <username for ssh addon>@localhost -i <your key file> '/bin/bash ./shell_scripts/send_bkp_to_nas.sh'
You’ll have to set up authorized keys and everything to get HA talking to the add-on, but it’s pretty straightforward. And it sounds like you already set up the SSH add-on to talk to your other system.
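The authorized-keys part boils down to appending HA’s public key to the add-on’s authorized_keys and keeping the permissions tight. A sketch with stand-in paths and a placeholder key (the real one comes from `cat /root/.ssh/id_rsa.pub` in the HA container):

```shell
# placeholder -- substitute the real output of `cat /root/.ssh/id_rsa.pub`
pubkey='ssh-rsa AAAAB3NzaC1yc2E...placeholder... root@homeassistant'

# stand-in for the SSH add-on's /root/.ssh directory
mkdir -p /tmp/addon_root/.ssh
printf '%s\n' "$pubkey" >> /tmp/addon_root/.ssh/authorized_keys

# sshd's StrictModes rejects keys in group/world-writable files, so lock these down
chmod 700 /tmp/addon_root/.ssh
chmod 600 /tmp/addon_root/.ssh/authorized_keys
```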
I think your approach is the right one… but so far I have issues implementing it. I have done a few tweaks and this is where I am now… I am now getting a return code: 127 …
NOTE: I know I am answering to an old post, but it is never too late to thank for something that helped, and share-back lessons learnt that can help other people looking for the same solution as I was.
This approach worked. Many thanks!
I was trying to find a way to automate the removal of old snapshots on HASS.io without using some shady unofficial third-party add-on store. This helped me achieve exactly that.
Here’s what I did (I used Portainer, but you can shell into the container by name using the SSH & Web Terminal add-on for this):
SSH into homeassistant container or connect to it using portainer (/bin/ash);
generated an SSH key pair using ssh-keygen;
under /config created shell_scripts folder (change the name to something that makes more sense to you), created ssh_keys subfolder and copied /root/.ssh/id_rsa to this folder (mind the permissions! use cp -p to preserve the original permissions and avoid security risks!)
tried to ssh into the SSH & Web Terminal add-on using the basic command (ssh admin@a0d7b954_ssh), accepted the host into known_hosts when prompted, but skipped the login
copied /root/.ssh/known_hosts to /config/shell_scripts/ssh_keys/ (mind the permissions! use cp -p to preserve the original permissions and avoid security risks!)
Under /config/shell_scripts, created a test .sh file with 755 permissions containing a simple line to list all the backups older than 7 days (this will just list the files):
#!/bin/bash
find /backup/* -mtime +7 -exec ls -l {} \;
printed the homeassistant container’s /root/.ssh/id_rsa.pub (using cat or any other way you like) and appended its content to the SSH & Web Terminal container’s /root/.ssh/authorized_keys, so that key-based SSH login from the homeassistant container into the SSH & Web Terminal container works
Executed the following command inside the homeassistant container to test that all is working as it should:
Why is all this needed?
Because the homeassistant container does not map the /backup folder by default. The above method uses the SSH & Web Terminal add-on as a gateway to achieve the desired result, since it does have /backup mapped.
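Building on the listing script above, the actual cleanup is the same find with rm in place of ls. A sketch that uses a throwaway directory so it can be exercised safely (on HA the directory would be /backup; run the ls version first so you know exactly what will be removed):

```shell
#!/bin/bash
# BACKUP_DIR would be /backup on HA; a throwaway dir is used here for safety
BACKUP_DIR=${BACKUP_DIR:-/tmp/demo_backup}
mkdir -p "$BACKUP_DIR"
touch -d '10 days ago' "$BACKUP_DIR/old.tar"   # stand-in for an old snapshot
touch "$BACKUP_DIR/fresh.tar"                  # stand-in for a recent one

# same shape as the listing script, with rm instead of ls
find "$BACKUP_DIR"/* -mtime +7 -exec rm -f {} \;
ls "$BACKUP_DIR"
```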
Hope this helps the home assistant community!
Cheers
I’m trying to do the same thing, can anyone help me?
I know almost nothing about Linux, so I did a search and ran the following commands in PuTTY: