Shell command backup not found


This is my first post!

I wrote a script that backs up my snapshots from my Synology via scp. Run manually from the terminal, the script works fine.

However, when it runs from an automation, the script seems unable to access the /backup directory. How can I make this directory visible to my scripts?

Thank you!!

Note: I installed Home Assistant on a Pi 4 using the HassOS image.

Did you try listing it in allowlist_external_dirs? I haven’t really tried to send files with a shell_command, so I don’t know whether that option affects it, but the docs say it whitelists folders you intend to send files from via integrations.
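For reference, such an entry goes under the homeassistant: block in configuration.yaml. A minimal sketch (the path shown is just an example, not from the original post):

```yaml
homeassistant:
  allowlist_external_dirs:
    - /config/shell_scripts   # example path; must be a directory HA can actually see
```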

Also, I’m not sure what your script/setup looks like, but here are other checklist items that commonly cause trouble with shell_commands:

  1. `~` is not expanded, so make sure you list the full path to any files you access
  2. You cannot chain multiple commands with pipes or other shell operators; that won’t work in a shell command
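As a sketch of point 2, the usual workaround is to move the pipeline into a script and have shell_command call only the script (the entry names and script path here are made up for illustration):

```yaml
shell_command:
  # This would NOT behave as expected: the pipe is passed literally,
  # not interpreted by a shell:
  # broken_example: ls -1t /backup | head -n 5
  #
  # Instead, put the pipeline inside a script and call that:
  bkp_rotate: /bin/bash /config/shell_scripts/rotate.sh
```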

Also, specifically for shell commands that use ssh or scp: don’t store any files you need in /root/.ssh, since the home directory does not persist across updates and anything in there will be wiped out. That includes your generated key files and the known_hosts file.
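As a small sketch of why cp -p matters when relocating keys (the paths below are a temporary stand-in, not the real HA layout): ssh refuses private keys that are group- or world-readable, and cp -p preserves the strict mode when you copy them somewhere persistent.

```shell
#!/bin/sh
# Stand-in directories; on a real system the source would be /root/.ssh
# and the destination somewhere persistent such as /config.
tmp=$(mktemp -d)
mkdir -p "$tmp/root-ssh" "$tmp/config/ssh_keys"

touch "$tmp/root-ssh/id_rsa"
chmod 600 "$tmp/root-ssh/id_rsa"          # private keys must be mode 600

# -p preserves mode, ownership, and timestamps
cp -p "$tmp/root-ssh/id_rsa" "$tmp/config/ssh_keys/id_rsa"
stat -c '%a' "$tmp/config/ssh_keys/id_rsa"   # prints 600: mode preserved
```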

Thank you for your answer! Yes, I ran into a lot of issues while trying to make it run. I also found out that I had to move my keys out of my root directory, as you mentioned.

My shell command looks like this:

  bkp_snap: /bin/bash ./shell_scripts/

Now I tried what you mentioned with the allowlist_external_dirs parameter. The issue is that when I add this to my configuration.yaml:

    - /backup

when I validate my config, I get this error:

  Not a directory @ data['allowlist_external_dirs'][0]

Also from the terminal the /backup directory is perfectly accessible… but not from the script…

OK, so the key thing to note here: you are running your script from the SSH add-on. That’s actually a completely different Docker container than the one HA runs in, which means that when HA runs your shell command, it sees a totally different file system than the one you see.

Now, obviously there’s a lot of overlap, since the same files are mapped into many of the containers. Clearly both of them can see /config, and the share and ssl folders are available to all add-ons that need them. But which folders each add-on maps, and where, can vary, since each one has its own setup for its Docker container.

In this case, that’s where you’re getting stuck. I’m poking around in the filesystem HA sees right now, and there’s no /backup folder. In fact, I can’t find it anywhere, so I’m not sure HA has any access to it.

I’d recommend testing your scripts while running in HA’s Docker container. I actually wrote a guide on getting a command-line sensor or shell command that uses ssh working in HA here. If you scroll down to the ‘Testing your command’ section, it walks through how to do that.

Actually, after I said that, I remembered there is an option here. What you can do is ssh into the SSH add-on from your shell_command and then execute the script from there, since that container can see the backup folder. It’s really wonky, I know, but it works; I have a shell_command that does exactly that to back up my Nginx Proxy Manager config, since it stores its data in the MariaDB add-on.

In your case you would basically just modify your script to do this:

ssh -o UserKnownHostsFile=<your known hosts file> <username for ssh addon>@localhost -i <your key file> '/bin/bash ./shell_scripts/'

You’ll have to set up authorized keys and everything to get HA talking to the add-on, but it’s pretty straightforward. And it sounds like you’ve already set up the SSH add-on to talk to your other system.
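Put together, the shell_command entry would look roughly like this. The key and known_hosts paths and the script name backup.sh are illustrative placeholders, not values from the original post:

```yaml
shell_command:
  bkp_snap: >-
    /usr/bin/ssh
    -o UserKnownHostsFile=/config/shell_scripts/ssh_keys/known_hosts
    -i /config/shell_scripts/ssh_keys/id_rsa
    root@localhost
    '/bin/bash /config/shell_scripts/backup.sh'
```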


I will try that tonight! It makes sense!

I think your approach is the right one… but so far I’m having issues implementing it. I’ve done a few tweaks, and this is where I am now: I’m getting return code 127.

Error running command: `/usr/bin/ssh -o UserKnownHostsFile=shell_scripts/ssh_keys/known_hosts root@localhost -i shell_scripts/ssh_keys/id_rsa '/bin/bash ./shell_scripts/'`, return code: 127

This exact command works fine from the terminal.


So I was finally able to make it work by wrapping the whole ssh command in a wrapper script, like this:

  bkp_snap: /usr/bin/ssh -o UserKnownHostsFile=shell_scripts/ssh_keys/known_hosts root@localhost -i ssh_keys/id_rsa /bin/bash ./shell_scripts/
  bkp_wrapper: /bin/bash ./shell_scripts/

Now I am using the wrapper and it works. I will continue to tweak it and look at it, but thanks to your hint, I now have something that works!!


NOTE: I know I’m replying to an old post, but it is never too late to say thanks for something that helped, and to share back lessons learned that can help other people looking for the same solution I was.

This approach worked. Many thanks!

I was trying to find a way to automate the removal of old snapshots without using some shady unofficial third-party add-on store. This helped me successfully achieve that.

Here’s what I did (I used Portainer, but you can also ssh into the container using the SSH and Web Terminal add-on for this):

  1. SSH into the homeassistant container, or connect to it using Portainer (/bin/ash);
  2. generated an SSH key pair using ssh-keygen;
  3. under /config, created a shell_scripts folder (change the name to something that makes more sense to you), created an ssh_keys subfolder, and copied /root/.ssh/id_rsa into it (mind the permissions! use cp -p to preserve the original permissions and avoid security risks!);
  4. tried to ssh into SSH & Web Terminal using a basic command (ssh admin@a0d7b954_ssh), added the host to known_hosts when prompted, but skipped the login;
  5. copied /root/.ssh/known_hosts to /config/shell_scripts/ssh_keys/ (again, mind the permissions! use cp -p to preserve the original permissions and avoid security risks!);
  6. under /config/shell_scripts, created a test .sh file with 755 permissions containing a single line that lists all the backups older than 7 days (this only lists the files):

find /backup/* -mtime +7 -exec ls -l {} \;

  7. printed the contents of the homeassistant container’s /root/.ssh/ (using cat or any other way you like) and copied them into the SSH & Web Terminal container’s /root/.ssh/authorized_keys, to make key-based ssh login from the homeassistant container into the SSH & Web Terminal container possible;
  8. executed the following command inside the homeassistant container to test that everything works as it should:

/usr/bin/ssh -o UserKnownHostsFile=shell_scripts/ssh_keys/known_hosts root@localhost -i shell_scripts/ssh_keys/id_rsa /bin/bash /config/shell_scripts/

  9. If everything works, you can prepare your shell_command yaml entries, create an automation that calls the shell_command, and you are good to go.
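For step 9, the yaml could look roughly like this. The entry name, trigger time, and script filename purge.sh are all placeholders I’ve invented for illustration:

```yaml
shell_command:
  purge_old_snapshots: >-
    /usr/bin/ssh
    -o UserKnownHostsFile=/config/shell_scripts/ssh_keys/known_hosts
    -i /config/shell_scripts/ssh_keys/id_rsa
    root@localhost
    /bin/bash /config/shell_scripts/purge.sh

automation:
  - alias: "Purge old snapshots nightly"
    trigger:
      - platform: time
        at: "03:00:00"
    action:
      - service: shell_command.purge_old_snapshots
```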

If this helps other people looking for automated snapshot removal on hassio, here is my .sh file, which removes snapshots older than 13 days:

find /backup/* -mtime +13 -exec rm {} \;
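A quick way to sanity-check that find expression without touching real backups is to rehearse it in a throwaway directory first (this sketch assumes GNU touch, which accepts -d with a relative date):

```shell
#!/bin/sh
tmp=$(mktemp -d)

# Fake an old snapshot and a fresh one.
touch -d '20 days ago' "$tmp/old_snapshot.tar"
touch "$tmp/fresh_snapshot.tar"

# Same expression as the real script, pointed at the test directory:
find "$tmp"/* -mtime +13 -exec rm {} \;

ls "$tmp"    # only fresh_snapshot.tar should remain
```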

Why is all this needed?
Because the homeassistant container does not map the /backup folder by default. The method above uses the SSH & Web Terminal add-on as a gateway to achieve the desired result, since that container does have /backup mapped.

Hope this helps the home assistant community!

I’m trying to do the same thing; can anyone help me?
I don’t understand almost anything about Linux, so I did a search and ran the following commands in PuTTY:

mkdir ~/config/shell_scripts
mkdir ~/config/shell_scripts/ssh_keys
cp -p ~/.ssh/ ~/config/shell_scripts/ssh_keys/id_rsa
cp -p ~/.ssh/known_hosts ~/config/shell_scripts/ssh_keys/known_hosts
cp -p ~/.ssh/ ~/.ssh/authorized_keys

I created the script file inside the /config/shell_scripts folder:

find /backup/* -mtime +7 -exec ls -l {} \;

And I created the shell_command:

  bkp_snap: /usr/bin/ssh -o UserKnownHostsFile=shell_scripts/ssh_keys/known_hosts root@localhost -i shell_scripts/ssh_keys/id_rsa /bin/bash /config/shell_scripts/

When I run the shell command in HA, I get the message below:

stdout: ""
stderr: Host key verification failed.
returncode: 255
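One thing worth checking (a guess, not a definitive diagnosis): “Host key verification failed” means ssh could not match the host against the known_hosts file it was told to use, and a relative path like shell_scripts/ssh_keys/known_hosts only resolves if the command runs with /config as its working directory. A tiny sketch of that pitfall, using throwaway directories:

```shell
#!/bin/sh
tmp=$(mktemp -d)
mkdir -p "$tmp/config/shell_scripts/ssh_keys"
echo 'example-host-key-line' > "$tmp/config/shell_scripts/ssh_keys/known_hosts"

# From /config the relative path works...
cd "$tmp/config"
ls shell_scripts/ssh_keys/known_hosts

# ...but from any other working directory it does not.
cd /
ls shell_scripts/ssh_keys/known_hosts 2>/dev/null || echo 'not found from /'
```

Using absolute paths (e.g. /config/shell_scripts/ssh_keys/known_hosts) for both the UserKnownHostsFile option and the -i key removes that ambiguity.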