I managed to mount the SD card in another Linux machine I have and was able to navigate to the /etc folder, change the permissions of the file and delete it. I then copied the sudoers file from the Linux machine to the Pi SD card and thought I would just change the permissions once I had the Pi booted back up. Well, that isn’t working:
sudo visudo -f sudoers
sudo: /etc/sudoers is owned by uid 1000, should be 0
sudo: no valid sudoers sources found, quitting
sudo: unable to initialize policy plugin
It is an absolute shame that it is this difficult to do a task as simple as running a shell script. When I first read the documentation about being able to run a shell script I thought, good God, they thought of absolutely everything. That part hasn’t changed, but apparently the idea of being able to run a script did not extend to making it relatively easy to accomplish.
I try my best to do my due diligence to show that I am not just wanting someone to do it for me, but the documentation keeps biting me because the examples tend to leave you hanging. I know this particular task is being done by others in HA, so I know it’s possible, but apparently I am not bright enough to figure it out. It is a beautiful, wonderfully powerful home automation system, but the average user just isn’t going to invest this sort of time in trying to figure things out. I had been trying to figure this out for a couple of days before I made my post here. That’s just an unreasonable amount of time to have to invest to run a simple shell script.
Okay, I am back to normal, I think, or at least back to where I started. I had to run chmod --reference=systemfile myfile and then chown --reference=systemfile myfile.
Basically, I mounted the Pi SD card in another Linux machine and changed the permissions of the /etc/sudoers file so I could delete it. Then I copied the sudoers file from the Linux machine over to the /etc directory. Then I ran the two above-mentioned commands so that the ownership in particular, and the permissions, matched those of another root-owned file in that directory. I took the SD card, plugged it back into the Pi, and rebooted.
So I took a long trip around Oz to get back to the point where I started, which is a shell script file that won’t run.
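The --reference trick can be tried safely on scratch files (the file names here are illustrative; chown --reference works the same way, though changing ownership to root requires root privileges):

```shell
# Work in a throwaway directory
cd "$(mktemp -d)"
# Create a scratch reference file and a target with different modes
touch systemfile myfile
chmod 640 systemfile
chmod 444 myfile
# Copy the mode from systemfile onto myfile
chmod --reference=systemfile myfile
stat -c '%a' myfile    # prints 640
```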
@Tinkerer I agree, there are security issues, but how does one go about running a command that would require elevated privileges, such as the dd command, without an elevated user?
@StormStrikes what you would do is create a script owned by root, in your case this rsync script, then you add a sudoers line like this:
user hass = (root) NOPASSWD: /foo/bar/rsyncscript
This way the hass user can call sudo /foo/bar/rsyncscript and NOTHING ELSE. Since the script is owned by root, the root user is the only one who can modify it (make sure its permissions are 755 and you’ll be fine).
FYI, don’t edit sudoers manually (or copy it from another host like you did), since sudo (and ssh) are very particular about permissions, to avoid potential security issues. Use visudo, which does error checking before saving; otherwise you could end up with a broken file and be unable to use sudo. It’s also a good idea to have a root password so that you can log in without having to do annoying recovery steps because you broke sudoers.
@justin8 Just a couple of things, because you have given some information that I have not seen yet.
First, do I add the line to sudoers just like you posted: user hass = (root) NOPASSWD: /path/to/my/rsyncscript ?
I ask because I have not seen it formatted in the way you provided and I just want to make sure.
Lastly, I have been editing sudoers with visudo, however, I ‘think’ I somehow wiped out the contents of the file as when I finally got it open, there were only two curly braces in it. No content, no settings, nothing. Not sure how I did that, but I did see where I needed to use visudo to edit it and was doing so. But in my inexperience with such things I somehow goofed it up. Thankfully I had other machines with Linux on them.
Again, thank you for your post.
EDIT:
One last thing, if I may impose upon you. I have the other script file to run as well, the one that will use dd to make a backup image of the SD card.
Do I just add another line to the sudoers file for that, or do you somehow combine them into one line?
Defaults env_reset
Defaults mail_badpass
Defaults secure_path="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sna$
# Host alias specification
# User alias specification
# Cmnd alias specification
# User privilege specification
root ALL=(ALL:ALL) ALL
user hass = (root) NOPASSWD: /usr/bin/harpi3rsync.sh
# Members of the admin group may gain root privileges
%admin ALL=(ALL) ALL
# Allow members of group sudo to execute any command
%sudo ALL=(ALL:ALL) ALL
# See sudoers(5) for more information on "#include" directives:
#includedir /etc/sudoers.d
Here are the two script files:
$ ls -l /usr/bin/harpi3rsync.sh
-rwxr-xr-x 1 root root 176 Jan 9 17:49 /usr/bin/harpi3rsync.sh
Which contains:
#!/bin/bash
# Sync/Backup the Home Assistant Config directory
rsync -azh -e ssh --delete /home/hass/.homeassistant/ [email protected]:/mnt/usb_1/FileSync/AllFiles/HomeAssistant/
And
$ ls -l /usr/bin/harpi3ddimg.sh
-rwxr-xr-x 1 root root 195 Jan 9 06:37 /usr/bin/harpi3ddimg.sh
Which contains:
#!/bin/bash
# Make a bit-for-bit image of the RaspberryPi hosting Home Assistant
ssh [email protected] dd if=/dev/mmcblk0 of=/mnt/usb_1/FileSync/AllFiles/HARPi3_SD_Backup_$(date +%Y%m%d).img bs=1M
And yet they are not running when called from within HA. I don’t know what I have done wrong, but I apparently suck like an industrial-strength Hoover vacuum cleaner at running shell scripts from within HA.
So, first thing: you can specify multiple commands to allow on a single line, or you can make a duplicate; both should work fine.
The first error: in the HA config file you need to call sudo /usr/bin/harpi3rsync.sh, not just /usr/bin/harpi3rsync.sh, or it runs as the hass user, not root.
Secondly, let’s take HA out of the equation; we know it works when you run an echo script, so that is fine, it just makes troubleshooting harder. Change to your hass user and run your script; make sure it does what you want before bringing HA back into the picture.
As the hass user, do you get an error when you run sudo /usr/bin/harpi3rsync.sh?
@justin8 I appreciate your help but more importantly your patience. Thank you. This has just been one heck of a frustrating issue all around.
Now, that said, I think I found the issue, though not sure what to do about it. I changed to the hass user:
$ sudo su -s /bin/bash hass
It then is asking for a password:
hass@HARPi3:/usr/bin$ sudo /usr/bin/harpi3rsync.sh
We trust you have received the usual lecture from the local System
Administrator. It usually boils down to these three things:
#1) Respect the privacy of others.
#2) Think before you type.
#3) With great power comes great responsibility.
[sudo] password for hass:
Which I do not know. So I tried to run it directly, and it asks me for the password for the other Pi, which I do know, and then I got this:
hass@HARPi3:/usr/bin$ ./harpi3rsync.sh
[email protected]'s password:
rsync: send_files failed to open "/home/hass/.homeassistant/shell_commands/harpi3ddimg.sh": Permission denied (13)
rsync: send_files failed to open "/home/hass/.homeassistant/shell_commands/harpi3rsync.sh": Permission denied (13)
rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1183) [sender=3.1.1]
So, what is the lesson here? I think the lesson is this thing hates me! LOL.
Seriously, though, I think the better course of action may be to just freakin’ stick a USB stick in the Pi running HA and call it good, well I hope anyway. I can’t wait to see what challenges stem from that. One would hope it would be an easier thing, but I am learning not to expect that.
To that end, let me try that and see if I can get THAT to work and then if you think there is something else @justin8 that I can do to get this method working I will continue to troubleshoot that. However, I have taken enough time from people on here on this issue and in many ways would rather just get this done so I have the experience and lesson from it so that perhaps I can do as you and try helping others.
So if it’s asking you for a password when using sudo, there is something wrong with the sudoers config line.
The reason it had errors reading the files is that the second time you ran it as the hass user, which I assume doesn’t have permission to access some files it doesn’t own, and it asked for the password because the SSH key is in root’s account, not the hass account.
I’m at work and don’t have time right now to look up the correct sudoers syntax, but that is where your current issue is.
Okay, well I can’t express just how grateful I am for your help, it is very much appreciated. The only reason I wanted to send it over to the other Pi that serves as a file server, is because it has a 120Gb SSD attached to it so it has room for the SD card images.
And you are correct about the SSH keys. I was not really thinking of running shell commands in HA when I set that up. Perhaps a bit short sighted of me.
Quick question: is there any reason you have/want to run this from HA? Given that you’re trying to run them as root, simply calling them from cron (scheduled commands) might be easier. Alternatively, the way I do it is to run rsnapshot on the remote host and pull the backups.
As for your sudoers entry, the problem is that you used user hass = (root) when you should just have hass ALL=(root). The first field is the username; the second is the hostname the entry applies to.
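Putting the corrections together, and since sudoers accepts a comma-separated list of commands, both scripts can share a single entry. It would look something like this (edit with visudo):

```
# /etc/sudoers entry: hass may run these two scripts as root, without a password
hass ALL=(root) NOPASSWD: /usr/bin/harpi3rsync.sh, /usr/bin/harpi3ddimg.sh
```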
I can’t say that there is a particular, critical reason I want to do so, other than the fact the SD card images are going to be large and so I wanted it as an on-demand thing so I could make sure that there was enough space to do the backup.
Additionally, I did not want the configuration directory on a scheduled backup so that I could run that on demand as well. The thinking being that if I goofed up my configuration somehow or an automation, or just could not figure out what I did wrong and wanted to go back to a known good configuration, then I could restore from the backup. However, if it’s on a schedule and it happens to run a backup and I need the last known good configuration, then I am out of luck because it just got updated.
It’s just convenient to have it in HA because if I know I am getting ready to make a lot of changes or want to try a few things out, experiment, etc, I can just click the link, back up the configuration and the image and then play to my hearts content. Yes, you can do that from the command line or by running a shell script, but I saw others doing very similar things within HA and it seemed a good way to proceed.
For versioning, look at rsnapshot. Depending on how you configure the retention, you can then easily recover the working configuration from last Tuesday, or August, or… I’d never recommend a backup solution that overwrote the previous backup; if that fails during a backup, you’re left with nothing, after all.
With that said, as justin8 said, put both lines into a single shell script and call that with sudo. You’ll likely also need to explicitly specify the SSH keys.
I’d be tempted to suggest too that the second line (ssh root@...) is a script on the remote host, since that’s where it’s running. Then you can run the rsync as the hass user, which then runs sudo on the remote host for the dd:
On HA, create a new SSH keypair with no passphrase:
ssh-keygen -t ed25519
Copy the contents of ~hass/.ssh/id_ed25519.pub to ~hass/.ssh/authorized_keys on 10.0.0.20.
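Following the suggestions above, a combined script might look something like this. This is an untested sketch: /usr/local/bin/sd-image.sh is a hypothetical wrapper around the dd command living on the remote host (with a matching NOPASSWD sudoers entry there), and the host and paths are taken from earlier in the thread:

```shell
#!/bin/bash
# Sketch: sync the HA config, then trigger the SD-card image on the remote host.
# Runs as the hass user; the remote side elevates with sudo for the dd step.
set -e
rsync -azh -e ssh --delete /home/hass/.homeassistant/ \
    [email protected]:/mnt/usb_1/FileSync/AllFiles/HomeAssistant/
# /usr/local/bin/sd-image.sh is hypothetical: a root-runnable wrapper for dd on the remote host
ssh [email protected] sudo /usr/local/bin/sd-image.sh
```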
@Tinkerer thank you and @justin8 for all of the excellent information that you have provided. Simply amazing. Just when I think I am comfortable with most tasks in Linux, I come across something like this that reminds me that my knowledge is just a molecule in an ocean of elements.
That said, I take it, then, that I am going about this either the long way or the wrong way. Oh, and just to clarify, I would not have overwritten the dd images, as they were dated, but I can see your point with the rsync’ing of the configuration files.
I have never used any kind of versioning before, well, other than just some basic date separation, so I am in new territory here. My thinking is that my drive to make dd images is a bit much, and I would probably be better off just setting up a new RPi SD card complete with HA installed, keeping it in a safe place, and then, if needed, just copying the configuration directory over. Perhaps I could manually make new images of the running system if any major changes are made to the software, or something like that. Is that perhaps a better way?
Then use the rsnapshot to keep the backups of the configuration directory since that is a much smaller footprint in comparison and easier to maintain. I just want to get a solid backup strategy in place so that I can restore this thing pretty quickly if needed.
EDIT:
With respect to the SSH keys, do those need to be created as the hass user?
There are 2 things I’ve learned in all the time I’ve been using computers (which is more than a couple of decades):
There’s always more to learn
However many ways you think there are of doing anything, like backups, somebody else will eventually find another option that’s equally valid
There are 2 things to do for backing up (and recovering) your HA system, IMO:
Keep a bootable backup of the HA install to enable a quick recovery
Back up the configuration file(s) regularly so you can recover from mistakes, even if it takes a week, or longer, to notice them
The way I’m tackling it is:
For the first, I’ve started with rpi-clone. It isn’t perfect for my purposes, but I can easily make the changes I want so that it operates without intervention. It uses dd to create an initial image, then future runs use rsync to copy only the changes. I’ll use that periodically so I’ve got an “instant recovery” option, probably weekly while I’m evolving my configuration rapidly, dropping off to monthly or so eventually. I’m using a USB micro SD adapter for this purpose, with a second SD card. I will add a second pair of these later, so that the backups can alternate. That way, if things become corrupted during a backup, I don’t lose it all.
I already use rsnapshot on my network, with another Pi 3 acting as my backup server. I’ve simply added my HA system to the configuration there. I can apply those backups to any freshly built Raspbian Lite install to recover with a little effort. This ensures that I’m backing up my configuration files every 3 hours, and I’ll build up a rolling history of my changes that expire the way I’ve configured. Primarily I expect to use this to recover from mistakes editing the configuration files.
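As a rough illustration of the kind of rsnapshot configuration involved (the paths, host, and retention values here are made up; note that rsnapshot requires literal tab characters between fields, and older versions use the keyword interval instead of retain):

```
# /etc/rsnapshot.conf excerpt on the backup server (fields are TAB-separated)
snapshot_root	/mnt/backups/
retain	hourly	8
retain	daily	7
# Pull the HA config directory from the HA host over rsync/ssh
backup	[email protected]:/home/hass/.homeassistant/	harpi3/
```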
As I switched from sqlite to MySQL (to see if that would help performance - it has a little) I’m using Percona Xtrabackup to back up my MySQL database. That drops the backups in a location that’s backed up by rsnapshot, and by my cloud backup scripts (below).
Since I’m using the B2 cloud backup service, I’ve installed rclone and configured it to back up any changes to my HA install hourly. That also supports versions, so I’ll be able to recover my configuration from any hourly backup; currently I’ve not set any expiry, so they’ll be around “forever”. I’m using this so that if the worst happens and my computers are lost (fire/theft/whatever), I can still recover everything that matters; adding my HA config is a trivial overhead to what I’m already backing up.
Yes, SSH keys should always be created as the user that’ll be using them. You don’t strictly need to, but it ensures that the files are in the right location on your SSH client (which in this case is your HA server).
Thank you, sir, again, I appreciate the time you are investing and the patience you have demonstrated. I wish this could be stickied as it is chock full of good information.
You have given me a ton to work with. My hope is to at some point have a rack mountable server. Hopefully with the space to mount my Cisco gear and some kind of UPS capacity. I am going to be adding some IP cameras so I need PoE as well.
So while all of this is still in its smaller scale, I’m trying to plan on how to recover if needed. I think before I get too much deeper into this I will at least spend some time learning rsnapshot. It seems simple enough to implement so I can at least start there.
Just for completeness’s sake I’ll post my backup strategy:
I use crashplan to backup to the cloud, I click to add the directory and then it’s versioned and keeps 30 days worth of historical changes and the latest version forever.
This way all I have is my configuration of home-assistant/ha-bridge and the configuration of the docker stuff (which is on GitHub and versioned). Restoring is simple: even if all my stuff fails, I do a clean OS install, tell crashplan to restore from an existing system, choose the old one, click restore on the files, and I’m back.
I want to unmount my Hard Drive (which is connected to my Raspberry Pi) with a simple shell_command through Home Assistant:
umount /dev/sda1
It works perfectly if I do it through the terminal, and it also works perfectly if I change to the virtual environment in HASS with source /srv/hass/hass_venv/bin/activate and run the same command there.
It doesn’t work through a script (unmount), it doesn’t work in single or double quotes, with eject or umount. There is no error in the HASS log; nothing happens.
Any help would be really appreciated, I don’t know what I am missing.
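For reference, following the sudoers pattern discussed earlier in the thread, the relevant configuration might look something like this. This is an untested sketch: shell_command runs as the user Home Assistant runs under, which typically cannot unmount devices without elevated privileges, so a matching passwordless sudo rule (added with visudo, e.g. hass ALL=(root) NOPASSWD: /bin/umount /dev/sda1, with the username adjusted to whatever account runs HASS) would likely be needed:

```yaml
# configuration.yaml (sketch)
shell_command:
  unmount_drive: sudo /bin/umount /dev/sda1
```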