Thanks. There’s a bit of an issue with the file /.storage/auth, which is owned by root, so my regular uid isn’t able to add it to the archive. Everything else zips up nicely.
I just run a chown on my Docker config directory to switch ownership from root to my current user (finity), and everything still works fine. I can back everything up, including that auth file.
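For reference, the command is just something like this (the path is an example, point it at wherever your Home Assistant config actually lives):

# Example path - adjust to your own config location and user
sudo chown -R finity:finity /home/finity/docker/homeassistant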
So the way I backup seems overly involved, but it works.
1- Use Portainer to stop the Home Assistant container - if Home Assistant is running, I've found files get skipped and don't copy over
2- Run sudo nautilus, which opens the file manager as root and gets around the permission issues
3- Copy the entire config directory to the backup folder on my NAS drive
4- Start the container again in Portainer after the copy is complete
I mean, this works, but I really would like to automate the process if possible. I just haven’t figured out a good way to do that though.
Update - I think I figured out a way to automate it. I wrote a script to do the backup and run it nightly through the crontab, generally following the directions here - Using rsync and cron to automate incremental backups
First, I created a script file called hass_backup.sh that stops the container, does the backup using rsync, then starts the Home Assistant Docker container back up:
docker stop homeassistant
rsync -ab --backup-dir=old_`date +%F` --delete --exclude=old_* /home/mwav3/homeassistant /media/mwav3/MyPassport/backups/hassrsync
docker start homeassistant
To make sure that all the root-owned files get copied, I did this in the global crontab by running sudo gedit /etc/crontab.
I then added a command to run the script nightly at 2 AM as root. I also added another crontab entry to start the container at 2:15 AM, so that if the backup script ever fails partway through, Home Assistant doesn't stay offline.
0 2 * * * root sh /home/mwav3/hass_backup.sh
15 2 * * * mwav3 docker start homeassistant
Seems to be working. I’m sure there’s lots of other ways to do this.
I just switched from a supervised Pi 4 setup to running on a laptop with Ubuntu and Docker, and faced the same questions. After some research (for those who also found this link as I did), I settled on Duplicati:
https://hub.docker.com/r/linuxserver/duplicati
Runs in docker, and as long as you provide it your docker root folder as a volume, it can back up all your containers.
(supports GDrive, OneDrive, local (NFS), and tons more destinations)
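Roughly how it gets spun up, if it helps anyone (paths, PUID/PGID and timezone below are just examples; check the linuxserver/duplicati page for the current parameters):

# Example only - adjust paths, PUID/PGID and TZ to your setup.
# /source = your docker root folder (mounted read-only),
# /backups = optional local backup target, 8200 = Duplicati's web UI port.
docker run -d \
  --name=duplicati \
  -e PUID=1000 \
  -e PGID=1000 \
  -e TZ=Etc/UTC \
  -p 8200:8200 \
  -v /path/to/duplicati/config:/config \
  -v /path/to/docker:/source:ro \
  -v /path/to/local/backups:/backups \
  --restart unless-stopped \
  lscr.io/linuxserver/duplicati:latest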
The latest version, 2022.4, now lets you manually create backups through the UI for Home Assistant Container pretty easily.
But I would still recommend an automated backup, like the cron/rsync example I posted above or using Duplicati as the last poster suggested. I saw this helpful video guide to setting up Duplicati recently as well, if you're looking for step-by-step instructions.
Duplicati seems like a perfect solution. However, as stated in the video, if HA is running, the backup of the database files will fail. He said that's not a problem for him, as he only wants to back up configuration files, but for me it would be important to back up the db as well. I once deleted (yes, stupid me) my whole HA Docker folder and lost about half a year of data (luckily I had a half-year-old full backup).
Does anybody have an idea how to solve this issue and use Duplicati to back up the whole Docker folder, databases included?
I think the key is to stop the Home Assistant container prior to backing up. I'm still using the shell script I posted a few posts up for backups: What backup strategy when running Home Assistant in Docker? - #10 by mwav3
The shell script stops the Home Assistant container every night at 2 AM. It then uses rsync to copy the Home Assistant config files over to another backup folder (fully intact, because Home Assistant is stopped). After the backup, the shell script starts Home Assistant again. I then use the built-in backup program in Ubuntu (which I believe is just duplicity) to copy the backup from that folder to my Google Drive automatically. Duplicati could also be used to copy it and create a “backup of the backup”.
I think the problem with using Duplicati alone is that you can't use it to stop Home Assistant (at least not in any way I'm aware of).
I've had issues restoring the Home Assistant container using Duplicacy. I think it's a permissions issue with the /.storage/auth file (which I can see is referred to above), either in backing it up and/or restoring it. I'm running Unraid. Does anyone know what I should do to avoid permissions issues? Thanks.
Hi,
Sorry to reopen this thread, but did anyone find a solution to the permission issues when doing the backup with Duplicati?
Many thanks
It’s been a while, so let me add to this for anyone else finding this now.
I have moved my DB to MariaDB. I wanted to be able to show 45 days of recorder data, and my DB is now 65 GB (as opposed to my InfluxDB, which holds over a year's worth of data in only 11 GB). But even with the built-in DB, there are options.
I use GitHub - tiredofit/docker-db-backup: Backup multiple database types on a scheduled basis with many customizable options. It can back up many different databases, and I have an NFS mount to my NAS where it puts the database backups.
This way, backups are done through SQL rather than at the file level.
It supports SQLite (which I think HA uses by default?), so it's just a matter of mounting the db file into this container and it should be able to back it up properly.
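As a side note, for the default SQLite database the sqlite3 CLI can also take a consistent snapshot of a live db with its .backup command; a minimal sketch (paths are made up, and it assumes sqlite3 is installed on the host):

# Example paths - adjust to your config and backup locations
sqlite3 /opt/homeassistant/config/home-assistant_v2.db \
  ".backup /opt/backups/home-assistant_v2.db.bak"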
As for the permissions issues mentioned above, you can either run Duplicati as root, in which case there shouldn't be any issues, or make sure all containers are running with the correct PUID/PGID.
I'm surprised at how difficult it is to find a best practice for this.
Here’s what I’ve done:
- Create an automation in HA to make a backup:
alias: "Backup: Create backup"
description: ""
trigger:
  - platform: time
    at: "03:00:00"
condition:
  - condition: time
    weekday:
      - mon
      - wed
      - sat
action:
  - service: backup.create
    data: {}
mode: single
- Install Rclone on the host machine: https://rclone.org/
- Create a script on the host machine:
#!/bin/bash
# Backup HA
echo "Backing up Home Assistant to pCloud..."
rclone sync /opt/appdata/homeassistant/backups pCloud:/homeassistant_backup/ -P
echo "Backup of Home Assistant database to pCloud completed."
# Remove oldest HA backups (keep only the 3 newest)
echo "Removing oldest Home Assistant backups..."
ls -t /opt/appdata/homeassistant/backups | tail -n +4 | xargs -I {} rm -f "/opt/appdata/homeassistant/backups/{}"
echo "Finished removing old Home Assistant backups."
- Set up a cron job on the host to run the script each night (example entry below).
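For example (script path, log path and time are just placeholders):

# crontab -e on the host - run the sync script nightly at 03:30 (example path)
30 3 * * * /bin/bash /opt/appdata/scripts/ha_backup_sync.sh >> /tmp/ha_backup_sync.log 2>&1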
Pro: It works, and by using HA's own backup functionality it removes the risk of database issues. Also, there is no need to stop and start Home Assistant.
Con: Ideally, I would have liked to also trigger the HA backup from the script, but I haven't managed to figure out how to do that.
Hope this helps someone.
Does the HA backup (from the UI or by calling the service as in your case) also create a backup of the data (the db)?
It helps a lot.
For creating backups from the CLI or a script, maybe you can call the backup service directly through the API with curl.
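I haven't tried it myself, but with a long-lived access token something along these lines should call the same backup.create service through the REST API (host, port and token are placeholders):

# Placeholders: replace the token and host/port with your own
curl -X POST \
  -H "Authorization: Bearer YOUR_LONG_LIVED_ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{}' \
  http://localhost:8123/api/services/backup/create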
This is the type of question you should search before posting, as it has nothing to do with Docker. But yes, the database is definitely part of the backup.
I agree with the idea, but I haven't researched whether it's possible, let alone exactly how to do it.
Unfortunately, it doesn't seem to work here, and I don't really understand why it complains about zha_gateway. On the other hand, I migrated the database from SQLite to PostgreSQL, which I'm already backing up. Perhaps I should just do an rclone/rsync task for the data directory.
homeassistant | 2024-01-19 00:55:00.144 ERROR (MainThread) [homeassistant.components.automation.backup_create_backup] While executing automation automation.backup_create_backup
homeassistant | Traceback (most recent call last):
homeassistant | File "/usr/src/homeassistant/homeassistant/components/automation/__init__.py", line 669, in async_trigger
homeassistant | await self.action_script.async_run(
homeassistant | File "/usr/src/homeassistant/homeassistant/helpers/script.py", line 1587, in async_run
homeassistant | return await asyncio.shield(run.async_run())
homeassistant | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
homeassistant | File "/usr/src/homeassistant/homeassistant/helpers/script.py", line 426, in async_run
homeassistant | await self._async_step(log_exceptions=False)
homeassistant | File "/usr/src/homeassistant/homeassistant/helpers/script.py", line 479, in _async_step
homeassistant | self._handle_exception(
homeassistant | File "/usr/src/homeassistant/homeassistant/helpers/script.py", line 502, in _handle_exception
homeassistant | raise exception
homeassistant | File "/usr/src/homeassistant/homeassistant/helpers/script.py", line 476, in _async_step
homeassistant | await getattr(self, handler)()
homeassistant | File "/usr/src/homeassistant/homeassistant/helpers/script.py", line 713, in _async_call_service_step
homeassistant | response_data = await self._async_run_long_action(
homeassistant | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
homeassistant | File "/usr/src/homeassistant/homeassistant/helpers/script.py", line 675, in _async_run_long_action
homeassistant | return long_task.result()
homeassistant | ^^^^^^^^^^^^^^^^^^
homeassistant | File "/usr/src/homeassistant/homeassistant/core.py", line 2149, in async_call
homeassistant | response_data = await coro
homeassistant | ^^^^^^^^^^
homeassistant | File "/usr/src/homeassistant/homeassistant/core.py", line 2186, in _execute_service
homeassistant | return await target(service_call)
homeassistant | ^^^^^^^^^^^^^^^^^^^^^^^^^^
homeassistant | File "/usr/src/homeassistant/homeassistant/components/backup/__init__.py", line 29, in async_handle_create_service
homeassistant | await backup_manager.generate_backup()
homeassistant | File "/usr/src/homeassistant/homeassistant/components/backup/manager.py", line 176, in generate_backup
homeassistant | raise result
homeassistant | File "/usr/src/homeassistant/homeassistant/components/zha/backup.py", line 15, in async_pre_backup
homeassistant | zha_gateway = get_zha_gateway(hass)
homeassistant | ^^^^^^^^^^^^^^^^^^^^^
homeassistant | File "/usr/src/homeassistant/homeassistant/components/zha/core/helpers.py", line 459, in get_zha_gateway
homeassistant | raise ValueError("No gateway object exists")
homeassistant | ValueError: No gateway object exists
One more strategy:
I am using git as a backup. It is a bit fiddly with the permissions, but I have a script which can be run as root. I have it run by a cron job a few times a day:
#!/bin/bash
cd <path to installation>
# Make root-owned files (e.g. .storage/auth) readable by the regular user
chmod -R a+r .
# Commit and push as the regular user so the repo isn't owned by root
sudo -i -u <username> bash << EOF
cd <path to installation>
git add .
git commit -m "[BACKUP `date +'%Y-%m-%d %T'`]"
git push
EOF
Git makes it really easy to track changes or to revert only part of the configuration. There are also great free online services to host your repository.
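For example, rolling back a single file to an earlier commit is just (the commit hash is a placeholder):

# Find the commit you want, then restore just that file from it
git log --oneline configuration.yaml
git restore --source=<commit-hash> configuration.yaml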
I am also using MariaDB and I have another machine replicate it continuously. Then on that other machine I make a backup every night with something like:
mysqldump -u <username> -p ha | lbzip2 -9czq > backup.bz2
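Restoring would just be the reverse, something along the lines of:

# Decompress the dump and feed it back into MariaDB (same user/db as above)
lbzip2 -dcq backup.bz2 | mysql -u <username> -p ha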
Hi… Is there a way to add an action that waits for the backup to finish and then calls the script directly from HA? I know how to call the script, but I'm not sure how to start it after the backup ends… Maybe a wait template?
You already use an action to create the backup. You could change the trigger (or add a new one) to be a webhook, which you can easily call from your local script with curl.
Here is what I have implemented using some information provided here:
- Trigger the backups with a cronjob executing one script:
0 4 1/10 * * cd /usr/local/home_assistant && /bin/bash /usr/local/home_assistant/generate_backup.sh >> /var/log/home_assistant_backup.log 2>&1
- I use an external Postgres database, for which I have to generate backups separately.
- The HA config is backed up by an automation that is triggered by a webhook, which the script calls.
- To know when the backup is done, I use the HA logs.
- Finally, I use rclone to send the backups to a local Nextcloud server.
Example of the automation:
- alias: "Backup: Create backup hook"
  description: ""
  trigger:
    - platform: webhook
      allowed_methods:
        - POST
      local_only: true
      webhook_id: backup-create-backup-<put here some random characters as pseudopassword>
  condition: []
  action:
    - action: system_log.write
      metadata: {}
      data:
        level: warning
        message: Creating backup...
    - data: {}
      action: backup.create
    - action: system_log.write
      metadata: {}
      data:
        level: warning
        message: >
          Backup created for date: {{ as_timestamp(now()) |
          timestamp_custom('%Y-%m-%d') }}
  mode: single
The script that creates the backups:
#!/bin/bash
##########
# CONFIG
# This has to be configured in an automation
BACKUP_HOOK=http://localhost:<port>/api/webhook/backup-create-backup-<my_hook_secret>
# Configure the backups folder inside your config
HA_BACKUP_FOLDER=./config/backups
# Configuration of rclone
RCLONE_CONFIG_FOLDER=./rclone_config
# Confgure where to store locally the database backups
DATABASE_BACKUP_FOLDER=./backup_database
# Configure the maximum time the script waits for the backup to complete
WAITING_TIME=120
# rclone path where the backups will be sent
REMOTE_BACKUP_FOLDER=nextcloud:/00_backups/05_home_assistant
##########
# START HA BACKUP
DATE_STRING=$(date +%Y-%m-%d)
echo "Deleting old backups..."
if [ -z ${HA_BACKUP_FOLDER+x} ]; then echo "HA_BACKUP_FOLDER is not set" && exit 1; fi
rm ${HA_BACKUP_FOLDER}/*.tar
echo "Sending generate backup to backup hook..."
# Data is not relevant, just a post request
curl -vvv -d '{"data":"create_backup"}' -H "Content-Type: application/json" -X POST ${BACKUP_HOOK}
echo "Waiting for the backup to be generated..."
attempt=1
until /usr/local/bin/docker-compose logs --tail 20 | grep -q "Backup created for date: ${DATE_STRING}"; do
if [ "$attempt" -ge "$WAITING_TIME" ]; then
echo "Timeout reached after $WAITING_TIME attempts."
exit 1
fi
attempt=$((attempt + 1))
sleep 1
done
echo "Renaming generated backup..."
mv "$(ls -t ${HA_BACKUP_FOLDER}/*.tar | head -n 1)" "${HA_BACKUP_FOLDER}/ha-backup-${DATE_STRING}.tar"
echo "Sending backup to backup server..."
/usr/bin/docker run --rm \
--volume ${RCLONE_CONFIG_FOLDER}:/config/rclone \
--volume ${HA_BACKUP_FOLDER}:/data \
rclone/rclone \
copy /data ${REMOTE_BACKUP_FOLDER} \
--no-check-certificate
echo "Config backup done for ${DATE_STRING}"
##########
# START HA DATABASE BACKUP
echo Starting database backup...
mkdir -p ${DATABASE_BACKUP_FOLDER}
echo "Deleting old database backups..."
if [ -z ${DATABASE_BACKUP_FOLDER+x} ]; then echo "DATABASE_BACKUP_FOLDER is not set" && exit 1; fi
rm ${DATABASE_BACKUP_FOLDER}/*.sql.gz
echo "Generating database backup..."
/usr/bin/docker exec homeassistant_db /bin/bash \
-c "/usr/bin/pg_dump -U postgres homeassistant_db" |
gzip -9 >${DATABASE_BACKUP_FOLDER}/homeassistant_db-${DATE_STRING}.sql.gz
echo "Sending backup to backup server..."
/usr/bin/docker run --rm \
--volume ${RCLONE_CONFIG_FOLDER}:/config/rclone \
--volume ${DATABASE_BACKUP_FOLDER}:/data \
rclone/rclone \
copy /data ${REMOTE_BACKUP_FOLDER} \
--no-check-certificate
echo "Database backup done for ${DATE_STRING}"