Snapshot Service enhancement: back up to USB dongle

Hassio has a snapshot feature that saves the snapshot to the SD card (on a Raspberry Pi). If the SD card becomes corrupted, the snapshot is likely gone with it.

I would like to suggest enhancing Hassio with some options for the snapshot feature:

  1. Make it possible to select the default snapshot folder: system SD card, USB dongle, or network drive.

  2. Options for when and how often to take a snapshot (once a day/week/month).

  3. A choice between a static snapshot name (overwrite the old snapshot with the new one, saving space) and a dynamic file name (keeping a list of snapshots).

  4. An option to notify if the snapshot process fails, and an option for when the notification is dispatched (back up during the night, message in the morning).

Please vote if this is something you want for your installation.

The daily snapshot automation in this thread works fine, but it does not have these options.

Meanwhile, I am trying to put something together myself.

My system looks like this:

  • Hassio 0.63.2
  • RPI3
  • USB Memory stick
  • USB Z-Wave Aeon Labs Z-Stick Gen5
  • USB Zigbee ConBee-deCONZ

I am trying to write an automation action that renames the snapshot (to a fixed name) and moves it to the USB stick.

Here is how Hassio looks when running "hassio host hardware" before and after inserting the USB stick:
[image: disk list before/after]

  1. Which one is the system SD card and which is the USB stick, and what is the file path to the system “backup” folder?

  2. Does anyone know how to write an automation with a command-line action (Python?) or a custom_components Python script to do this?

Please advise.

@Tomahawk, here’s a POC of an addon that does some of what you want. Every time you start the addon, it’ll automatically copy all backups to /dev/sda1, then apply the retention policy, then stop. You can automatically start the addon with hassio.addon_start service via an automation.

To install it, create a folder named \\hassio.local\addons\copy_backups_to_sda1, then place all three of these files in that folder. Refresh your add-ons and install it.

I don’t plan on continuing development on this. If you decide to improve on it, let me know how things progress.

\\hassio.local\addons\copy_backups_to_sda1\config.json
{
  "name": "Copy Backups to sda1",
  "version": "0.0.1",
  "slug": "copy_backups_to_sda1",
  "description": "Copies backups from /backup to /dev/sda1",
  "startup": "once",
  "boot": "manual",
  "privileged": [
    "SYS_ADMIN"
  ],
  "hassio_api": true,
  "map": [
    "backup:rw"
  ],
  "devices": [
    "/dev/sda1:/dev/sda1:rwm"
  ],
  "options": {
    "max_local_backups": 2,
    "max_remote_backups": 7,
    "remote_backup_path": "backups"
  },
  "schema": {
    "max_local_backups": "int",
    "max_remote_backups": "int",
    "remote_backup_path": "str"
  }
}
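A note on the "options" block above: when the add-on starts, the Supervisor writes those values to /data/options.json inside the container, and run.sh reads them back with jq (which the Dockerfile installs). The round trip can be tried anywhere; in this sketch a local temp file stands in for /data/options.json:

```shell
#!/bin/sh
# Sketch: how run.sh reads add-on options with jq.
# A temp file stands in for /data/options.json, which the Supervisor
# generates from the "options" block in config.json.
CONFIG_PATH=$(mktemp)
cat > "$CONFIG_PATH" <<'EOF'
{
  "max_local_backups": 2,
  "max_remote_backups": 7,
  "remote_backup_path": "backups"
}
EOF

# --raw-output strips the JSON quotes so the value is usable in shell
MAX_LOCAL=$(jq --raw-output ".max_local_backups" "$CONFIG_PATH")
BACKUP_PATH=$(jq --raw-output ".remote_backup_path" "$CONFIG_PATH")
echo "MAX_LOCAL=$MAX_LOCAL BACKUP_PATH=$BACKUP_PATH"
```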
\\hassio.local\addons\copy_backups_to_sda1\Dockerfile
ARG BUILD_FROM
FROM $BUILD_FROM

ENV LANG C.UTF-8

RUN apk add --no-cache rsync jq

COPY run.sh /
RUN chmod a+x /run.sh
CMD [ "/run.sh" ]
\\hassio.local\addons\copy_backups_to_sda1\run.sh
#!/bin/bash
CONFIG_PATH=/data/options.json

MAX_LOCAL=$(jq --raw-output ".max_local_backups" $CONFIG_PATH)
MAX_LOCAL_PLUS1=$(($MAX_LOCAL + 1))
MAX_REMOTE=$(jq --raw-output ".max_remote_backups" $CONFIG_PATH)
MAX_REMOTE_PLUS1=$(($MAX_REMOTE + 1))
BACKUP_PATH=$(jq --raw-output ".remote_backup_path" $CONFIG_PATH)

echo "[INFO] MAX_LOCAL=$MAX_LOCAL"
echo "[INFO] MAX_REMOTE=$MAX_REMOTE"
echo "[INFO] BACKUP_PATH=$BACKUP_PATH"

echo "[INFO] Mounting /dev/sda1 to /mnt"
mount /dev/sda1 /mnt

# Create /mnt/$BACKUP_PATH folder
if [ ! -d /mnt/$BACKUP_PATH ]; then
  echo "[INFO] Creating /mnt/$BACKUP_PATH"
  mkdir -p /mnt/$BACKUP_PATH
fi

echo "[INFO] Copying all backups from /backup to /mnt/$BACKUP_PATH"
rsync -v /backup/*.tar /mnt/$BACKUP_PATH

echo "[INFO] Purging backups from /backup except for the last $MAX_LOCAL"
ls -tp /backup/*.tar | grep -v '/$' | tail -n +$MAX_LOCAL_PLUS1 | xargs -I {} rm -- "{}"

echo "[INFO] Purging backups from /mnt/$BACKUP_PATH except for the last $MAX_REMOTE"
ls -tp /mnt/$BACKUP_PATH/*.tar | grep -v '/$' | tail -n +$MAX_REMOTE_PLUS1 | xargs -I {} rm -- "{}"

# Flush pending writes and release the stick before the container exits
echo "[INFO] Unmounting /dev/sda1"
sync
umount /mnt
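The purge pipeline in run.sh (list newest first, skip the N newest, delete the rest) can be tried safely on throwaway files before trusting it with real snapshots:

```shell
#!/bin/sh
# Demo of the retention pipeline from run.sh on dummy files:
# keep the $KEEP newest *.tar files, delete everything older.
DIR=$(mktemp -d)
KEEP=2
for i in 1 2 3 4 5; do
  touch "$DIR/backup_$i.tar"
  sleep 1            # ensure distinct modification times
done

# Same pipeline as run.sh: list newest first (-t), skip the $KEEP
# newest, remove whatever remains.
ls -tp "$DIR"/*.tar | grep -v '/$' | tail -n +$((KEEP + 1)) | xargs -I {} rm -- "{}"

ls "$DIR"            # only backup_4.tar and backup_5.tar remain
```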

@NotoriousBDG Much appreciated!

I don’t have the knowledge to further develop this.
Unfortunately, I can’t get your addon to work.
Getting this error: “Error on Hass.io API: Addon not exists”


I have tried:
addon: copy_backups_to_sda1
addon: 'copy_backups_to_sda1'
addon: ['copy_backups_to_sda1']
addon: [COPY_BACKUPS_TO_SDA1]

Should this work as is?

And is it possible to add a default password to the service: hassio.snapshot_full?

(Hassio 0.66.1)

  action:
  - service: hassio.snapshot_full
    data_template:
      name: Automated Backup {{ now().strftime('%Y-%m-%d') }}
  - delay: '00:05:00'
  - service: hassio.addon_start
    data:
      addon: copy_backups_to_sda1
  - service: notify.me
    data_template:
      title: 'HA - Information'
      message: "Created backup named Automated Backup {{ now().strftime('%Y-%m-%d') }}"

The slug name you should use is in the url when you’re on the addon details page. For example, if the url is http://hassio.local:8123/hassio/addon/local_copy_backups_to_sda1, you’d use local_copy_backups_to_sda1 when you call the service.
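In other words, the slug is just the last path segment of that URL. A quick shell sanity check using POSIX parameter expansion:

```shell
#!/bin/sh
# Extract the add-on slug from the details-page URL
# (example URL from the post above).
URL="http://hassio.local:8123/hassio/addon/local_copy_backups_to_sda1"
SLUG=${URL##*/}    # strip everything up to and including the last '/'
echo "$SLUG"       # local_copy_backups_to_sda1
```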

It looks like the API supports it, but I haven’t used it myself. Try adding password: yourpassword to the snapshot call like this:

  - service: hassio.snapshot_full
    data_template:
      name: Automated Backup {{ now().strftime('%Y-%m-%d') }}
      password: 'yourpassword'

I believe this should work:

  action:
  - service: hassio.snapshot_full
    data_template:
      name: Automated Backup {{ now().strftime('%Y-%m-%d') }}
      password: 'yourpassword'
  - delay: '00:05:00'
  - service: hassio.addon_start
    data:
      addon: local_copy_backups_to_sda1
  - service: notify.me
    data_template:
      title: 'HA - Information'
      message: "Created backup named Automated Backup {{ now().strftime('%Y-%m-%d') }}"

If I connect to Hassio via SSH and type the mount command, this is the result:

> core-ssh:~# mount /dev/sda1 /mnt
> mount: mounting /dev/sda1 on /mnt failed: No such file or directory
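
(For anyone hitting the same error: BusyBox mount prints "No such file or directory" when either the device node or the mount point is missing inside the container, and the SSH add-on runs in its own container, which may not see /dev/sda1 even when the host does. A small diagnostic sketch; the paths are assumptions, adjust for your setup:)

```shell
#!/bin/sh
# Diagnostic sketch for "mounting /dev/sda1 on /mnt failed":
# check both prerequisites before attempting the mount.
check_mount_prereqs() {
  dev="$1"
  mnt="$2"
  if [ ! -d "$mnt" ]; then
    echo "missing mount point: $mnt"
    return 1
  fi
  if [ ! -b "$dev" ]; then
    echo "missing block device: $dev"
    return 1
  fi
  echo "prerequisites ok"
}

# If this prints "prerequisites ok", retry the mount (as root);
# otherwise the message says what is missing in this container.
check_mount_prereqs /dev/sda1 /mnt || true
```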

This is the hardware info:

core-ssh:~# hassio hardware info
{
    "result": "ok",
    "data": {
        "serial": [
            "/dev/ttyACM0",
            "/dev/ttyAMA0"
        ],
        "input": [],
        "disk": [
            "/dev/sda",
            "/dev/sda1"
        ],
        "gpio": [
            "gpiochip128",
            "gpiochip0"
        ],
        "audio": {
            "0": {
                "name": "bcm2835_alsa - bcm2835 ALSA",
                "type": "ALSA",
                "devices": [
                    {
                        "chan_id": "0",
                        "chan_type": "digital audio playback"
                    },
                    {
                        "chan_id": "1",
                        "chan_type": "digital audio playback"
                    }
                ]
            }
        }
    }
}

Why is this happening?

Does no one else have this problem?

Dear NotoriousBDG, can you help me with this addon?

Sorry, I don’t use hassio anymore. I can’t really offer any help beyond what I posted above when I was using it.

Thank you for your reply. Can I ask why you left HA? Did you choose another system, or was it for other personal reasons?

I didn’t leave Home Assistant. I’ve just moved to Home Assistant running in native Docker.

I’m having this problem as well. This used to work for me when I had hass.io running on ResinOS (somewhere around version ~0.71 – I moved to a VM for a while), but I’m not sure whether it is still supported on HassOS now that I’m back on the Pi.

I’m open to suggestions to get this to work, but I am exploring other options (since this addon is not being supported).

1. backing up directly to my NAS with the Remote Backup SCP addon: https://github.com/mr-bjerre/hassio-remote-backup

2. the Google Drive backup addon: https://github.com/samccauley/addon-hassiogooglebackup#readme

Hi, have you tested the second option you have mentioned, using Google Drive?
Thanks.

I mostly use the Dropbox snapshot backup these days to get the snapshot off my host machine.

Also, I run hassio on openmediavault, so the snapshots are stored on an SMB shared folder outside the Docker container, and I’m not as worried about losing that data (as opposed to an SD card on a Raspberry Pi).

ok, thank you.

Hi Guys,
We really need a good official solution to get the backups automated AND automatically off of the device.

I know this is a can of worms given the options for ‘destinations’, but something really does need to be done. I think a good clean start might be backing up to a NAS or other network share, with the IP/DNS name, username, and password easily configurable from the UI.

I strongly feel this is more than a feature request; it is a requirement for any modern device or appliance with complex configs, custom scripts, automations, and logging history.

There are many discussions on this topic in various threads, all with unsupported and/or incomplete examples, but I would only trust an official route on this important and necessary issue.

Many thanks guys/gals keep up the good work :wink:

For some months now I’ve been using this SAMBA add-on:

It works great, but it is complicated for users without networking knowledge. I wonder why this feature request does not have more votes. It would be such a simple backup option for everyone, especially with regard to “streamlining”…