Auto-delete backups

It would be nice to have a way to get old backups auto-deleted.

E.g., having an option so that only the last x backups of one type are kept, with the oldest ones deleted automatically, to keep the VM with HA small.

Since it gets auto-backed up with Proxmox every week, the VM backup gets bigger and bigger without any extra use.

The Samba Backup add-on does this.

So does the Google Drive backup add-on.

I assume that the cleanup is only done after a backup to G-Drive or SMB. I would love to get the cleanup done directly after HA makes a new backup.

Anyhow, searching HA add-ons and HACS, I did not find the Samba nor the Google backup add-on. Am I blind?

Those are great add-ons. I’ve used them both.

But I still agree this FR would be a nice addition to the native HA backup process.

Even better would be the ability to back up directly to a Samba share, like a NAS. Before anyone misunderstands: those two add-ons do not create the backup on the Samba share or G-Drive, they copy it there. It matters if your storage is limited or (like mine) on an SD card.

Voted.

With the new network storage options in HA v2023.6 this feature is more relevant than before. At the least, the SMB backup addon would become irrelevant and its job would move into native HA functionality with a blueprint or two for automations.

I’ve been a long-time user of the Samba Backup addon but am always a fan of native integrations. I moved over to native NFS-supported backups.

Today I ran into this issue, and I discovered that going into my NFS drive and cleaning up the old backups actually fixed it and allowed my backups to show up in Home Assistant again.

Having this natively supported would be very helpful!

I’d certainly appreciate this as an HA container (docker) user.

+1 I prefer to use native stuff and use an NFS mount as a backup folder. My nightly backup makes one local and one remote backup.

I can delete old backups easily enough locally with a shell_command, but I cannot find the mount point HA uses for the NFS share, so I am a bit stuck doing that one programmatically from HA - I have had to set something up on the NAS (I use OpenMediaVault).
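For the local half, a minimal sketch of such a keep-newest-N cleanup pipeline, demonstrated here against a scratch directory so nothing real is at risk (the path and the keep-count of 7 are assumptions, adjust to taste):

```shell
# Demonstrate the cleanup against a scratch directory with 10 dummy "backups"
mkdir -p /tmp/backup_demo && cd /tmp/backup_demo
for i in 1 2 3 4 5 6 7 8 9 10; do touch "backup_$i.tar"; done

# List entries newest-first, skip the 7 newest, delete the rest;
# -r stops xargs from running rm when there is nothing to delete
ls -A1t | tail -n +8 | xargs -r rm -v

ls -A1 | wc -l   # 7 files remain
```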

Google Drive backup and Samba Backup are great add-ons. I also used Samba Backup for a long time.

With the new network storage options in HA, this feature makes more sense than before. At the very least, it would make the SMB backup addon obsolete and transition into native HA functionality with a blueprint or two for automations.

And I would definitely prefer on-board tools over additional add-ons.

This is how I accomplish this functionality on a NAS share without add-ons:

Attach a share to the same folder your backups go to, but call it something different. I.e., Backups is what HA attaches as a backup-type network storage, and the same share on the NAS is also attached as a share type called Backups_Share.

Add this to your configuration.yaml:

command_line:
  - switch:
      name: Backups Cleanup
      # cd to the backup directory, list files newest-first, skip the
      # newest 29, and delete the rest (-r skips rm when the list is empty)
      command_on: 'cd /share/Backups_Share/ && ls -A1t | tail -n +30 | xargs -r rm -v'

Set up an automation to run the command_line switch daily.
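For reference, the daily trigger could look like this; a sketch assuming the switch above shows up as switch.backups_cleanup (adjust the entity id and time to match your setup):

```yaml
automation:
  - alias: Daily backups cleanup
    triggers:
      - trigger: time
        at: "04:00:00"
    actions:
      - action: switch.turn_on
        target:
          entity_id: switch.backups_cleanup
```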

nice.

I modified it to keep the last 6 backups for docker users:

shell_command:
  clean_backups: cd backups/ && ls -A1t | tail -n +7 | xargs -r rm

And, to run it:

- action: shell_command.clean_backups

The OneDriveBackup add-on does this also.

Add this to your configuration.yaml

shell_command:
  delete_old_backups: "cd /backup && ls -A1t | tail -n +8 | xargs -r rm -v"

and this to your automations.yaml

- id: "delete_old_backups"
  alias: Delete Old Backups
  description: "Automatically delete old backups, keeping the newest 7"
  triggers:
    - trigger: time
      at: "03:00:00" # Runs daily at 3 AM. Adjust the time as needed.
  actions:
    - action: shell_command.delete_old_backups
  mode: single
Running the shell command gives me:

    stdout: ""
    stderr: "/bin/sh: cd: line 0: can't cd to /backup: No such file or directory"
    returncode: 2

Do you know what is happening?

On the Advanced SSH add-on: [screenshot]

Do you see any files in that folder?
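If it helps, a small sketch for checking a path from the same shell context where the command runs (the /backup path is exactly the assumption to verify; check_dir is just a hypothetical helper name):

```shell
# check_dir: report whether a directory exists and list what it contains
check_dir() {
  if [ -d "$1" ]; then
    ls -A1 "$1"
  else
    echo "missing: $1"
  fi
}

check_dir /backup   # adjust to wherever your install maps the backup folder
```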

Hi all! I already have an automation that deletes older files from my Reolink cam. The cam puts the videos on the FTP add-on of my Home Assistant.
It works similarly to what you want to do with deleting older backup files. For my video files it works fine!

But I’m also interested in deleting old backups with an automation. I’m not sure, though, if it is a good plan to only delete the files on the disk. I mean, I can delete the files in the backup folder that are older than 3 months, for example:

[screenshot of the backup folder]

But as you can see, the naming is not clear. I would assume that HA saves the backups in a database when they are created and then assigns the appropriate names in the HA interface.

So if I now simply delete the file, is it also cleaned from the system and no longer shown there? Has anyone tried this before?
I could try it, but I don’t want to have problems afterwards that I could avoid :)

I have now tested it briefly myself. First I downloaded a .tar backup file. Inside it is a packed file, which is the backup itself, and the other file is a JSON file describing what the backup contains, including the name etc.
The file can simply be deleted from the backup folder and did not cause me any problems.