Automation to remove old backups

I have an automation set up to make nightly backups to the “backup” directory, but I want to limit the number of backups kept to 2, to save disk space.

Here’s what I’ve done.

Script in the config directory, named “remove_old_backups.py”:

import os

backup_directory = "/backup"  # Replace with the actual path to backup directory
backups_to_keep = 2  # Number of backups to keep

backup_files = [filename for filename in os.listdir(backup_directory) if filename.endswith(".tar") or filename.endswith(".tar.gz")]
backup_files.sort(key=os.path.getmtime, reverse=True)

for filename in backup_files[backups_to_keep:]:
    filepath = os.path.join(backup_directory, filename)
    os.remove(filepath)
    print(f"Deleted old backup: {filepath}")

New lines at the end of the configuration.yaml file:

# Remove old backups
shell_command:
  remove_old_backups: 'python /config/remove_old_backups.py'

Automation file:

alias: Remove Excess Backup
description: ""
trigger:
  - platform: time
    at: "03:00:00"
condition: []
action:
  - service: shell_command.remove_old_backups
    data: {}
mode: single

Trace result from running the automation:

Executed: 13 August 2023 at 14:45:16
Result:

params:
  domain: shell_command
  service: remove_old_backups
  service_data: {}
  target: {}
running_script: false

The old backups are not removed.
I have tried running the service directly from “Developer Tools” >> “Services”. A green tick appears, which I guess means the script ran correctly, but no backups are deleted.
I feel I should be able to get some more logs to track down the issue, but can’t find any…
Any ideas?
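One way to surface those logs: the logger integration can raise the log level for shell_command, so the script’s stdout, stderr, and any Python traceback show up in the Home Assistant log. A minimal sketch, assuming there is no existing logger: section in configuration.yaml:

```yaml
# In configuration.yaml — show shell_command output in the HA log
logger:
  default: warning
  logs:
    homeassistant.components.shell_command: debug
```

With this in place, a non-zero exit code from the script (e.g. an uncaught exception) is logged as an error, which should reveal what the script is actually doing.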

Doesn’t address your question directly, but the add-on Samba Backup makes scheduled backups and removes old ones if required.

@Stiltjack I might have to do that. I’m currently using the Remote Backup add-on to try to make local backups and copy them to a remote location. The remote bit isn’t working and neither is the remove-old-backups bit…

It’s a long time since I set mine up, but it makes a local backup, then makes a copy to my NAS before deleting any unwanted backups locally. The Synology NAS also automatically sends a copy to Dropbox.

Alter the script to this:

import os

backup_directory = "/backup"  # Replace with the actual path to backup directory
backups_to_keep = 2  # Number of backups to keep

backup_files = [os.path.join(backup_directory, filename) for filename in os.listdir(backup_directory) if filename.endswith((".tar", ".tar.gz"))]
backup_files.sort(key=os.path.getmtime, reverse=True)
print(backup_files)

for filepath in backup_files[backups_to_keep:]:
    os.remove(filepath)
    print(f"Deleted old backup: {filepath}")

then it works.

The reason is that the original script doesn’t put the directory path in front of the filenames in the list, so the sort fails because os.path.getmtime can’t find the files. If you execute the script from inside the backup directory it works, but when the automation runs it, with the script in the config directory, the working directory isn’t /backup, so it does not.
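To see why, here is a minimal, self-contained sketch (temporary directories stand in for /backup and /config) showing that os.path.getmtime resolves a bare filename against the current working directory:

```python
import os
import tempfile

# Illustrative setup: a fake backup directory containing one archive.
backup_dir = tempfile.mkdtemp()
open(os.path.join(backup_dir, "example.tar"), "w").close()

# Run from somewhere else, as the automation does (the script lives in /config).
os.chdir(tempfile.mkdtemp())

name = os.listdir(backup_dir)[0]  # bare filename, no directory part

try:
    os.path.getmtime(name)  # resolved against the CWD -> not found
except FileNotFoundError:
    print("bare filename fails outside the backup directory")

# The full path works from any working directory.
print(os.path.getmtime(os.path.join(backup_dir, name)) > 0)
```

This is why prefixing each filename with the directory (os.path.join is the usual way) makes both the sort and the os.remove call work regardless of where the script is launched from.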