Disk almost full

Thanks for the response @KennethLavrsen. I already sync the snapshots to Dropbox. How can I clean up hassio and free up disk space?

With respect to snapshots: they stay in place even when you copy them to Dropbox, so a copy is left behind on HA. You have to delete them from the HA machine. How many old snapshots do you have?

They are stored in /backup as seen from inside hassio, and you can delete them one by one from the HA user interface. In my HA each backup snapshot is 50 MB, so just 20 of those and a gigabyte is gone.
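If you want to check what is there from the SSH add-on first, something along these lines should do (a minimal sketch; /backup is the path mentioned above):

# list the snapshots and their sizes as seen from inside hassio
ls -lh /backup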


I only have one backup left (I deleted the rest manually).

Disk usage remains the same, it seems.

When I run the df -h command, I get this output showing my 91% usage:

[screenshot: df -h output showing 91% usage]

I really cannot get a hold of where it is coming from (I have been through the directories) but cannot find it. Maybe it is one level up and I am looking inside a container now?

If you run du -h --max-depth=1 in the root directory ( / ), it should show you which directory is taking up so much space. For instance, if you see /var taking up 23 GB, then run the same command inside /var to see where the space is going within it.

While it is an overly manual process, this will at least show you which directory or directories are taking up so much space. From there you can narrow down your troubleshooting to see why something like multiple dockers or your log files (both just examples) are eating up so much space.
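As a concrete sketch of that drill-down (piping through sort -h assumes a GNU sort; it just puts the biggest directories last):

# size of each top-level directory
du -h --max-depth=1 / 2>/dev/null | sort -h
# then repeat inside the largest one, e.g. /var
du -h --max-depth=1 /var 2>/dev/null | sort -h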

Thanks for your response @jeffwcollins, much appreciated. Somehow the --max-depth=1 option is not supported:

[screenshot: du reporting the --max-depth option as unrecognized]

If I run this command, the output is as follows (I still do not see where the space is going):

[screenshot: du output]
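That error usually means a BusyBox du, which does not know the GNU long option. Two things worth trying (BusyBox -d support is an assumption about your build):

# BusyBox short-flag equivalent of --max-depth=1
du -h -d 1 /
# portable fallback: summarize each top-level directory
du -sh /* 2>/dev/null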

Your /var directory should be MUCH bigger than that, assuming your dockers are hosted under there.

du might not be traversing your running images, but to check, try running:
docker system df -v

Output of this command:

One more suggestion on top of the reference above: can you add “-a” to the docker system prune command?

That will prune all images that don’t have at least one container tied to them. In layman’s terms, it prunes a bit deeper than the standard docker system prune command.
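For reference, a sketch of the two variants side by side (both are standard Docker CLI flags):

# removes stopped containers, dangling images, unused networks and build cache
docker system prune
# additionally removes every image without at least one container tied to it
docker system prune -a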


That worked a little bit @jeffwcollins, now 85% usage instead of 93%, so apparently there is more:

[screenshot: df -h output showing 85% usage]

Still got 23 GB hiding somewhere:

[screenshots: du output]

Are you running these commands via ssh or on the host itself? If via ssh, your “du” commands are running inside the ssh docker container and are not showing the actual usage of the host.

I am executing these commands via SSH. Hassio is running on a Pi; I actually don’t know how to connect directly to the host.

Here you go, this will help get you to the HOST OS instead of just SSH’ing into the docker container.

I am by NO means a pro at Hassio, so forgive me for redirecting you to another post about this. I have been a lurker for years and have picked up a thing or two about these systems, and I also know a bit about docker, so that helps. Don’t take this as punting you to another person/post; it just explains it better than I can.


Your info is much appreciated @jeffwcollins, and at least you found something; I didn’t (noob).

So, I am not sure when to enter the login command. I assume they mean at the CLI screen:

[screenshots: the CLI login screen]

After authentication, this is what happens on the screen:

[screenshot: terminal output after login]

Did you add the ssh key?

HassOS-based Hass.io

Use a USB drive formatted with FAT, ext4, or NTFS and name it CONFIG (case sensitive). Create an authorized_keys file (no extension) containing your public key and place it in the root of the USB drive. The file needs to be ANSI encoded (not UTF-8) and must have Unix line ends (LF), not Windows (CR LF). See the Generating SSH Keys section below if you need help generating keys. From the UI, navigate to the hass.io system page and choose “Import from USB”. You can now access your device as root over SSH on port 22222. Alternatively, the file will be imported from the USB when the hass.io device is rebooted.

Make sure when you are copying the public key to the root of the USB drive that you rename the file correctly to authorized_keys with no .pub file extension.
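A sketch of the key preparation on a Mac (the /Volumes/CONFIG mount path and the key file name are assumptions; adjust to your setup):

# generate a key pair if you don't have one yet
ssh-keygen -t ed25519 -f ~/.ssh/hassio_key
# copy the PUBLIC half to the USB drive named CONFIG, dropping the .pub extension
cp ~/.ssh/hassio_key.pub /Volumes/CONFIG/authorized_keys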

You should then be able to SSH into your Hass.io device. On Mac/Linux, use the following (hassio.local is an assumption about your hostname; substitute your device’s IP if needed):

ssh root@hassio.local -p 22222

You will initially be logged in to the Hass.io CLI for HassOS, where you can perform normal CLI functions. If you need access to the host system, use the ‘login’ command. Hass.io OS is a hypervisor for Docker. See the Hass.io Architecture documentation for information regarding the Hass.io supervisor. The supervisor offers an API to manage the host and the running Docker containers. Home Assistant itself and all installed add-ons run in separate Docker containers.
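So once you are at that CLI, a minimal sketch of getting to the host and re-running the disk checks there (treat /mnt/data as an assumption about where HassOS keeps its Docker data):

# drop from the Hass.io CLI to the host shell
login
# these now report the real usage of the SD card, not a container's view
df -h
du -h -d 1 /mnt/data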

Uh, nope. Let me give it a try, find a USB key, and put it on there.

I am not able to solve this. Disk capacity shows 91.5% used, and I still don’t know where it is coming from. Help is more than welcome, thanks!

[screenshot: df -h output showing 91.5% usage]

I really cannot figure this out somehow. I’m lost at the SSH keys thing (where to collect or how to generate them). Help needed, because I’m running at 99.9% :frowning:

I found the root cause of the large files (by mounting the SD card on my Mac and using this article: https://www.jeffgeerling.com/blog/2017/mount-raspberry-pi-sd-card-on-mac-read-only-osxfuse-and-ext4fuse).
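For anyone following along, the gist of that article, heavily abbreviated (the commands are paraphrased from the article and may be outdated; the partition identifier varies, so check diskutil list first):

# install FUSE for macOS and an ext4 driver via Homebrew
brew cask install osxfuse
brew install ext4fuse
# find the SD card's ext4 data partition, then mount it read-only
diskutil list
sudo mkdir /Volumes/rpi
sudo ext4fuse /dev/disk2s2 /Volumes/rpi -o allow_other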

[screenshots: large Unifi database files and tmp repair files on the data partition]

It is definitely the Unifi database and the tmp database repair files. So now it is about pruning the database from hassio; how can I do that? Can I get shell access to the docker container? Unfortunately I cannot remove the files directly from the SD card via the Mac.
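As a sketch of one way to get a shell inside the add-on container, assuming you can reach the host over port 22222 as described above (the grep pattern and container name are placeholders):

# find the Unifi add-on container
docker ps | grep -i unifi
# open a shell inside it (sh, since the image may not ship bash)
docker exec -it <unifi_container_name> /bin/sh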


Did you find a solution to this? I really need to prune my Unifi library from hassio.