HASSIO Docker Overlay2 folder consuming all disk space

Hi Everyone,

Same problem here; I’ve been trying to find a solution, but so far nothing has worked.

Running
Server: Docker Engine - Community
Engine:
Version: 19.03.5
Hassio 0.108.6

The output from du -h /var/lib/docker | sort -h
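To see more quickly where the space actually goes, it can help to limit du to one directory level (the /var/lib/docker path is Docker’s default data root; adjust if yours differs):

```shell
# Summarize each top-level directory under the Docker data root
du -h -d1 /var/lib/docker 2>/dev/null | sort -h

# Then show the ten largest overlay2 layer directories
du -h -d1 /var/lib/docker/overlay2 2>/dev/null | sort -h | tail -n 10
```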

I also have this exact same problem… Noticed it on version 0.109.0.

My disk has filled up twice now. Killing the containers restores the disk space.

I’ll check if it’s the json.log next time.
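If you want to check the container logs without waiting for the disk to fill again, something like this should surface the biggest ones (the /var/lib/docker path is the default data root, an assumption; substitute yours if it differs):

```shell
# List the five largest JSON log files kept by the json-file log driver
du -h /var/lib/docker/containers/*/*-json.log 2>/dev/null | sort -h | tail -n 5

# Truncate a runaway log in place without restarting the container
# (substitute the actual path reported above for the placeholder)
truncate -s 0 /var/lib/docker/containers/<container-id>/<container-id>-json.log
```

Docker can also cap these logs itself via the json-file driver’s max-size and max-file options in /etc/docker/daemon.json, though that needs a Docker restart to take effect.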


Have you tried ha su repair?


For me, no. That command (ha su repair) is not available.

I’m working my way through space issues at the moment also… there doesn’t appear to be a single cause. I managed to access the “ha su repair” option by installing:
[Supervisor] - [Add-on store] - [Terminal & SSH]

This cleared up a little disk space

Same problem here. 20+ folders of up to 1 GB each. In every one, it seems the Python files take up basically all the space.
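One way to figure out which container owns a given overlay2 folder is to ask Docker for each container’s writable layer path (a sketch; the GraphDriver fields shown here are only populated when the overlay2 storage driver is in use):

```shell
# Print each container's name next to its writable overlay2 layer directory
docker ps -aq | xargs docker inspect \
  --format '{{ .Name }} {{ .GraphDriver.Data.UpperDir }}'
```

Matching those paths against the biggest folders from du tells you which container is responsible.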

This is my contribution to limiting the space used.

On the running HASSIO host machine:

Remove images not used by containers

docker image prune -a

Remove stopped containers created more than 24 hours ago

docker container prune --filter "until=24h"

Remove volumes not used by containers

docker volume prune

Check space used by logs

journalctl --disk-usage

Remove journald log files

journalctl --rotate
journalctl --vacuum-time=1m
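The steps above can be combined into a single cleanup script (a sketch; the prune commands permanently delete unused images, old stopped containers, and dangling volumes, so review what they will remove before running this unattended, and note -f skips the confirmation prompts):

```shell
#!/bin/sh
# Docker cleanup: unused images, old stopped containers, dangling volumes
docker image prune -a -f
docker container prune -f --filter "until=24h"
docker volume prune -f

# Journald cleanup: report usage, rotate the active journal, drop old files
journalctl --disk-usage
journalctl --rotate
journalctl --vacuum-time=1m
```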

Freed 10+ GB on my Pi 3 with this configuration:

arch armv7l
chassis embedded
dev false
docker true
hassio true
host_os HassOS 4.10
installation_type Home Assistant
os_name Linux
os_version 4.19.126-v7
python_version 3.7.7
supervisor 227
timezone Europe/Rome
version 0.110.6
virtualenv false

Same problem here. The overlay2 folder is >12 GB in size. What can be done to clean this up and prevent it from happening again in the future?

I just noticed it because all my commands started failing and my load was very high; no disk space was available anymore.


Look one post above yours.

That did not solve it …

Just checked mine. It’s 14GB.

Isn’t that like, a lot?
I mean, this means you can’t run the Supervised method on a 16 GB card on an RPi…

The recommendation is 32 GB.

I’m having the same issue on a Supervised installation on Ubuntu 18. Has anyone found a solution for this? I’ve tried pruning already.

I have had this issue with other Docker installs. What I discovered was that default Linux auto-partitioning limits the root partition size (where overlay2 lives) to a fixed amount (I think up to a max of 64 GB) and gives any excess to /home. So when installing Linux you should pay attention to your partition scheme and either allocate more to root than the auto-partitioner does, or just not create a separate /home partition at all.

There are also methods out there to extend your root partition after the fact, which I have used by allocating another disk to the VM. But it is better to just start from scratch with a Docker-friendly partition scheme, if you can, IMHO.
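For reference, on a typical Ubuntu VM with LVM the after-the-fact extension looks roughly like this (a sketch: the volume group name, logical volume name, device path, and ext4 filesystem are all assumptions; check yours with lvdisplay and df -T first):

```shell
# After attaching a new disk (here assumed to be /dev/sdb) to the VM:
pvcreate /dev/sdb                                # register it as an LVM physical volume
vgextend ubuntu-vg /dev/sdb                      # add it to the existing volume group
lvextend -l +100%FREE /dev/ubuntu-vg/ubuntu-lv   # grow the root LV into the new space
resize2fs /dev/mapper/ubuntu--vg-ubuntu--lv      # grow the ext4 filesystem online
```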

Thanks.
I actually found out that it was another docker container using up my disk space.

On my Hassio install that command is not found.
How were you able to run it?

arch aarch64
chassis embedded
dev false
docker true
docker_version 19.03.8
hassio true
host_os HassOS 4.11
installation_type Home Assistant OS
os_name Linux
os_version 4.19.127-v8
python_version 3.8.3
supervisor 229
timezone Europe/Kiev
version 0.113.0

Same problem here… I tried to update Home Assistant, and it filled my entire disk with more than 20 GB of data… I wasn’t able to log in to my OpenMediaVault server anymore…

Did you find something to solve this problem ?

Thanks

Edit: Found it!!! It was the Home Assistant database, the file home-assistant_v2.db, which had grown really heavy because it was storing sensor states at a huge refresh rate!
Plus the local snapshots of Home Assistant, which were each saving a copy of this huge database!
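If anyone else wants to check for the same thing, listing the biggest files under the Home Assistant data directory should surface both the database and any oversized snapshots (the /usr/share/hassio path is where a Supervised install typically keeps its data, an assumption; adjust for your setup):

```shell
# Show the ten largest files and directories under the Home Assistant data dir
du -ah /usr/share/hassio 2>/dev/null | sort -h | tail -n 10
```

The recorder integration also has a purge_keep_days option in configuration.yaml to keep the database from growing without bound.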

Same problem on HA 0.114 on Docker; in my case it is not caused by the HA database.