High CPU usage after 0.113

Well, I don’t have the homekit integration installed, but it is showing up in the logs as being loaded. Maybe it loads if it discovers anything Apple on the network?

Log line is: Setting up stage 2: {'weather', 'group', 'sonos', 'input_boolean', 'tts', 'panel_custom', 'somfy_mylink', 'ssdp', 'zeroconf', 'cast', 'discovery', 'cover', 'roku', 'vesync', 'notion', 'script', 'synology_dsm', 'zwave', 'influxdb', 'stream', 'conversation', 'unifiprotect', 'media_player', 'rachio', 'hacs', 'sun', 'system_health', 'mqtt', 'persistent_notification', 'switch', 'logbook', 'homekit_controller', 'unifi', 'nest', 'history', 'ipp', 'sensor', 'automation', 'light', 'binary_sensor', 'tplink', 'updater'}

The ones in the Integrations UI are: VeSync, Google Cast, HACS, MQTT, Nest, Notion, Roku, Sonos, TP-Link Kasa Smart, Ubiquiti UniFi, UniFi Protect, and Z-Wave.

In addition to those, there are garadget and darksky modules loaded through the config.

I’ve sorted out the issue with homekit_controller (details -> https://github.com/home-assistant/core/issues/38668)

Please keep posting the py-spy record (SVG) and dump (text) output, as there is probably another integration that needs to be fixed.

Here you go. Gave it a few days for average CPU to steadily climb:

Thread 0x5608F5D338B0 (idle): "MainThread"
    select (selectors.py:468)
    _run_once (asyncio/base_events.py:1823)
    run_forever (asyncio/base_events.py:570)
    run_until_complete (asyncio/base_events.py:603)
    run (asyncio/runners.py:43)
    run (homeassistant/runner.py:119)
    main (homeassistant/__main__.py:312)
    <module> (homeassistant/__main__.py:320)
    _run_code (runpy.py:87)
    _run_module_as_main (runpy.py:194)
Thread 247 (idle): "Thread-1"
    dequeue (logging/handlers.py:1427)
    _monitor (logging/handlers.py:1478)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 248 (idle): "SyncWorker_0"
    _worker (concurrent/futures/thread.py:78)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 251 (idle): "SyncWorker_1"
    _worker (concurrent/futures/thread.py:78)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 252 (idle): "SyncWorker_2"
    _worker (concurrent/futures/thread.py:78)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 253 (idle): "SyncWorker_3"
    _worker (concurrent/futures/thread.py:78)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 254 (idle): "SyncWorker_4"
    _worker (concurrent/futures/thread.py:78)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 255 (idle): "SyncWorker_5"
    wait (threading.py:306)
    wait (zeroconf/__init__.py:2326)
    request (aioesphomeapi/host_resolver.py:44)
    resolve_host (aioesphomeapi/host_resolver.py:64)
    run (concurrent/futures/thread.py:57)
    _worker (concurrent/futures/thread.py:80)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 256 (idle): "Recorder"
    wait (threading.py:302)
    get (queue.py:170)
    run (homeassistant/components/recorder/__init__.py:340)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 0x7F334B954B20 (active): "zeroconf-Engine-257"
    update_record (zeroconf/__init__.py:1608)
    update_record (netdisco/mdns.py:21)
    update_record (zeroconf/__init__.py:2588)
    handle_response (zeroconf/__init__.py:2628)
    handle_read (zeroconf/__init__.py:1400)
    run (zeroconf/__init__.py:1309)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 258 (idle): "zeroconf-Reaper_258"
    wait (threading.py:306)
    wait (zeroconf/__init__.py:2326)
    run (zeroconf/__init__.py:1417)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 0x7F334B5CCB20 (idle): "bellows.thread_0"
    select (selectors.py:468)
    _run_once (asyncio/base_events.py:1823)
    run_forever (asyncio/base_events.py:570)
    _thread_main (bellows/thread.py:28)
    run (concurrent/futures/thread.py:57)
    _worker (concurrent/futures/thread.py:80)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 260 (idle): "Thread-6"
    loop (paho/mqtt/client.py:1163)
    loop_forever (paho/mqtt/client.py:1782)
    _thread_main (paho/mqtt/client.py:3428)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 261 (idle): "zeroconf-ServiceBrowser__googlecast._tcp.local._261"
    wait (threading.py:306)
    wait (zeroconf/__init__.py:2326)
    run (zeroconf/__init__.py:1655)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 264 (idle): "Thread-10"
    run_once (pychromecast/socket_client.py:561)
    run (pychromecast/socket_client.py:531)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 265 (idle): "Thread-11"
    run_once (pychromecast/socket_client.py:561)
    run (pychromecast/socket_client.py:531)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 266 (idle): "Thread-12"
    run_once (pychromecast/socket_client.py:561)
    run (pychromecast/socket_client.py:531)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 267 (idle): "Thread-13"
    run_once (pychromecast/socket_client.py:561)
    run (pychromecast/socket_client.py:531)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 268 (idle): "Thread-14"
    run_once (pychromecast/socket_client.py:561)
    run (pychromecast/socket_client.py:531)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 269 (idle): "Thread-15"
    run_once (pychromecast/socket_client.py:561)
    run (pychromecast/socket_client.py:531)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 270 (idle): "Thread-16"
    run_once (pychromecast/socket_client.py:561)
    run (pychromecast/socket_client.py:531)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 271 (idle): "Thread-17"
    run_once (pychromecast/socket_client.py:561)
    run (pychromecast/socket_client.py:531)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 272 (idle): "Thread-18"
    run_once (pychromecast/socket_client.py:561)
    run (pychromecast/socket_client.py:531)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 273 (idle): "Thread-19"
    initialize_connection (pychromecast/socket_client.py:428)
    _check_connection (pychromecast/socket_client.py:672)
    run_once (pychromecast/socket_client.py:554)
    run (pychromecast/socket_client.py:531)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 274 (idle): "Thread-20"
    run_once (pychromecast/socket_client.py:561)
    run (pychromecast/socket_client.py:531)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 275 (idle): "Thread-21"
    run_once (pychromecast/socket_client.py:561)
    run (pychromecast/socket_client.py:531)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 276 (idle): "Thread-22"
    run_once (pychromecast/socket_client.py:561)
    run (pychromecast/socket_client.py:531)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 281 (idle): "Thread-27"
    run_once (pychromecast/socket_client.py:561)
    run (pychromecast/socket_client.py:531)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 282 (idle): "Thread-28"
    run_once (pychromecast/socket_client.py:561)
    run (pychromecast/socket_client.py:531)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 283 (idle): "Thread-29"
    run_once (pychromecast/socket_client.py:561)
    run (pychromecast/socket_client.py:531)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 296 (idle): "job_queue"
    wait (threading.py:302)
    wait (threading.py:558)
    _main_loop (telegram/ext/jobqueue.py:284)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 297 (idle): "dispatcher"
    wait (threading.py:306)
    get (queue.py:179)
    start (telegram/ext/dispatcher.py:217)
    _thread_wrapper (telegram/ext/updater.py:156)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 299 (idle): "updater"
    read (ssl.py:1099)
    recv_into (ssl.py:1241)
    readinto (socket.py:669)
    _read_status (http/client.py:264)
    begin (http/client.py:303)
    getresponse (http/client.py:1332)
    _make_request (telegram/vendor/ptb_urllib3/urllib3/connectionpool.py:398)
    urlopen (telegram/vendor/ptb_urllib3/urllib3/connectionpool.py:614)
    urlopen (telegram/vendor/ptb_urllib3/urllib3/poolmanager.py:244)
    request_encode_body (telegram/vendor/ptb_urllib3/urllib3/request.py:148)
    request (telegram/vendor/ptb_urllib3/urllib3/request.py:68)
    _request_wrapper (telegram/utils/request.py:203)
    post (telegram/utils/request.py:307)
    get_updates (telegram/bot.py:1975)
    decorator (telegram/bot.py:65)
    polling_action_cb (telegram/ext/updater.py:278)
    _network_loop_retry (telegram/ext/updater.py:319)
    _start_polling (telegram/ext/updater.py:297)
    _thread_wrapper (telegram/ext/updater.py:156)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 300 (idle): "f0528f02-d625-43cf-9ab5-f91e7626b1c8_0"
    wait (threading.py:302)
    get (queue.py:170)
    _pooled (telegram/ext/dispatcher.py:154)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 301 (idle): "f0528f02-d625-43cf-9ab5-f91e7626b1c8_1"
    wait (threading.py:302)
    get (queue.py:170)
    _pooled (telegram/ext/dispatcher.py:154)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 304 (idle): "f0528f02-d625-43cf-9ab5-f91e7626b1c8_2"
    wait (threading.py:302)
    get (queue.py:170)
    _pooled (telegram/ext/dispatcher.py:154)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 305 (idle): "f0528f02-d625-43cf-9ab5-f91e7626b1c8_3"
    wait (threading.py:302)
    get (queue.py:170)
    _pooled (telegram/ext/dispatcher.py:154)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 307 (idle): "Thread-42"
    read_events (watchdog/observers/inotify_c.py:296)
    run (watchdog/observers/inotify_buffer.py:59)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 308 (idle): "Thread-3"
    wait (threading.py:302)
    get (watchdog/utils/delayed_queue.py:54)
    read_event (watchdog/observers/inotify_buffer.py:43)
    queue_events (watchdog/observers/inotify.py:129)
    run (watchdog/observers/api.py:146)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 309 (idle): "Thread-2"
    wait (threading.py:306)
    get (queue.py:179)
    dispatch_events (watchdog/observers/api.py:360)
    run (watchdog/observers/api.py:199)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 316 (idle): "Thread-50"
    run_once (pychromecast/socket_client.py:561)
    run (pychromecast/socket_client.py:531)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 317 (idle): "Thread-51"
    run_once (pychromecast/socket_client.py:561)
    run (pychromecast/socket_client.py:531)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 45982 (idle): "zeroconf-Engine-45982"
    run (zeroconf/__init__.py:1304)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 45983 (idle): "zeroconf-Reaper_45983"
    wait (threading.py:306)
    wait (zeroconf/__init__.py:2326)
    run (zeroconf/__init__.py:1417)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 255 (idle): "SyncWorker_5"
    wait (threading.py:306)
    wait (zeroconf/__init__.py:2326)
    request (aioesphomeapi/host_resolver.py:44)
    resolve_host (aioesphomeapi/host_resolver.py:64)
    run (concurrent/futures/thread.py:57)
    _worker (concurrent/futures/thread.py:80)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)

It looks like aioesphomeapi is starting another Zeroconf engine.

https://github.com/home-assistant/core/issues/38248#issuecomment-671098906 suggested adding logging when this is detected. Will see if we can get something added.

I’ll also look into getting aioesphomeapi to use the shared Zeroconf instance.
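As an aside, the shared-instance fix usually takes the shape below: the library accepts an optional, externally owned Zeroconf object and only creates (and later closes) its own when none is given. This is a dependency-free sketch of the pattern only; the class and attribute names are hypothetical, not the actual aioesphomeapi API.

```python
class Resolver:
    """Sketch of the shared-Zeroconf-instance pattern (hypothetical names)."""

    def __init__(self, zeroconf_instance=None):
        # If the caller (e.g. Home Assistant) passes its shared instance,
        # reuse it; otherwise fall back to creating a private one.
        self._owns_zc = zeroconf_instance is None
        self._zc = zeroconf_instance or self._make_own_instance()

    def _make_own_instance(self):
        # Stand-in for zeroconf.Zeroconf(), which would spawn its own
        # engine/listener threads -- exactly what showed up in the dump.
        return object()

    def close(self):
        # Only tear down the instance if we created it ourselves; never
        # close an instance the caller still owns.
        if self._owns_zc and hasattr(self._zc, "close"):
            self._zc.close()
```

With this shape, Home Assistant can hand every such library the one Zeroconf instance it already runs, instead of each library starting its own engine and reaper threads.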


Esphome changes



Another report here indicates the Ikea component may also have some bearing on the issue:

Similar issue when updating from 0.112.4 to 0.113.3 (HA Supervised).
The CPU usage rises over time; a reboot “resets” it.

Integrations:

  • Certificate Expiry
  • Google Cast
  • IFTTT
  • mobile_app
  • xiaomi_aqara

dump, top and record available here:

How many cast devices do you have?

14 devices and 11 entities according to the Google Cast integration. The majority of them are offline, powered on and connected to the network only when needed (Chromecast Audio/Video, Android TV set-top boxes); a few are always on (Google Home).

There isn’t anything that really stands out in the data. What % had CPU usage risen to when the data was captured?

It rises to about 30% after a day (which is when the data was captured). It’s usually around 3% after a restart of HA, and it used to stay around that value on 0.112.4.

Would you please capture another one when it gets to around 70-75%?

I had to roll back to 0.112.0; I was getting only about two hours of runtime before 0.113.3 became completely unresponsive. I’m running an HA Docker image with a Supervisor Docker image, the latest versions available at the time. I’m not running any of the components mentioned above as possible causes. I’ll do some more troubleshooting next week if the issue isn’t identified before then.

I’m having a similar experience. CPU usage started to climb on 0.113.0, and even removing almost all integrations didn’t help. I have now moved to 0.114.0, but the result is the same.

The Sonos integration seems to add quite a bit of CPU usage. But even with it removed, CPU usage stays quite high (~10-50% on the i7 while nothing is happening).

Py-spy results for the last few minutes, during which CPU usage was about 45%.

I’ve now increased logging to “info” just to make sure nothing is happening, but I still can’t see what Home Assistant is doing.

Some information about my installation:

  • HASS in Docker on Intel NUC (i7 CPU)
  • Recording to MariaDB
  • KNX and MQTT Integrations, iRobot + two iOS devices
  • HACS is installed but all custom integrations uninstalled (except for HACS itself)

I’m running out of ideas.

There wasn’t anything apparent in the py-spy recording. A thread dump from py-spy might reveal whether there is an integration creating a lot of threads.

This is the output of the dump command. I executed it at the moment CPU usage was at about 80%.

About a minute later the CPU usage went back down to about 10%. It looks like I get a spike of high CPU usage roughly every 2 minutes, and at the same time I see an increase in network traffic (incoming and outgoing).

On MQTT I have an OpenMQTTGateway, one Zigbee2MQTT gateway, and 5 Tasmota devices. Not much going on there.
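Since the spikes seem to recur every couple of minutes, one way to line them up against the network traffic is to timestamp CPU usage from outside Home Assistant. A stdlib-only sketch (Linux only, reading /proc; you would point `watch()` at the homeassistant PID):

```python
import os
import time

def cpu_seconds(pid):
    """Total CPU time (user + system) consumed by a process, in seconds.
    Reads /proc/<pid>/stat, so this is Linux-only."""
    with open(f"/proc/{pid}/stat") as f:
        stat = f.read()
    # The comm field (2) may contain spaces, so split after its closing
    # paren; utime and stime are fields 14 and 15 of the stat line.
    rest = stat.rsplit(")", 1)[1].split()
    return (int(rest[11]) + int(rest[12])) / os.sysconf("SC_CLK_TCK")

def watch(pid, interval=10):
    """Print a timestamped CPU% line every `interval` seconds, so spikes
    can be matched against network traffic or log activity."""
    last = cpu_seconds(pid)
    while True:
        time.sleep(interval)
        now = cpu_seconds(pid)
        print(f"{time.strftime('%H:%M:%S')} {100 * (now - last) / interval:5.1f}%")
        last = now
```

A log like this, kept over an hour, would show whether the ~2-minute period is exact, which is a useful clue about which polling interval is responsible.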

Process 5995: python3 -m homeassistant --config /config
Python v3.8.3 (/usr/local/bin/python3.8)

Thread 0x560CE284B8B0 (idle): "MainThread"
    select (selectors.py:468)
    _run_once (asyncio/base_events.py:1823)
    run_forever (asyncio/base_events.py:570)
    run_until_complete (asyncio/base_events.py:603)
    run (asyncio/runners.py:43)
    run (homeassistant/runner.py:119)
    main (homeassistant/__main__.py:312)
    <module> (homeassistant/__main__.py:320)
    _run_code (runpy.py:87)
    _run_module_as_main (runpy.py:194)
Thread 0x7FEA28243B20 (active): "Thread-1"
    dequeue (logging/handlers.py:1427)
    _monitor (logging/handlers.py:1478)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 0x7FEA2813FB20 (active): "SyncWorker_0"
    _worker (concurrent/futures/thread.py:78)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 0x7FEA272FFB20 (active): "SyncWorker_1"
    _worker (concurrent/futures/thread.py:78)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 0x7FEA269BFB20 (active): "SyncWorker_2"
    _worker (concurrent/futures/thread.py:78)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 0x7FEA263FFB20 (active): "SyncWorker_3"
    _worker (concurrent/futures/thread.py:78)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 0x7FEA25DFFB20 (active): "SyncWorker_4"
    scan (netdisco/ssdp.py:259)
    run (concurrent/futures/thread.py:57)
    _worker (concurrent/futures/thread.py:80)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 0x7FEA257FFB20 (active): "SyncWorker_5"
    _worker (concurrent/futures/thread.py:78)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 0x7FEA251FFB20 (active): "SyncWorker_6"
    _worker (concurrent/futures/thread.py:78)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 0x7FEA24BFFB20 (active): "SyncWorker_7"
    _worker (concurrent/futures/thread.py:78)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 0x7FEA23EBFB20 (idle): "Recorder"
    wait (threading.py:302)
    get (queue.py:170)
    run (homeassistant/components/recorder/__init__.py:338)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 0x7FEA237FFB20 (active): "Thread-2"
    dequeue (logging/handlers.py:1427)
    _monitor (logging/handlers.py:1478)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 0x7FEA216D9B20 (active): "zeroconf-Engine-275"
    update_record (zeroconf/__init__.py:1612)
    update_record (netdisco/mdns.py:21)
    update_record (zeroconf/__init__.py:2622)
    handle_response (zeroconf/__init__.py:2662)
    handle_read (zeroconf/__init__.py:1405)
    run (zeroconf/__init__.py:1314)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 0x7FEA215D5B20 (idle): "zeroconf-Reaper_276"
    wait (threading.py:306)
    wait (zeroconf/__init__.py:2360)
    run (zeroconf/__init__.py:1422)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 0x7FEA1FDD5B20 (active): "Thread-6"
    loop (paho/mqtt/client.py:1163)
    loop_forever (paho/mqtt/client.py:1782)
    _thread_main (paho/mqtt/client.py:3428)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)
Thread 0x7FEA1A4EDB20 (active): "Thread-7"
    loop (paho/mqtt/client.py:1163)
    loop_forever (paho/mqtt/client.py:1782)
    _thread_main (paho/mqtt/client.py:3428)
    run (threading.py:870)
    _bootstrap_inner (threading.py:932)
    _bootstrap (threading.py:890)

In this case it would be best to get a py-spy recording with a duration of at least 120 seconds.

I have:
Home Assistant 0.114.1

arch aarch64
chassis embedded
dev false
docker true
docker_version 19.03.8
hassio true
host_os HassOS 5.0
installation_type Home Assistant OS
os_name Linux
os_version 4.19.127-v8
python_version 3.8.3
supervisor 234

Using System Monitor platform:
pi4

@bdraco one thing I have noticed is that the CPU utilisation climbs a fraction higher with each and every event from any platform. Could this be a result of the recent listener optimisations?

That code is very straightforward, so it’s not that. I wish it were, as then we wouldn’t be waiting on upstream merges to fix the known ones.
