Home Assistant Supervised (previously known as Hass.io) on Synology DSM as a native package (not supported or working at the moment)

I guess, then, that DSM versions before 6.2 don't work? Why use this instead of just using the ordinary image on Docker?

If you don’t know what you’re doing with Watchtower, disable it. It has to be configured so that it doesn’t touch (update) any hass.io-related container, otherwise it could break your install.
If you want to keep it, configure it to update only specific containers of yours that are not the homeassistant, hassio_ or addon containers.
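If you go that route, something along these lines should work - a minimal sketch, assuming Watchtower runs as its own container and is given an explicit list of containers it is allowed to touch (my_nginx and my_portainer are just placeholders for your own containers):

    # Only the containers named as arguments are monitored/updated by Watchtower,
    # so the homeassistant, hassio_* and addon_* containers are left alone.
    docker run -d \
      --name watchtower \
      -v /var/run/docker.sock:/var/run/docker.sock \
      containrrr/watchtower \
      my_nginx my_portainer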

There is a difference between Home Assistant Core, which you mention, and Home Assistant Supervised (the hass.io package) - you can check it in the documentation.
If you don’t know what the difference is, it’s better to read up first and understand the installation/version options.

Have you turned on Advanced Mode under your User Profile?

You get more options when it’s turned on.

This: https://www.home-assistant.io/docs/ ? It doesn’t say much about Supervised, or in what context you use that word.
I have not mentioned Home Assistant Core - that’s a new term to me. Also, Blueprint was not there when I tried the Raspberry Pi image.

This “ordinary image”, as you called it, is Home Assistant Core - see Home Assistant vs. Home Assistant Core - Home Assistant.

Hass.io (you can still see the name in the URL above) is Home Assistant Supervised; it is also called just Home Assistant (without “Core”).

The Core version is a Docker image that runs only that - the main core program.
hassio = Supervised = HA - this runs the whole stack (minus the OS in this case): HA, the Supervisor, DNS, add-ons, …
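If you want to see that split for yourself, something like this over SSH on the NAS should list only the containers that belong to the Supervised stack (a sketch, assuming the default container names):

    # List the running containers whose names mark them as part of the stack.
    docker ps --format '{{.Names}}' | grep -E '^(homeassistant|hassio_|addon_)'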

I ran this docker network recreation script yesterday on my DSM 6.2.3-25426 and now Home Assistant as well as the Core SSH addon don’t seem to be able to resolve domains anymore. “ping 8.8.8.8” works fine, “ping www.google.com” doesn’t. All HA integrations fail to fetch their data…
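(In case anyone wants to reproduce that check from an SSH session on the NAS - a rough sketch, assuming the container is named homeassistant:)

    # Raw IP connectivity works...
    docker exec homeassistant ping -c 1 8.8.8.8
    # ...but name resolution inside the container fails.
    docker exec homeassistant ping -c 1 www.google.com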

Uninstalling the Synology package and reinstalling it fixed it within a few minutes. All data was still there, of course. I will not run the script again :slight_smile:

Update: now the same issues happen again, but this time it’s obvious that the CoreDNS plugin cannot start. See the logs:

2020-12-22 08:40:51    stderr    20-12-22 08:40:51 ERROR (MainThread) [supervisor.misc.tasks] CoreDNS watchdog reanimation failed!
2020-12-22 08:40:51    stderr    20-12-22 08:40:51 ERROR (MainThread) [supervisor.plugins.dns] Can't start CoreDNS plugin
2020-12-22 08:40:51    stderr    20-12-22 08:40:51 ERROR (SyncWorker_2) [supervisor.docker] Can't start hassio_dns: 403 Client Error for http+docker://localhost/v1.39/containers/e3f30845796f2c171e14b090d607466b146ac19aa876536f42ff54efb049b9d7/start: Forbidden ("Address already in use")
2020-12-22 08:40:49    stderr    20-12-22 08:40:48 INFO (SyncWorker_2) [supervisor.docker.interface] Cleaning hassio_dns application
2020-12-22 08:40:48    stderr    20-12-22 08:40:48 INFO (MainThread) [supervisor.plugins.dns] Starting CoreDNS plugin
2020-12-22 08:40:48    stderr    20-12-22 08:40:48 WARNING (MainThread) [supervisor.misc.tasks] Watchdog found a problem with CoreDNS plugin!
2020-12-22 08:40:18    stderr    20-12-22 08:40:18 ERROR (MainThread) [supervisor.misc.tasks] CoreDNS watchdog reanimation failed!
2020-12-22 08:40:18    stderr    20-12-22 08:40:18 ERROR (MainThread) [supervisor.plugins.dns] Can't start CoreDNS plugin
2020-12-22 08:40:18    stderr    20-12-22 08:40:18 ERROR (SyncWorker_0) [supervisor.docker] Can't start hassio_dns: 403 Client Error for http+docker://localhost/v1.39/containers/5d79cdb6ce97d72f609064f59c8912781b7e8fd1fac87bb398b92cea78b44772/start: Forbidden ("Address already in use")
2020-12-22 08:40:16    stderr    20-12-22 08:40:16 INFO (SyncWorker_0) [supervisor.docker.interface] Cleaning hassio_dns application
2020-12-22 08:40:16    stderr    20-12-22 08:40:16 INFO (MainThread) [supervisor.plugins.dns] Starting CoreDNS plugin
2020-12-22 08:40:15    stderr    20-12-22 08:40:15 WARNING (MainThread) [supervisor.misc.tasks] Watchdog found a problem with CoreDNS plugin!
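In case it helps anyone hitting the same thing: the error says hassio_dns cannot start because its address on the internal hassio network is already taken. A rough sketch of how one might look for whatever is holding that address (container and network names assumed from a default install):

    # See which containers hold which IPs on the internal hassio network.
    docker network inspect hassio
    # Look for a stale/exited hassio_dns container that may still own the address.
    docker ps -a | grep hassio_dns
    # If an old hassio_dns container is lingering, removing it lets the
    # Supervisor watchdog recreate it:
    # docker rm -f hassio_dns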

Thanks for providing this package!

After setting it up, I can see various errors in the Docker logs, e.g. for homeassistant:

|date               |stream|content|
|2020-12-22 15:06:01|stderr|2020-12-22 16:06:01 ERROR (MainThread) [homeassistant.components.hassio.http] Client error on api app/entrypoint.js request Cannot connect to host 172.30.32.2:80 ssl:default [Connect call failed ('172.30.32.2', 80)]|
|2020-12-22 15:05:46|stderr|2020-12-22 16:05:46 ERROR (MainThread) [homeassistant.components.hassio.http] Client error on api app/entrypoint.js request Cannot connect to host 172.30.32.2:80 ssl:default [Connect call failed ('172.30.32.2', 80)]|
|2020-12-22 15:05:20|stderr|2020-12-22 16:05:20 ERROR (MainThread) [homeassistant.components.hassio.discovery] Can't read discover info: |
|2020-12-22 15:05:20|stderr|2020-12-22 16:05:20 ERROR (MainThread) [homeassistant.components.hassio.handler] Client error on /discovery request Cannot connect to host 172.30.32.2:80 ssl:default [Connect call failed ('172.30.32.2', 80)]|
|2020-12-22 15:05:18|stderr|TypeError: 'NoneType' object is not subscriptable|
|2020-12-22 15:05:18|stderr|    newest = core_info["version_latest"]|
|2020-12-22 15:05:18|stderr|  File "/usr/src/homeassistant/homeassistant/components/updater/__init__.py", line 82, in check_new_version|
|2020-12-22 15:05:18|stderr|    return await self.update_method()|
|2020-12-22 15:05:18|stderr|  File "/usr/src/homeassistant/homeassistant/helpers/update_coordinator.py", line 132, in _async_update_data|
|2020-12-22 15:05:18|stderr|    self.data = await self._async_update_data()|
|2020-12-22 15:05:18|stderr|  File "/usr/src/homeassistant/homeassistant/helpers/update_coordinator.py", line 144, in async_refresh|
|2020-12-22 15:05:18|stderr|Traceback (most recent call last):|
|2020-12-22 15:05:18|stderr|2020-12-22 16:05:18 ERROR (MainThread) [homeassistant.components.updater] Unexpected error fetching Home Assistant update data: 'NoneType' object is not subscriptable|
|2020-12-22 15:05:17|stderr|2020-12-22 16:05:17 ERROR (MainThread) [homeassistant.components.hassio.addon_panel] Can't read panel info: |
|2020-12-22 15:05:17|stderr|2020-12-22 16:05:17 ERROR (MainThread) [homeassistant.components.hassio.handler] Client error on /ingress/panels request Cannot connect to host 172.30.32.2:80 ssl:default [Connect call failed ('172.30.32.2', 80)]|
|2020-12-22 15:05:14|stderr|2020-12-22 16:05:13 WARNING (MainThread) [homeassistant.components.hassio] Can't read last version: |
|2020-12-22 15:05:14|stderr|2020-12-22 16:05:13 ERROR (MainThread) [homeassistant.components.hassio.handler] Client error on /info request Cannot connect to host 172.30.32.2:80 ssl:default [Connect call failed ('172.30.32.2', 80)]|
|2020-12-22 15:05:10|stderr|2020-12-22 16:05:10 ERROR (MainThread) [homeassistant.components.hassio.handler] Client error on /supervisor/options request Cannot connect to host 172.30.32.2:80 ssl:default [Connect call failed ('172.30.32.2', 80)]|
|2020-12-22 15:05:07|stderr|2020-12-22 16:05:07 ERROR (MainThread) [homeassistant.components.hassio.handler] Client error on /homeassistant/options request Cannot connect to host 172.30.32.2:80 ssl:default [Connect call failed ('172.30.32.2', 80)]|

Also, I run into the “Unable to load the panel source: /api/hassio/app/entrypoint.js” issue - even after trying your fix.

My guess: it conflicts with my “freeing” of port 80 on the Synology (see http://tonylawrence.com/posts/unix/synology/freeing-port-80/ ), which I did so that I can access it with the Philips Hue app? So I probably need to roll back this change? Or is there any way to use a port other than 80 via some configuration?
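For what it’s worth, one way to check what is actually bound to port 80 on the DiskStation (a sketch, run over SSH as root; whether netstat or ss is available may vary between DSM versions). If nginx (DSM’s own web server) shows up there, the port-80 tweak is still in effect:

    # Show the process listening on port 80, if any.
    sudo netstat -tlnp | grep ':80 '
    # Alternative, if ss is available:
    sudo ss -tlnp | grep ':80 '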

I love the package, but I can’t get it to install properly anymore. I updated Docker to 20.10.1, then installed the package, and nothing works.

Has anyone else done the same and gotten it to work?

I have rolled back the change that moved my Synology off port 80. Still no change, and the same error message.

I can see the following in the hassio_supervisor log in addition to the previous log output:

|date               |stream|content|
|2020-12-23 16:07:12|stdout|[s6-finish] sending all processes the TERM signal.|
|2020-12-23 16:07:12|stdout|[s6-finish] waiting for services.|
|2020-12-23 16:07:12|stdout|[cont-finish.d] done.|
|2020-12-23 16:07:12|stdout|[cont-finish.d] executing container finish scripts...|
|2020-12-23 16:07:12|stdout|client_session: <aiohttp.client.ClientSession object at 0x7f04ec4639a0>|
|2020-12-23 16:07:12|stdout|20-12-23 16:07:12 ERROR (MainThread) [asyncio] Unclosed client session|
|2020-12-23 16:07:12|stdout|client_session: <aiohttp.client.ClientSession object at 0x7f04ec463730>|
|2020-12-23 16:07:12|stdout|20-12-23 16:07:12 ERROR (MainThread) [asyncio] Unclosed client session|
|2020-12-23 16:07:12|stdout|docker.errors.DockerException: Error while fetching server API version: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))|
|2020-12-23 16:07:12|stdout|    raise DockerException(|
|2020-12-23 16:07:12|stdout|  File "/usr/local/lib/python3.8/site-packages/docker/api/client.py", line 221, in _retrieve_server_version|
|2020-12-23 16:07:12|stdout|    self._version = self._retrieve_server_version()|
|2020-12-23 16:07:12|stdout|  File "/usr/local/lib/python3.8/site-packages/docker/api/client.py", line 197, in __init__|
|2020-12-23 16:07:12|stdout|    self.api = APIClient(*args, **kwargs)|
|2020-12-23 16:07:12|stdout|  File "/usr/local/lib/python3.8/site-packages/docker/client.py", line 45, in __init__|
|2020-12-23 16:07:12|stdout|    self.docker: docker.DockerClient = docker.DockerClient(|
|2020-12-23 16:07:12|stdout|  File "/usr/src/supervisor/supervisor/docker/__init__.py", line 87, in __init__|
|2020-12-23 16:07:12|stdout|    self._docker: DockerAPI = DockerAPI()|
|2020-12-23 16:07:12|stdout|  File "/usr/src/supervisor/supervisor/coresys.py", line 63, in __init__|
|2020-12-23 16:07:12|stdout|    coresys = CoreSys()|
|2020-12-23 16:07:12|stdout|  File "/usr/src/supervisor/supervisor/bootstrap.py", line 60, in initialize_coresys|
|2020-12-23 16:07:12|stdout|    return future.result()|
|2020-12-23 16:07:12|stdout|  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete|
|2020-12-23 16:07:12|stdout|    coresys = loop.run_until_complete(bootstrap.initialize_coresys())|
|2020-12-23 16:07:12|stdout|  File "/usr/src/supervisor/supervisor/__main__.py", line 41, in <module>|
|2020-12-23 16:07:12|stdout|    exec(code, run_globals)|
|2020-12-23 16:07:12|stdout|  File "/usr/local/lib/python3.8/runpy.py", line 87, in _run_code|
|2020-12-23 16:07:12|stdout|    return _run_code(code, main_globals, None,|
|2020-12-23 16:07:12|stdout|  File "/usr/local/lib/python3.8/runpy.py", line 194, in _run_module_as_main|
|2020-12-23 16:07:12|stdout|Traceback (most recent call last):|
|2020-12-23 16:07:12|stdout||
|2020-12-23 16:07:12|stdout|During handling of the above exception, another exception occurred:|
|2020-12-23 16:07:12|stdout||
|2020-12-23 16:07:12|stdout|requests.exceptions.ConnectionError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))|
|2020-12-23 16:07:12|stdout|    raise ConnectionError(err, request=request)|
|2020-12-23 16:07:12|stdout|  File "/usr/local/lib/python3.8/site-packages/requests/adapters.py", line 498, in send|
|2020-12-23 16:07:12|stdout|    r = adapter.send(request, **kwargs)|
|2020-12-23 16:07:12|stdout|  File "/usr/local/lib/python3.8/site-packages/requests/sessions.py", line 655, in send|
|2020-12-23 16:07:12|stdout|    resp = self.send(prep, **send_kwargs)|
|2020-12-23 16:07:12|stdout|  File "/usr/local/lib/python3.8/site-packages/requests/sessions.py", line 542, in request|
|2020-12-23 16:07:12|stdout|    return self.request('GET', url, **kwargs)|
|2020-12-23 16:07:12|stdout|  File "/usr/local/lib/python3.8/site-packages/requests/sessions.py", line 555, in get|
|2020-12-23 16:07:12|stdout|    return self.get(url, **self._set_request_timeout(kwargs))|
|2020-12-23 16:07:12|stdout|  File "/usr/local/lib/python3.8/site-packages/docker/api/client.py", line 237, in _get|
|2020-12-23 16:07:12|stdout|    return f(self, *args, **kwargs)|
|2020-12-23 16:07:12|stdout|  File "/usr/local/lib/python3.8/site-packages/docker/utils/decorators.py", line 46, in inner|
|2020-12-23 16:07:12|stdout|    return self._result(self._get(url), json=True)|
|2020-12-23 16:07:12|stdout|  File "/usr/local/lib/python3.8/site-packages/docker/api/daemon.py", line 181, in version|
|2020-12-23 16:07:12|stdout|    return self.version(api_version=False)["ApiVersion"]|
|2020-12-23 16:07:12|stdout|  File "/usr/local/lib/python3.8/site-packages/docker/api/client.py", line 214, in _retrieve_server_version|
|2020-12-23 16:07:12|stdout|Traceback (most recent call last):|
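If I’m reading that traceback correctly, the DockerException / FileNotFoundError means the Supervisor cannot reach the Docker socket from inside its container. A rough sketch of how one might check that the socket exists on the host and is mounted into the container (container name assumed to be hassio_supervisor):

    # The socket should exist on the host...
    ls -l /var/run/docker.sock
    # ...and appear among the mounts of the supervisor container.
    docker inspect hassio_supervisor --format '{{json .Mounts}}'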

Hi, I suddenly have a problem with Synology and zigbee2mqtt. It seems to be related to the latest version of Hass.io. Are you having the same problem? I’m running a CC2531 with the official zigbee2mqtt add-on.

What kind of problem? At first glance I don’t see how those two issues could be related to this package.

Can’t help you - Docker 20.10.1 is not supported on Synology AFAIK. You’ve brought a third unsupported item into the equation :slight_smile:

Anybody using the new DSM version 7?
I have just installed it on my Synology DS920+ and unfortunately I am now unable to install this great package:

It says @fredrike has to remove root privileges from the package :o
And thank you so much for this great package :slight_smile: Merry Christmas


@BeardedConti I don’t know. My motion sensors are detected, but their state is not updated. The connection with MQTT is working, because I see a new connection with the name ‘addon’. But the Zigbee devices are not working anymore, and I don’t know how to debug it. I have no logs. Do you have an idea what could be wrong? Is the CC2531 not being detected? There is no LED on the device, even with disable_led: false in the configuration, which is a bit weird to me.
Do we still need to use https://github.com/zigbee2mqtt/hassio-zigbee2mqtt ?
My zigbee2mqtt logs show nothing when I try to pair a device. I tried to find a debug log, but see nothing.
Just in case, my CC firmware version is: Coordinator firmware version: '{"meta":{"maintrel":2,"majorrel":2,"minorrel":7,"product":2,"revision":20190425,"transportrev":2},"type":"zStack30x"}'
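In case it helps, a rough sketch of how one might check whether DSM sees the CC2531 at all (run over SSH on the NAS; the exact device node is an assumption - the stick usually shows up as a ttyACM device):

    # Look for the serial device node the stick should create.
    ls -l /dev/ttyACM* /dev/ttyUSB* 2>/dev/null
    # Check the kernel log for the USB/ACM attach messages.
    dmesg | grep -i -E 'ttyACM|cc2531|usb' | tail -n 20

Raising the zigbee2mqtt log level to debug in its configuration should also produce pairing output, if the stick is actually being opened.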

Hi!

After many months of researching which software to use for my home automation setup, and finally settling on Home Assistant, your guide has been invaluable to me - thank you Fredrik! :slight_smile:

I tried to send you a PM (in accordance with post #2) for more info on the known issue of the official deCONZ add-on not working, but I just joined this forum and maybe I am blind, but I cannot find how to send a PM here, so I have no choice but to write here instead.

I would greatly appreciate help in this matter.

And a Merry Christmas and Happy New Year to you all!

You can try getting help for this on the Zigbee2MQTT support pages - I know there is a Discord server - as I don’t think this is related to this package.
But I would suggest you check the distance between the sensor and the stick. Also, do you have any additional Zigbee device that is not battery-powered (for example a light bulb or a power switch)?
And yes, for this you need Home Assistant, MQTT and also Zigbee2MQTT - those are required components.
Btw, what firmware version are you using? It looks like Z-Stack 3.0; for the CC2531 I think it’s recommended to go with 1.2.

Yes, but the problem is that it was not working with 1.2; that’s why I migrated to 3.0. I have ordered 3 more CC sticks to test against my installation. I was able to make it work again just by deleting all the previous zigbee2mqtt files and reinstalling everything.

Well, I don’t know how to run this without root privileges.

(If someone finds a fix or a guide on how to migrate packages to DSM 7, I could try to fix it, but I will stay on DSM 6 myself for some time.)


Same here - probably until 7.1 or something close to it. As with DSM 6, it will take some time for things to settle down and for all apps to be converted.
Unfortunately, I don’t see anything new in DSM 7 from a user perspective (except the new photo app) - there are great things for enterprise/company usage, but not for end users.