Well, if it is done properly, the API needs to maintain compatibility for some time, e.g. by creating a new version of the API while still allowing the old one to operate.
Look at HACS. As long as the integration is maintained, it’s usually smooth.
I believe that is already the case. The Python code for all integrations is already there the moment you install HA; it is just executed only when you actually have a specific integration configured. There is also a certain level of backwards compatibility assured by the HA core developers, hence the warnings about deprecated functions, features etc., which only stop working after a while.
The question is whether integrations can be (re)loaded at runtime. That was my point.
If so, then half of this WTH is completed.
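For what it's worth, at the pure Python level, (re)loading a module at runtime is mechanically possible with the standard library. A minimal sketch, using a throwaway module written to a temp directory (all names here are invented for illustration, this is not HA code):

```python
import importlib
import sys
import tempfile
from pathlib import Path

# Write a throwaway "integration" module to a temp directory (illustrative only).
module_dir = Path(tempfile.mkdtemp())
module_file = module_dir / "demo_integration.py"
module_file.write_text("VERSION = '1.0'\n")

sys.path.insert(0, str(module_dir))
importlib.invalidate_caches()  # make sure the new path entry is picked up
demo = importlib.import_module("demo_integration")
print(demo.VERSION)  # 1.0

# Simulate an update on disk, then reload without restarting the process.
module_file.write_text("VERSION = '2.0.1'\n")
demo = importlib.reload(demo)
print(demo.VERSION)  # 2.0.1
```

The hard part, of course, is not the reload call itself but tearing down everything the old module set up (listeners, entities, timers), which is presumably why only some integrations support reload today.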
Some can, some cannot…
This decoupling would be backwards compatibility testing hell.
Pipe dream is right (from my software engineering pov)
Whilst I agree this is likely a pipe dream, not making this happen puts a handbrake on what Home Assistant can achieve in the future. Realistically there is a finite number of integrations that can make it into core in the current monolith, so you are limiting what HA can achieve.
Then what determines what makes it into core?
Shouldn’t core be the most popular integrations, with HACS for the rest?
Shouldn’t core have the integrations from biggest players in the smart home industry?
It would be useful to understand, but I think it boils down to integrations that got in the door early (e.g. Monoprice Blackbird Matrix Switch - Home Assistant, which is a very niche integration) or developers who are willing to spend the effort to get things into core.
I believe the ones that don’t have the time to get into core just publish on HACS, and we’ve seen the HACS model can work.
An example is the lightwave integration: core has v1, which is now very old, but the v2 version is only on HACS (GitHub - bigbadblunt/homeassistant-lightwave2: Lightwave RF custom component for Home Assistant. Requires generation 2 ("Link Plus") hub, but will control both generation 1 ("Connect Series") and generation 2 ("Smart Series") devices.). To me this example shows that core might become less and less relevant, so moving away from the monolith feels like the right way to go, as being in core doesn’t mean being the most up to date.
Not really. Any integration can be made custom and treated like its own package without any API changes. If you don’t want breaking changes, make a replica of the working integration, put it in custom_components in your config directory, and never change it. You now have a ‘separate’ integration that doesn’t change with HA.
This reasoning alone is why it doesn’t even make sense to entertain this WTH. We already have a system that can treat HA’s integrations like separate packages. You’re using it with lightwave v2 on HACS.
Why would anyone bother changing the API to get almost exactly the same functionality? The only net gains from separating this would be the ability to keep an integration running while HA is not, and a decrease in startup time (assuming the integration is running).
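For anyone who hasn’t done the custom_components copy described above, the layout looks roughly like this (using lightwave as the example since it came up; file names are indicative):

```text
config/
└── custom_components/
    └── lightwave/            # copied from homeassistant/components/lightwave
        ├── __init__.py
        ├── manifest.json     # custom integrations must declare a "version" key here
        ├── light.py
        └── ...
```

HA checks custom_components before its built-in integrations, so a copy under the same domain name shadows the core one.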
Isn’t that the point of the WTH?
That is what it means to me.
Most (all?) of the integrations could be moved to HACS, making the core smaller and meaning a core upgrade could happen independently of the integrations?
In my mind core integrations only make sense if they are the biggest most popular or fundamental to the way Home Assistant runs.
95% sure that maxym’s intention is that HA adopts an API similar to openHAB’s, so that each integration is compartmentalized and running separately. Not what you’re saying, which is to simply deprecate all the auxiliary integrations.
Not to speak for anyone else, but separating integration updates from the core is what this WTH means to me. Whether they’re compartmentalized and running separately is irrelevant to the user.
I know making a copy of an integration and keeping it apart from the OS updates can be done now. I’ve done it. It’s not something a new or non-technical user is going to want to do. It also leaves no way to know when the integration has been updated, beyond going out to GitHub and checking regularly. To do this with all the integrations I use would be a nightmare.
I like to think that HA aspires to be more user friendly than all that.
The way I envision this working is similar to the way my cell phone works. In HA, I’d want to be notified when an integration I use was ready to update. I’d update it when I had time to test it in my environment. If the update broke something, I’d simply back it out and wait for a fix.
Currently, I have to wait until I have time to test everything, then update it all at once. If one integration had a problem, I’d have to back out the entire OS, along with every other integration, while I’m waiting for the fix. I’ve done this and it’s very frustrating.
Well, what you’re requesting has nothing to do with this being a monolith, then. You simply want versioned integrations. A monolith can do that, btw.
@maxym it would be interesting to get clarity on that - as that is not how I read the WTH e.g. this quote:
I guess the question is: are we talking about removing most of the integrations from core and then allowing users to install them separately, like HACS, whilst ensuring all integrations implement an API (the same way a plugin system might work)?
Or is this more suggesting something like the HA Supervisor add-on model, where all integrations are independent processes (or containers, or something conceptually similar) that interact with the core via an API and can restart on their own?
I’d be very much in favour of the first option. I agree the second option seems overkill and I don’t see the value.
You have a point, “monolith” might be the wrong word… I’d want a core release not to update my integrations, and I’d want to update my integrations manually as and when the author has an update, without having to wait for a core release (and without getting into the weeds of custom components taken from GitHub forks).
I understand there will be occasions when a core change does mean all integrations have to be changed and that would need to be managed in a similar way to the way deprecation of things is managed now.
Running separately means separated from core entirely. That’s how I read this. See bullet 3.
I get why you’d want a standard API if we were starting from a clean slate. But I’m not sure why that would be necessary. The integrations we have today can already be updated independently as @petro describes. Instead of being in the core, the exact same code would simply be moved to custom_components. HA already looks there first for integrations.
Wouldn’t the only core change needed be a way to notify users that there’s an update, and to copy it down for them? Basically, what HACS already does.
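The “notify on update” half could be as simple as comparing the installed manifest version against the latest published one. A rough sketch (the fetching part is left out, and these function names are made up):

```python
def parse_version(version: str) -> tuple[int, ...]:
    """Turn a dotted version string like '1.10.2' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def update_available(installed: str, latest: str) -> bool:
    """True if the published version is newer than the installed one."""
    return parse_version(latest) > parse_version(installed)

# Tuple comparison gets 1.10 > 1.2 right, where naive string comparison fails.
print(update_available("1.2.0", "1.10.0"))  # True
print(update_available("2.0.0", "2.0.0"))   # False
```

In practice you’d want a proper version library (pre-releases, build metadata, etc.) rather than hand-rolling this, but the core of the check really is that small.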
There may have been better ways to do this in a perfect world, but I’d hate to see some unattainable ideal prevent moving forward with what we’ve got.
IMO you guys are going too deep into technical details. WTH, for obvious reasons, is not a hackathon; it’s a way for users to request changes as seen from a user perspective. Thus I don’t care which way my needs are fulfilled. I don’t care whether a module runs in a separate process or thread, or is included in the main program loop. Obviously some solutions are more suitable than others, but I’m not here to advise on them. It might need a strategic decision from the project owner.
Speaking about “monolith”: currently HA behaves like a monolith. It doesn’t matter that its parts are organized into separate files. Incidentally, the app is not compiled, so there is a possibility to modify its functionality after deployment. Otherwise, though: HA is maintained as a whole and deployed as a whole, and in general single modules cannot be un/re/loaded at runtime - thus, a monolith.
My needs from the OP require:
1. the ability to maintain and deploy modules separately
2. the ability to load/reload modules after deployment, without restarting the core or other non-dependent components
3. Point 1 obviously requires some API with some retention policy. I’m not saying what kind of API - there are several options. I’m also definitely not saying there should be one and only one version of the API.
Again: not trying to say how to achieve that (you guys are trying to match up available solutions to the needs, while none of the existing solutions provides what I’m describing). At this point the needs are what we should focus on; let’s leave the solution to the devs.
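To make the “retention policy” point concrete (purely illustrative - none of these names or version numbers exist in HA): the core could advertise which API versions it still serves, warn on deprecated ones, and refuse anything outside the support window.

```python
# Illustrative sketch of an API retention policy; nothing here is real HA code.
SUPPORTED_API_VERSIONS = {2, 3}   # core still serves API v2 and v3
DEPRECATED_API_VERSIONS = {2}     # v2 still works but is on the way out

def check_module_api(module_name: str, api_version: int) -> str:
    """Decide how the core treats a module built against a given API version."""
    if api_version in DEPRECATED_API_VERSIONS:
        return f"{module_name}: loads, but API v{api_version} is deprecated"
    if api_version in SUPPORTED_API_VERSIONS:
        return f"{module_name}: OK (API v{api_version})"
    return f"{module_name}: refused, API v{api_version} is outside the support window"

print(check_module_api("some_integration", 3))
print(check_module_api("old_integration", 2))
print(check_module_api("ancient_integration", 1))
```

This mirrors how HA already handles deprecations in prose (warn for a few releases, then remove), just expressed at the module-API boundary instead.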
Might not be in line with the WTH, but I recently thought about something like this:
Wouldn’t it be nice if, when a PR is open for a certain component, you could just click on that component in HA (say, a beta/development toggle) and it queries GitHub for open PRs with that component name? That way anyone (even less technical people) could integrate and test a new PR just for the sake of helping out with testing. I don’t know if that is possible in a monolith, or at all, but in my head it sounded nice enough to put it somewhere.
What you’re describing is a plugin architecture, which is very common in commercial software. Applications like Photoshop, Illustrator, AutoCAD, 3ds Max, Maya - they all support plugins in pretty much exactly the way you laid out your three points. There’s backwards compatibility on plugins for a significant number of versions, often over many years (though typically no forward compatibility). A lot of the core functionality of these applications is written as plugins, and of course external third-party ones too.
In my day-to-day job as a software engineer, I work on a large commercial application that is entirely based on plugins. This type of architecture really is an industry standard. I have no idea how to do that in Python (I try to stay away from that terrible language as far as I possibly can), but in C++, creating an extensible modular API that stays backwards compatible is pretty straightforward through pure virtual interfaces. The same is true in Java and C#. So I assume that it should be possible in Python too, and that not doing so is a choice by the HA developers, not a necessity.
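For what it’s worth, Python does have a direct analogue of pure virtual interfaces in its standard abc module. A minimal sketch (interface and class names invented for illustration):

```python
from abc import ABC, abstractmethod

class IntegrationPlugin(ABC):
    """Stand-in for a stable plugin interface (names are invented)."""

    @abstractmethod
    def setup(self) -> bool:
        """Initialize the plugin; return True on success."""

    @abstractmethod
    def teardown(self) -> None:
        """Release everything so the plugin can be unloaded."""

class DemoPlugin(IntegrationPlugin):
    def setup(self) -> bool:
        return True

    def teardown(self) -> None:
        pass

# Instantiating a complete implementation works; instantiating the interface
# itself (or an incomplete implementation) raises TypeError - the same
# guarantee pure virtual functions give you in C++.
plugin = DemoPlugin()
print(plugin.setup())  # True
```

So the enforcement machinery exists; keeping such an interface stable across releases is the organizational commitment, not a language limitation.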
Yes, with the exception that some of those plugin-supporting apps still require a restart of the app (the mentioned Photoshop, for example, unless that changed in recent years). There are obviously other applications which are able to load and run external modules at runtime (DAWs loading instruments and effects).
The ability to add/load/reload modules at runtime is one of the crucial needs of my OP. As you said, it’s not rocket science (based on similar experience with other languages to yours).
thank you