I would like to enhance the nanoleaf integration to allow HA to register Touch events on the new Canvas tiles. There are two interfaces for this: One is a REST interface where events are streamed over an open GET request, and the other is a low-latency UDP interface (I am currently trying to figure out from the devs whether this interface really exists).
I have not developed an HA component previously (although I have some private branches of existing components with added functionality). What I’m looking for, if someone has advice, is the following: an already-implemented component that correctly integrates some kind of event stream into HA, preferably something simple that reflects the “correct” underlying design principles, just so I can see how the various objects are registered.
I integrated all the Nanoleaf features in HA using Node-RED, and I read the whole Nanoleaf API documentation; I couldn’t find anything on touch. I don’t think it’s possible to listen to touch events. For this to work, the tiles would need to support some kind of push feature to send info over the network. This could work with UDP multicast features but nanoleaf does not support such a thing. At least I can’t find anything about it. So for now you are stuck with telling the tiles what to do. I know Nanoleaf has very active devs, so I wouldn’t be surprised if it gets added later on.
Developers can register and listen to events from Nanoleaf Devices using Server-Sent Events (SSE).
| Event Type | Event Type Id |
|------------|---------------|
| State      | 1             |
| Layout     | 2             |
| Effects    | 3             |
| Touch      | 4             |
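The docs expose these as an SSE stream (as I read the Open API docs, a GET to `/api/v1/<token>/events?id=4` for touch; verify the path against your firmware). A minimal sketch of parsing the `id:`/`data:` lines of such a stream, run here against a canned payload rather than a live controller:

```python
import json

def parse_sse(stream_text):
    """Yield (event_id, data) tuples from raw SSE text."""
    event_id, data_lines = None, []
    for line in stream_text.splitlines():
        if line.startswith("id:"):
            event_id = int(line[3:].strip())
        elif line.startswith("data:"):
            data_lines.append(line[5:].strip())
        elif line == "" and data_lines:  # blank line terminates one event
            yield event_id, json.loads("\n".join(data_lines))
            event_id, data_lines = None, []

# Example payload shaped like a touch event (id 4); panelId/gesture
# values are made up for the demo
sample = 'id: 4\ndata: {"events":[{"panelId":1234,"gesture":0}]}\n\n'
for evt_id, payload in parse_sse(sample):
    print(evt_id, payload["events"][0]["panelId"])  # 4 1234
```

In a real component you would feed the chunks of the long-lived GET response into a parser like this instead of a fixed string.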
This could work with UDP multicast features but nanoleaf does not support such a thing.
3.5.2.4.2. Touch Stream Data
As of firmware version 1.4.0+, clients can optionally register for a fine-resolution, low-latency touch data stream from the Canvas controller over a UDP socket, provided the client has also registered for Touch Events.
Not really sure what you are saying. Just wanted to say that there is already a way to get the IDs of all panels separately, just by sending a GET request. Once you have these IDs you can do everything you want per tile. The problem is that I don’t have the knowledge to write an integration for this, so I’ve been commanding these tiles for about half a year now using REST commands. But that’s not ideal. For someone who knows how to write integrations, it would be pretty easy to get all the panels into Home Assistant separately.
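The GET the post above refers to is the layout call (as I read the Open API docs, `/api/v1/<token>/panelLayout/layout`; the IP and token below are placeholders). A sketch of pulling the panel IDs out of that response, demonstrated on a trimmed example payload instead of a live request:

```python
import json

# Placeholder endpoint; substitute your controller's IP and auth token
LAYOUT_URL = "http://<canvas-ip>:16021/api/v1/<token>/panelLayout/layout"

def panel_ids(layout):
    """Return the panelId of every tile in a layout response."""
    return [panel["panelId"] for panel in layout["positionData"]]

# Example response, trimmed to the fields we use; IDs are made up
layout = json.loads("""{
  "numPanels": 2,
  "positionData": [
    {"panelId": 1234, "x": 0,   "y": 0, "o": 0},
    {"panelId": 5678, "x": 100, "y": 0, "o": 60}
  ]
}""")
print(panel_ids(layout))  # [1234, 5678]
```

An integration would do this once at setup and create one light entity per returned ID.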
And with a script like this one I can command every tile separately, just by giving the REST command the panel number I have set above, with the corresponding IDs.
So making an integration which reads out the IDs with a GET request and then exposes them all as separate lights in Home Assistant, using their IDs and controlling them with an RGB value, should be a piece of cake… at least when you know how to write an integration. With more advanced code you could even program animations using the REST command above. But that’s way over my head.
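Per-tile RGB control goes through the effects “write” command with a static animation, as I understand the Open API docs: the animData string is numPanels, then per panel its panelId, a frame count, and R G B W plus a transition time per frame. A sketch of building that request body (endpoint and field layout should be checked against the docs; the panel IDs are made up):

```python
def static_anim_data(colors):
    """colors: {panelId: (r, g, b)} -> animData string, one frame per panel."""
    parts = [str(len(colors))]
    for panel_id, (r, g, b) in colors.items():
        # panelId, numFrames=1, then the single frame: R G B W transitionTime
        parts += [str(panel_id), "1", str(r), str(g), str(b), "0", "1"]
    return " ".join(parts)

# Body for a PUT to /api/v1/<token>/effects (placeholder endpoint):
# paint panel 1234 red and panel 5678 blue
body = {
    "write": {
        "command": "display",
        "animType": "static",
        "animData": static_anim_data({1234: (255, 0, 0), 5678: (0, 0, 255)}),
        "loop": False,
        "palette": [],
    }
}
print(body["write"]["animData"])  # "2 1234 1 255 0 0 0 1 5678 1 0 0 255 0 1"
```

Animations would use the same write command with more frames per panel, which is presumably the “more advanced code” part.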
The above solution has been possible since I started using the Nanoleaf tiles. We don’t need an update to control them separately; we need someone who can do the work to put the above into the integration.
Just so all of you know, this is actually being worked on right now (not by me). The code is not in shape to be tested by someone who isn’t going to be adjusting it in real time, but it is almost there. I am planning on testing it this afternoon if I get a chance.
Notice it created two of the same event… that is probably a firmware thing, or it “lost” my skin during the swipe. Either way, it is working and should be part of an update in the future.
Depending on your version, the nanoleaf library is different, and the integration itself is worked on by multiple people. What version of HA are you running, and what is the problem?
Ah, I was already running 2021.10.2 when I started testing touch events, so I never ran into it. The touch code actually does NOT contain the fix, but I had backported any diffs to the dependency by hand (it was small).
The current version is 2021.10.6; 2021.10.2 came out 10 days ago and did contain the fix.
And yes, the touch code in dev does use events; the response in HA is pretty much realtime (I also noticed the lag was up to 10 s previously).