Integrating Nanoleaf Canvas Touch events into HA

I would like to enhance the nanoleaf integration to allow HA to register Touch events on the new Canvas tiles. There are two interfaces for this: One is a REST interface where events are streamed over an open GET request, and the other is a low-latency UDP interface (I am currently trying to figure out from the devs whether this interface really exists).

I have not developed an HA component previously (although I have some private branches of existing components with added functionality). What I’m looking for, if someone has advice, is the following: an already-implemented component that correctly integrates some kind of event stream into HA, preferably something simple that reflects the “correct” underlying design principles, just so I can see how the various objects are registered.

Thanks!


I’m buying a starter kit of 25 CANVAS tiles. The delivery is today… Contact me, I’m interested too.

This would be awesome. Any luck so far?

I integrated all of the Nanoleaf features into HA using Node-RED, and after reading the whole Nanoleaf API documentation I couldn’t find anything on touch. I don’t think it’s possible to listen to touch events. For that to work, the tiles would need to support some kind of push feature to send info over the network. It could work with UDP multicast, but Nanoleaf does not support such a thing; at least I can’t find anything about it. So for now you are stuck with telling the tiles what to do. I know Nanoleaf has very active devs, so I wouldn’t be surprised if it is added later on.

After reading the whole Nanoleaf API documentation I couldn’t find anything on touch. I don’t think it’s possible to listen to touch events.

@sygys Nanoleaf Developers Forum

Developers can register and listen to events from Nanoleaf Devices using Server-Sent Events (SSE).

Event Type    Event Type Id
State         1
Layout        2
Effects       3
Touch         4
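Subscribing to these events can be sketched with a plain SSE client over an open GET request, as the docs describe. The `/events` endpoint and its `id` query parameter follow the developer docs quoted above; the controller address and token are placeholders taken from elsewhere in this thread, not verified values:

```python
import json
import urllib.request

NANOLEAF = "http://192.168.178.86:16021"  # placeholder controller address
TOKEN = "***token***"                     # placeholder auth token

def parse_sse_data(line: str):
    """Return the decoded JSON payload of an SSE 'data:' line, else None."""
    if line.startswith("data:"):
        return json.loads(line[len("data:"):].strip())
    return None

def touch_events():
    """Yield touch events (Event Type Id 4) from the open GET stream."""
    url = f"{NANOLEAF}/api/v1/{TOKEN}/events?id=4"
    with urllib.request.urlopen(url) as resp:  # connection stays open; frames stream in
        for raw in resp:
            payload = parse_sse_data(raw.decode("utf-8").rstrip("\r\n"))
            if payload is not None:
                yield payload
```

SSE frames arrive as `id:` and `data:` lines separated by blanks, so only the `data:` lines carry the JSON payload.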

It could work with UDP multicast, but Nanoleaf does not support such a thing.

3.5.2.4.2. Touch Stream Data

As of firmware version 1.4.0+, clients will be able to optionally register for fine resolution, low latency, touch data stream from the canvas controller over a UDP socket, if the client has also registered for Touch Events.
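The docs excerpt above names the UDP stream but doesn’t quote the wire format, so the layout below (big-endian: a 2-byte panel count, then per panel a 2-byte panel id, one byte packing touch type and strength, and a 2-byte source panel id for swipes) is my assumption and should be verified against section 3.5.2.4.2 before relying on it. How the client announces its UDP port when registering for Touch Events is also outside this sketch:

```python
import socket
import struct

def parse_touch_packet(data: bytes):
    """Decode one UDP touch frame into per-panel touch records (assumed layout)."""
    (count,) = struct.unpack_from(">H", data, 0)  # 2-byte panel count, big-endian
    records = []
    offset = 2
    for _ in range(count):
        panel_id, packed, from_panel = struct.unpack_from(">HBH", data, offset)
        records.append({
            "panel_id": panel_id,
            "touch_type": packed >> 4,    # assumed: touch type in the high nibble
            "strength": packed & 0x0F,    # assumed: strength in the low nibble
            "swiped_from": from_panel,    # assumed: source panel id for swipes
        })
        offset += 5
    return records

def listen(port: int = 35508):  # port number is a placeholder
    """Receive and print touch frames on the registered UDP port."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, _addr = sock.recvfrom(1024)
        print(parse_touch_packet(data))
```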

Not really sure what you are saying. Just wanted to say that there is already a way to get the ids of all panels separately, just by sending a GET request. Once you have these ids you can do everything you want per tile. The problem is that I don’t have the knowledge to write an integration for this, so I’ve been commanding these tiles for about half a year now using REST commands. But that’s not ideal. For someone who knows how to write integrations it would be pretty easy to get all the panels into Home Assistant separately.
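The GET request mentioned above would hit the `panelLayout/layout` endpoint, whose response lists each panel under `positionData`. The address and token below are placeholders from this thread, and the response shape is my reading of the Nanoleaf developer docs:

```python
import json
import urllib.request

NANOLEAF = "http://192.168.178.86:16021"  # placeholder controller address
TOKEN = "***token***"                     # placeholder auth token

def panel_ids(layout: dict):
    """Pull the per-panel ids out of a panelLayout/layout response."""
    return [panel["panelId"] for panel in layout["positionData"]]

def fetch_panel_ids():
    """Query the controller and return the list of panel ids."""
    url = f"{NANOLEAF}/api/v1/{TOKEN}/panelLayout/layout"
    with urllib.request.urlopen(url) as resp:
        return panel_ids(json.load(resp))
```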

At this moment I have this REST command:

rest_command:
  nanoleaf:
    url: http://192.168.178.86:16021/api/v1/***token***/effects
    method: PUT
    payload: >
      { "write" : {"command": "display", "animType": "static", "animData":
      {%- set all = [ 36776, 40379, 39889, 47065, 62644, 56943, 44929, 4212, 50749, 42705, 55753, 9801, 2975, 5181, 59276, 33852, 34713, 42845 ] %}
      {%- set ns = namespace(panels=[panels | count | string]) %}
      {%- for panel in panels %}
      {%- set ns.panels = ns.panels + [ '{} 1 {} {} {} 0 20'.format(all[panel.number-1], panel.r, panel.g, panel.b) ] %}
      {%- endfor %}
      "{{ ns.panels | join(' ') }}",
      "loop": false, "palette": [], "colorType": "HSB"}
      }
    content_type: 'application/json'
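The Jinja template above assembles a static `animData` string; per the Nanoleaf effects docs this is the panel count followed, for each panel, by `<panelId> <numFrames> <R> <G> <B> <W> <transitionTime>`. The same string can be built in plain Python, which may be easier to follow (the function name and argument shape are my own, not from this thread):

```python
def build_anim_data(colors: dict, transition: int = 20) -> str:
    """Build a static animData string; colors maps panelId -> (r, g, b).

    Each panel gets one frame: panelId, 1 frame, R G B, W=0, transition time.
    """
    parts = [str(len(colors))]  # leading panel count
    for panel_id, (r, g, b) in colors.items():
        parts.append(f"{panel_id} 1 {r} {g} {b} 0 {transition}")
    return " ".join(parts)
```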

And with a script like this one I can command every tile separately, just by giving the REST command the panel numbers I have set above with the corresponding ids.

nanoleaf_deurbel:
  sequence:
  - repeat:
      count: 2
      sequence:
      - service: rest_command.nanoleaf
        data:
          panels:
          - number: 1
            r: 0
            g: 150
            b: 0
          - number: 2
            r: 0
            g: 150
            b: 0
          - number: 3
            r: 0
            g: 150
            b: 0
          - number: 4
            r: 0
            g: 150
            b: 0
          - number: 5
            r: 0
            g: 150
            b: 0
          - number: 6
            r: 0
            g: 150
            b: 0
          - number: 7
            r: 0
            g: 150
            b: 0
          - number: 8
            r: 0
            g: 150
            b: 0
          - number: 9
            r: 0
            g: 150
            b: 0
          - number: 10
            r: 0
            g: 150
            b: 0
          - number: 11
            r: 0
            g: 150
            b: 0
          - number: 12
            r: 0
            g: 150
            b: 0
          - number: 13
            r: 0
            g: 150
            b: 0
          - number: 14
            r: 0
            g: 150
            b: 0
          - number: 15
            r: 0
            g: 150
            b: 0
          - number: 16
            r: 0
            g: 150
            b: 0
          - number: 17
            r: 0
            g: 150
            b: 0
          - number: 18
            r: 0
            g: 150
            b: 0

So making an integration that fetches the ids with a GET request and then exposes them all as separate lights in Home Assistant, controlling them with RGB values, should be a piece of cake… at least when you know how to write an integration. With more advanced code you can even program animations using the above REST command. But that’s way over my head.

The above solution has been possible since I started using the Nanoleaf tiles. We don’t need an update to control them separately; we need someone who can do the work to put this into the integration.

@sygys I’m posting about the ability to get touch events (though not only those) from Nanoleaf. But it requires some coding.

I never used touch on the Nanoleaf tiles. Everything is automatic here in our home; I barely touch anything.

Just so all of you know, this is actually being worked on right now (not by me). The code is not yet in shape to be tested by someone who won’t be adjusting it in real time, but it is almost there. I am planning to test it this afternoon if I get a chance.

Testing is going well: I can see events for swipes and taps, as well as the individual tile touch states.

Event 46 fired 7:02 PM:

{
    "event_type": "nanoleaf_event",
    "data": {
        "device_id": "S20280H0812",
        "type": "touch",
        "panel_id": null,
        "gesture": "Swipe Down",
        "swipe_to_panel_id": null
    },
    "origin": "LOCAL",
    "time_fired": "2021-10-18T00:02:58.739135+00:00",
    "context": {
        "id": "7347eefdce4731d21b38e671a8ef1a4c",
        "parent_id": null,
        "user_id": null
    }
}
Event 38 fired 7:02 PM:

{
    "event_type": "nanoleaf_event",
    "data": {
        "device_id": "S20280H0812",
        "type": "touch",
        "panel_id": 14957,
        "gesture": "Single Tap",
        "swipe_to_panel_id": null
    },
    "origin": "LOCAL",
    "time_fired": "2021-10-18T00:02:45.773300+00:00",
    "context": {
        "id": "e4c217f97228d9fa597404ff249769ec",
        "parent_id": null,
        "user_id": null
    }
}

And it is pretty easy to parse the event trigger to perform actions; the example for this test was to create a persistent notification.
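For anyone wanting to try the same once this lands, an event-triggered automation along these lines should work; the alias and message are made up, but the trigger fields match the `nanoleaf_event` payloads shown above:

```yaml
automation:
  - alias: "Nanoleaf single tap notification"
    trigger:
      - platform: event
        event_type: nanoleaf_event
        event_data:
          type: touch
          gesture: "Single Tap"
    action:
      - service: persistent_notification.create
        data:
          message: "Panel {{ trigger.event.data.panel_id }} was tapped"
```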

Notice it created two of the same event… that is probably a firmware thing, or it “lost” my skin during the swipe. Either way, it is working and should be part of an update in the future.

@richieframe By whom? I have a bit of a problem with the current Nanoleaf integration.

Depending on your version the Nanoleaf library is different, and the integration itself is worked on by multiple people. What version of HA are you running, and what is the problem?

@richieframe The latest HA (2021.10.0). The problem is https://github.com/home-assistant/core/issues/57943; the author responded that it should be fixed in the next release. Waiting for 2021.10.2. I also asked him about plans to support event streams (https://github.com/milanmeu/aionanoleaf/issues/1) but have not gotten a response yet.

Ah, I was already running 2021.10.2 when I started testing touch events, so I never ran into it. The touch code actually does NOT contain the fix, but I had backported any diffs to the dependency by hand (it was small).

The current version is 2021.10.6; 2021.10.2 came out 10 days ago and did contain the fix.

And yes, the touch code in dev does use events; the response in HA is pretty much realtime (I did also notice the lag was up to 10s previously).