Semantic Modeling (Tags, Nested Areas, Nested Groups) in action

After trying multiple other major local home automation platforms, I am now much more convinced that Home Assistant fits my needs best, but one feature I truly liked from another competitor is called semantic modeling. Their implementation was too cumbersome to be practically usable for me, but the concept was great.

In a nutshell, this is a way to OPTIONALLY organize the hundreds or thousands of entities we have. The organization/semantic model is then used to automatically create the UI and to simplify automation targets.

In the big picture, this requires the ability to nest groups/areas/devices/tags.

Unsurprisingly, the concept has already been suggested in different feature requests on this forum as well:

The key for me, though, is a full system that utilizes such organization in the UI and in automation implementation, for simplicity and elegance.

In an ideal world, here is how I think a proper, user-friendly semantic model could be implemented.

Nested Groups
Tags, areas and groups are, as I see it, all the same concept under different names. Since the group is the oldest of these features in Home Assistant (as far as I can tell), I assume they are all groups. In fact, I see a device as a group too.

  • Area: a group without input/output of its own
  • Tag: nothing but a group
  • Device: a group containing entities from the same origin

So I think the missing feature here is simply the ability to organize groups hierarchically.
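As a minimal sketch of the "everything is a group" idea (plain Python with invented names, not actual Home Assistant internals), a single tree node type could represent areas, tags, devices and groups alike:

```python
# Hypothetical sketch: one node type for areas, tags, devices and groups.
from dataclasses import dataclass, field

@dataclass
class Group:
    name: str
    kind: str = "group"                              # "area", "tag", "device", or plain "group"
    entities: list = field(default_factory=list)     # entity_ids attached directly here
    children: list = field(default_factory=list)     # nested groups of any kind

    def all_entities(self):
        """Collect entity_ids from this node and everything below it."""
        found = list(self.entities)
        for child in self.children:
            found.extend(child.all_entities())
        return found

# The area-driven example from this post, expressed as nested groups:
living_room = Group("Living Room", kind="area", children=[
    Group("Speakers", entities=["media_player.sonos_arc", "media_player.echo_dot"]),
    Group("Lights", entities=["light.window_side", "light.wall_side"]),
])
ground_floor = Group("Ground Floor", kind="area",
                     children=[living_room, Group("Kitchen", kind="area")])
```

With this shape, "everything under Ground Floor" is just `ground_floor.all_entities()`, regardless of whether the intermediate nodes are areas, devices or plain groups.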

Setup

A menu/setup screen where one can drag and drop each node to create a hierarchical grouping; whether it is organized by areas, by device-type groups, or by anything else is one's own choice.

Example:
Area driven organization

  • Ground Floor
    • Living room
      • Speakers
        • Sonos Arc
        • Echo Dot
      • Lights
        • Window Side
        • Wall Side
    • Kitchen

The key here is that hierarchy membership is not exclusive, i.e. it is tag-based, because one may want a device-type organization instead of, or in addition to, the area-based one.

Device type driven organization

  • Audio Devices
    • Sonos Speakers
      • Sonos Arc
    • Amazon Echos
      • Echo Dot
  • Lights
    • Living Room
      • Window Side
      • Wall Side

In the above case, I can drag and drop Speakers, Sonos Arc, etc. into Kitchen.

One can create groups (e.g. Speakers) and areas directly on this screen. Multiple organizations/semantic models are acceptable.
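The non-exclusive, tag-based membership above could be sketched like this (invented names, not HA's real data model): each entity simply carries one tag path per organization it belongs to.

```python
# Hypothetical sketch: tag-based membership lets one entity live in
# several organizations at once.
entity_tags = {
    "media_player.sonos_arc": {
        ("Ground Floor", "Living Room", "Speakers"),   # area-driven path
        ("Audio Devices", "Sonos Speakers"),           # device-type-driven path
    },
    "media_player.echo_dot": {
        ("Ground Floor", "Living Room", "Speakers"),
        ("Audio Devices", "Amazon Echos"),
    },
}

def is_under(entity_id, node_path):
    """True if any of the entity's tag paths starts with node_path."""
    paths = entity_tags.get(entity_id, set())
    return any(p[:len(node_path)] == node_path for p in paths)
```

Dragging Sonos Arc into Kitchen would then just rewrite one tag path, without touching the device-type organization at all.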

UI

There are many ways UI can be implemented using semantic modeling data.

Filter-driven Lovelace Menus/Tabs/Cards

When creating tabs, cards or menus, one can optionally select a semantic model node. Then everything under that node will be displayed accordingly.

For example,

New tab: I choose Ground Floor as the semantic context; then all entities/groups/areas under Living Room and Kitchen will be auto-populated on the page.

If instead I set the semantic context to Living Room, then the view starts from there.

Basically, the system uses the hierarchy, but the user decides where to start and how to display things per one's own preference. I can only imagine the strong HA community would continue to create custom display styles on top of semantic modeling.
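A sketch of the filter-driven population (again with invented names, not a real Lovelace API): a tab or card declares a semantic context, and every entity whose tag path falls under that context shows up automatically.

```python
# Hypothetical sketch: auto-populate a tab/card from a semantic context.
entity_tags = {
    "media_player.sonos_arc": ("Ground Floor", "Living Room", "Speakers"),
    "media_player.echo_dot":  ("Ground Floor", "Living Room", "Speakers"),
    "light.window_side":      ("Ground Floor", "Living Room", "Lights"),
    "light.wall_side":        ("Ground Floor", "Living Room", "Lights"),
    "light.counter":          ("Ground Floor", "Kitchen"),
}

def populate(context):
    """Entities to render for a tab/card whose semantic context is `context`."""
    return sorted(entity for entity, path in entity_tags.items()
                  if path[:len(context)] == context)

# A "Ground Floor" tab would show all five entities above;
# a "Living Room" tab would simply start one level deeper.
```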

Automation Target

Here the semantic model context looks at the greatest common factor of the capabilities under each node, and that represents the node's target properties.

Example:
Using the example above, if I choose the area-based semantic model's Living Room > Speakers as the target, then volume, TTS, play, etc. can be set/manipulated, and both the Echo and the Sonos respond.

On the other hand, if I choose Living Room as the target, lights and speakers share only on/off, so that is the only element one can set.
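The "greatest common factor" rule amounts to a set intersection over the members' capabilities. A sketch (capability names are illustrative, not HA's real service names):

```python
# Hypothetical sketch: a node's automation target exposes only the
# intersection of its members' capabilities.
capabilities = {
    "media_player.sonos_arc": {"turn_on", "turn_off", "volume_set", "play_media", "tts"},
    "media_player.echo_dot":  {"turn_on", "turn_off", "volume_set", "play_media", "tts"},
    "light.window_side":      {"turn_on", "turn_off", "brightness"},
    "light.wall_side":        {"turn_on", "turn_off", "brightness"},
}

def common_services(entity_ids):
    """Greatest common factor: services every member supports."""
    sets = [capabilities[e] for e in entity_ids]
    return set.intersection(*sets) if sets else set()

speakers = ["media_player.sonos_arc", "media_player.echo_dot"]
living_room = speakers + ["light.window_side", "light.wall_side"]
# Targeting Speakers keeps the full media controls available;
# targeting Living Room leaves only turn_on/turn_off.
```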

I know I am asking for drastic functionality here, but given the rapid development pace of Home Assistant, especially the user-friendliness of its UI, I can only hope that someday…