The Robot Roadmap: Integrating & Automating Robots in Home Assistant

Sensors, lights, and media players are the backbone of most smart homes, but they share one major limitation: they are stationary. Robots break this barrier. They move, they react, and they navigate our physical space—so why limit a vacuum to just being a vacuum?

What started as a small DIY project involving an ESPHome-controlled robot has evolved into a comprehensive journey. From integrating mobile voice assistants to mastering autonomous navigation, I’ve gathered lessons and ideas that I want to share with the community.

The Roadmap - What we will cover
This series aims to provide a full overview of:

  • Unified Integration: Connecting both commercial vacuums and custom DIY bots to Home Assistant.
  • Precision Navigation: How to direct robots to specific coordinates or rooms with direct local control.
  • Lovelace Mastery: Visualizing maps and controlling fleets within the UI.
  • Agentic Autonomy: Giving robots “context”—allowing them to communicate, stand out, and manage their own states.
  • Extended Utility: Using mobile robots as roaming media players, dynamic charging stations, mobile sensors, and more.
  • Architecture & Best Practices: The “Go-To” setups to ensure reliability and local control.

Community & Collaboration
This is a living project. Please share your own know-how, ask questions, or post your unique builds. Let’s push the boundaries of what “smart” means when it starts moving.

Quick index (will be maintained and updated)

Credits & Inspiration
A huge thanks to the projects and posts that made this possible:


The initial post showcasing a DIY prototype of a robot integrated into Home Assistant using ESPHome

:robot: HomieBot – ESPHome-Powered Robot Prototype

Here’s a quick proof-of-concept demo of HomieBot, a robot prototype integrated into Home Assistant via ESPHome.

In this showcase, HomieBot (built on a motor controller + ESP8266 D1 mini) responds to Home Assistant UI controls, moving forward, backward, and rotating left/right through a cover entity and an input_number.

The current setup runs on wired power, but the long-term vision is a fully modular robot capable of autonomous room navigation, executing tasks, and collecting sensor data — all seamlessly managed from within Home Assistant.

:point_right: What do you think about this?

[Demo GIF: HomieBot V1 proof of concept]


The Foundation: Connecting Commercial Vacuums and Unifying Coordinate Systems


The Shift to “Prosumer” Hardware

After my initial DIY attempts, I decided to take a different route. I picked up three “broken” Roborock S50s online for just €30 each. It turned out that two of them were easily fixable and are still running today, while the third serves as a spare parts donor and a “lab rat” for hardware teardowns. I named the working pair Rob and WALL-E. You’ll see WALL-E as a reference in many of the following code examples.

Why Pivot from a Pure DIY Robot?
I haven’t stopped the DIY project yet; since the showcase post, I’ve upgraded it with a stable base, gyroscopes, accelerometers, and ToF sensors. However, I reached a point where I asked myself: why reinvent the wheel? Mass-produced vacuums already have high-quality chassis and, more importantly, they have solved the most difficult part: SLAM (Simultaneous Localization and Mapping). Trying to run reliable SLAM via lidar on an ESP32 is a massive undertaking. By using existing vacuums, I could skip the hardware struggle and focus on the intelligence.

First Steps: Integration & Navigation
Initially, I used the Xiaomi Home integration and the Xiaomi Cloud Map Extractor (HACS). This worked surprisingly well, especially since the login process has become much more reliable. Combined with the Lovelace Vacuum Map Card, I hit my first milestone: easy navigation.
Using the `vacuum_goto` template inside the Lovelace card, I could send the vacuum to any point on the map. For those new to this card, it’s incredibly powerful:

  • It supports Xiaomi, Roborock, Valetudo, Roomba, and more.
  • You can click the map in edit mode to see and copy precise coordinates.
  • You can drag-and-drop zones for immediate cleaning.
  • And you can preconfigure and clean specific rooms with just two clicks.

Here is a snippet of my Lovelace configuration:

type: custom:xiaomi-vacuum-map-card
map_source:
  camera: image.wall_e_live_map
calibration_source:
  camera: true
entity: vacuum.wall_e
vacuum_platform: Xiaomi Miio
map_modes:
  - template: vacuum_clean_zone_predefined
    predefined_selections:
      - zones:
          - - 22088
            - 18764
            - 26035
            - 21916
        icon:
          name: mdi:desktop-classic
          x: 24127
          "y": 20218
      [...]
  - template: vacuum_clean_zone
  - template: vacuum_goto

Going Local with Valetudo
I eventually switched to Valetudo. Long-term, I want my robots to be independent of the cloud. Flashing the firmware took some trial and error, but having the vacuum run entirely via MQTT (plus using the MQTT Vacuum Camera integration) is worth it.
If you’re interested in more details about this specific setup, let me know and I can provide a deeper breakdown of how I did it.
That said, there are already several excellent guides and videos available online that explain the general process very well. Also, since I used the same vacuum model three times, my setup is quite consistent — your approach may differ depending on the brand and model of your vacuum.

The Next Challenge: One Map, Many Robots, and Laziness
With two robots, I didn’t want to maintain separate Points of Interest (POIs) for each. Duplicating code and coordinates is a maintenance nightmare.
The Solution: A unified coordinate system using YAML and Jinja templates.

My setup uses a homemodel.json (for room data) and a robot_offset.json (for individual robot calibration). These are synced into HA using the Folder Watcher integration, two trigger-based template sensors, and an automation to handle file updates.

The Home Model (homemodel.json)
I defined each room and its POIs inside. I included extra metadata like “adjacent rooms” and “inventory,” which will be very useful for future Assist Use Cases with LLMs.

{
    "office": {
        "id": "office",
        "name": "Office",
        "coordinates" : [-1028.11, 475.14, -1066.18, -2.58],
        "adjacent" : ["floor"],
        "inventory" : ["The printer", "The desktop PC", "WALL-E's charging station"],
        "pois": {
            "In front of the table": [-618.64,260.8],
            "Under the table": [-720.89,348.14],
            "Charging station": [-214.03,148.52]
        }
    },
    [...]
}

How did I determine these positions?
To create a universal coordinate system that works for any robot, I followed these steps:

  1. Map Normalization: I took the map from one of my robots and normalized it so that all points fall within a range of -1000 to +1000 for each coordinate axis.
  2. Defining the Origin: I designated the center of my map as [0,0] and the top-left corner as [-1000, -1000].
  3. Configuration: Based on these reference points, I was able to build the robot_offset.json for my specific hardware.
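The normalization in steps 1 and 2 can be sketched in a few lines of Python. The bounding-box values below are illustrative, not my real map; the helper name is mine:

```python
def normalize(x: float, y: float, bounds: tuple) -> tuple:
    """Map a native map coordinate into the universal -1000..+1000 range,
    so the map centre becomes [0, 0] and the top-left corner [-1000, -1000].
    `bounds` is (x_min, y_min, x_max, y_max) of the native map."""
    x_min, y_min, x_max, y_max = bounds
    # Centre of the native map -> universal origin [0, 0]
    cx, cy = (x_min + x_max) / 2, (y_min + y_max) / 2
    # Half the map extent maps onto 1000 universal units per axis
    nx = (x - cx) / ((x_max - x_min) / 2) * 1000
    ny = (y - cy) / ((y_max - y_min) / 2) * 1000
    return round(nx, 2), round(ny, 2)

# Illustrative native bounds (not my real map):
bounds = (20000, 20000, 30000, 28000)
print(normalize(25000, 24000, bounds))  # centre -> (0.0, 0.0)
print(normalize(20000, 20000, bounds))  # top-left -> (-1000.0, -1000.0)
```

Note that each axis is scaled independently, so the aspect ratio of the native map does not matter.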

The Logic behind Offset and Scale
By comparing the robot’s “native” coordinates with my “normalized” ones, I calculated an offset and a scale factor for both the X and Y axes:

  • The Offset: This represents the raw coordinate value the robot reports when it is physically standing at my defined center [0,0].
  • The Scale Factor: This is used to align the distances between points. It scales the coordinates to a second Point of Interest (POI) after the corresponding offset has already been subtracted.

This mathematical bridge allows me to store a POI once (e.g., “Under the table” at [-720, 348]) and have every robot translate it into its own specific coordinate system.
{
    "wall_e": {
        "offset_x": 2722,
        "offset_y": 2301,
        "scale_x": 0.47,
        "scale_y": 0.665
    },
    [...]
}
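To make the bridge concrete, here is a small Python sketch (the helper names are mine, not part of any integration) that derives a robot’s offset and scale from two reference measurements and then applies the same forward transform the Jinja templates use:

```python
def calibrate(origin_native, poi_norm, poi_native):
    """Derive offset and scale for one robot.
    origin_native: what the robot reports while standing at universal [0, 0].
    poi_norm / poi_native: one extra reference POI in both coordinate systems."""
    off_x, off_y = origin_native
    # Scale aligns distances: compare the native POI (offset removed)
    # against the same POI in universal coordinates.
    scale_x = (poi_native[0] - off_x) / poi_norm[0]
    scale_y = (poi_native[1] - off_y) / poi_norm[1]
    return {"offset_x": off_x, "offset_y": off_y,
            "scale_x": round(scale_x, 3), "scale_y": round(scale_y, 3)}

def to_robot(x_norm, y_norm, cal):
    """Universal -> robot-native: the forward transform used in the templates."""
    return (x_norm * cal["scale_x"] + cal["offset_x"],
            y_norm * cal["scale_y"] + cal["offset_y"])

wall_e = {"offset_x": 2722, "offset_y": 2301, "scale_x": 0.47, "scale_y": 0.665}
print(to_robot(-720.89, 348.14, wall_e))  # "Under the table" in WALL-E's frame
```

Calibrating a new robot therefore only takes two measurements: park it on the universal origin, then drive it to one known POI and read off its native coordinates.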

Keeping Data in Sync
I use trigger-based template sensors to load the shown JSON data. To avoid manual updates, a Folder Watcher automation detects when I save the JSON files and fires an event to refresh the sensors.
Since I’m admittedly a bit lazy and didn’t want to update the template sensor every single time I added a new attribute, I decided to just create one attribute that contains the entire JSON payload.
It keeps things flexible and saves me from constantly tweaking the config.
Of course, if you prefer a cleaner or more granular setup, you can absolutely split it into separate sensors — for example one per room or per vacuum — depending on how structured you want your entities to be.

- triggers:
    - trigger: event
      event_type: homemodel_updated
  sensor:
    - name: "Homemodel"
      unique_id: homemodel
      state: "{{ now().isoformat() }}"
      attributes:
        data: "{{ trigger.event.data.json | default({}) | tojson }}"

- triggers:
    - trigger: event
      event_type: robot_offsets_updated
  sensor:
    - name: "Robot Offsets"
      unique_id: robot_offsets
      state: "{{ now().isoformat() }}"
      attributes:
        data: "{{ trigger.event.data.json | default({}) | tojson }}"

Using the Folder Watcher entity, I created a short automation that runs on each file change, parses the JSON, and fires the matching event so the sensors above pick up the new data.

alias: Handling JSON file updates
triggers:
  - trigger: state
    entity_id:
      - event.folder_watcher_config_homeplan  # the Folder Watcher entity created above
actions:
  - variables:
      file_path: "{{ trigger.to_state.attributes.path }}"  # the path of the changed JSON file
      event_type: "{{ trigger.to_state.attributes.file | replace('.json', '_updated') }}"  # my naming convention: filename + '_updated'
  - action: file.read_file
    data:
      file_name: "{{ file_path }}"
      file_encoding: JSON
    response_variable: json_file
  - choose:  # event types cannot be templated in event triggers at the moment, so I use a choose branch per event
      - conditions:
          - condition: template
            value_template: "{{ event_type == 'homemodel_updated' }}"
        sequence:
          - event: homemodel_updated  # must be static
            event_data:
              json: "{{ json_file.data | tojson }}"
      - conditions:
          - condition: template
            value_template: "{{ event_type == 'robot_offsets_updated' }}"
        sequence:
          - event: robot_offsets_updated  # must be static
            event_data:
              json: "{{ json_file.data | tojson }}"
    default:
      [...]  # send me a notification if there is a new event type I haven't set up yet
mode: queued
max: 10

And that’s it!
I can now send any robot to any POI using a single universal coordinate. The math is handled on the fly:

  x_coordinate: >-
    {{ state_attr('sensor.homemodel', 'data')['office']['pois']['my_poi_i_want_to_go'][0] | float }}
  y_coordinate: >-
    {{ state_attr('sensor.homemodel', 'data')['office']['pois']['my_poi_i_want_to_go'][1] | float }}
  wall_e_x_coordinate: >-
    {{ x_coordinate * state_attr('sensor.robot_offsets',
    'data')['wall_e']['scale_x'] | float + state_attr('sensor.robot_offsets',
    'data')['wall_e']['offset_x'] | float }}
  wall_e_y_coordinate: >-
    {{ y_coordinate * state_attr('sensor.robot_offsets',
    'data')['wall_e']['scale_y'] | float + state_attr('sensor.robot_offsets',
    'data')['wall_e']['offset_y'] | float }}

(Note: To use with the variables code block inside an automation or script)
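For completeness, here is a hypothetical script sketch tying the pieces together. It assumes the Xiaomi Miio integration’s raw `app_goto_target` command (a Valetudo/MQTT setup would publish a GoTo command instead); the script name and fields are mine, not part of my actual config:

```yaml
# Hypothetical script: send WALL-E to a named POI from homemodel.json.
send_wall_e_to_poi:
  fields:
    room:
      example: office
    poi:
      example: "Under the table"
  sequence:
    - variables:
        # Universal coordinates of the requested POI
        x_coordinate: "{{ state_attr('sensor.homemodel', 'data')[room]['pois'][poi][0] | float }}"
        y_coordinate: "{{ state_attr('sensor.homemodel', 'data')[room]['pois'][poi][1] | float }}"
        offsets: "{{ state_attr('sensor.robot_offsets', 'data')['wall_e'] }}"
    - action: vacuum.send_command
      target:
        entity_id: vacuum.wall_e
      data:
        command: app_goto_target
        params:
          # Translate universal -> WALL-E's native frame; the firmware expects integers
          - "{{ (x_coordinate * offsets['scale_x'] + offsets['offset_x']) | round | int }}"
          - "{{ (y_coordinate * offsets['scale_y'] + offsets['offset_y']) | round | int }}"
```

Swapping `wall_e` for another robot’s key in robot_offset.json is all it takes to send a different robot to the same POI.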

Next Goal: Giving WALL-E a voice. He’s ready to talk!