HLK-LD2450 Initial experiments to connect to HomeAssistant

It is unusable. Hi-Link needs to fix the firmware.
Ghosting is crazy. You can actually see the issues: for example, finding the intersections of detection circles causes you to appear twice, in two different places, and so on.

I asked them to modify the firmware so that it delivers some raw data, which I could then process outside the sensor, myself, better. No answer.

Same with the LD2450: I wanted raw data to fix the issues and to connect multiple sensors into one big array. But it seems they are not interested.

300k+ HA users, and even more on other platforms, is apparently not an interesting market for Hi-Link.

3 Likes

Thanks for the feedback, it hopefully saved me a ton of time. I’ve been using the 1125H, and tried the 2410C today; it has been good so far. Please let me know of any real-world feedback on these sensors (2410, 2420, etc.).

Thanks again!!!

I am using 2410C units and all of them are working great. The only issue is that detection is based on radius, while all my rooms are square, and I have a lot of open walls :slight_smile:
I tried to create an array of multiple 2410s, one in each corner of the room, and then tried multiple algorithms on top of it; basically it comes down to finding the points where the distance (radius) readings of most radars intersect.

It works, but… yeah, works.
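For anyone curious what "finding points where most radii intersect" can look like in practice, here is a minimal sketch of one common approach, range-only multilateration via linearized least squares. Everything here is an assumption for illustration: the anchor positions, units, and the function name are made up, and a real setup would need outlier handling for the ghost readings discussed above.

```python
def multilaterate(anchors, ranges):
    """Estimate a 2D position from range-only readings of several radars.

    Subtracts the first circle equation from the others to get linear
    equations in (x, y), then solves the 2x2 least-squares normal
    equations directly. Needs at least 3 anchors, not all collinear.
    """
    (x0, y0), r0 = anchors[0], ranges[0]
    # One linear row per remaining anchor:
    #   2(xi - x0) x + 2(yi - y0) y = (xi^2 + yi^2 - ri^2) - (x0^2 + y0^2 - r0^2)
    rows, rhs = [], []
    for (xi, yi), ri in zip(anchors[1:], ranges[1:]):
        rows.append((2 * (xi - x0), 2 * (yi - y0)))
        rhs.append((xi**2 + yi**2 - ri**2) - (x0**2 + y0**2 - r0**2))
    # Normal equations (A^T A) p = A^T b, solved with the 2x2 inverse
    s_aa = sum(a * a for a, _ in rows)
    s_ab = sum(a * c for a, c in rows)
    s_bb = sum(c * c for _, c in rows)
    t_a = sum(a * v for (a, _), v in zip(rows, rhs))
    t_b = sum(c * v for (_, c), v in zip(rows, rhs))
    det = s_aa * s_bb - s_ab * s_ab
    x = (s_bb * t_a - s_ab * t_b) / det
    y = (s_aa * t_b - s_ab * t_a) / det
    return x, y
```

With noisy radii the linearization gives a rough fix that a nonlinear refinement (or a Kalman filter per target) could improve; this is just the intersection idea in its simplest form.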

Next I tried to do the same with multiple LD2450s. The result was much better; however, the LD2450 had lag, which also made it pointless.
The information returned from the sensor arrived after I was already in the room, so it does not solve my problem.
Another issue with the LD2450 is that there is some magic algorithm on the chip: if there are more people in the area, the sensor often interprets them as one person moving very fast between places.
So as long as Hi-Link does not open the firmware, or at least provide rawer and faster data, it is not what I am looking for.
Even if they returned only a point cloud, without classification and object tracking, it would be great.

Right now I have ordered a Texas Instruments development board with an antenna and a 60 GHz radar. I will try to test it and build my own solution based on this chip (let’s see).

Here is an example:
https://dev.ti.com/tirex/explore/content/radar_toolbox_1_30_01_03/source/ti/examples/People_Tracking/3D_People_Tracking/docs/3d_people_tracking_user_guide.html

1 Like

@athua

Based on your map code and a little modification, we built an online guide (and code tool) for the 2A: Add a map card to 2A

It works great.

Thanks :)

3 Likes

@athua

https://www.reddit.com/r/screekworkshop/comments/197294n/2a_dynamic_map_card_code_update_201401015_hide_0/

Hi, based on your work we made some changes again: we managed to hide the 0-coordinate target and refined the code generator. Please try them out; maybe you can use some of them in your original code.

UPDATE:

HACKING OF THE LD2450 RADAR

From the specification I was able to obtain, it appears that Hi-Link’s radar is a close clone or a modified S5KM312CL.

I managed to get access to some of the documentation, which indicates that by setting a specific state on one of the circuit pins, this device can be switched into a mode where it sends additional data. I marked the pin in the photos. Using makeshift methods overnight, I connected to the mentioned pin and managed to read, or at least receive, FFT data: range, doppler, peak. I’m still struggling with the format of this data and the packet frames.
It’s possible that I’m mistaken and the data I’m receiving is something different, especially since the expected baud rate is 115200, but the data resembles what I want to achieve only at 921600.

It’s also possible that the data format for Hi-Link is different, so I ordered RD-03D which are officially built on S5KM312CL and should be compatible with the documentation (time will tell).
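Since the frame format is still unknown, one way to start making sense of a capture like this is a small re-sync helper that hunts for a candidate header in the byte stream. To be clear, the header bytes and frame length below are pure placeholders, not the real Hi-Link or S5KM312CL format:

```python
def split_frames(stream: bytes, header: bytes, frame_len: int) -> list[bytes]:
    """Extract fixed-length frames from a raw byte stream.

    Scans for `header`, takes `frame_len` bytes, then re-syncs on the
    next header so a few corrupted bytes don't derail the whole capture.
    """
    frames = []
    i = stream.find(header)
    while i != -1 and i + frame_len <= len(stream):
        frames.append(stream[i:i + frame_len])
        i = stream.find(header, i + frame_len)
    return frames
```

With pyserial, the input could come from something like `serial.Serial("/dev/ttyUSB0", 921600).read(4096)` (the port name is an assumption). Trying a few candidate headers and lengths and checking which one yields frames of consistent structure is a cheap way to test the 115200-vs-921600 hypothesis.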

ORIGINAL POST TEXT:

After an extensive search, I’ve discovered that many sensors, including HiLink and AiThinker, seem to be clones of the S5KM312CL, which itself might be a clone of another sensor – it’s a complex maze.

Up to this point, my research indicates that Iclegend offers the most extensive configuration options. Notably, their 1T1R/1T2R sensors bear a striking resemblance to those from HiLink.

A key discovery I’ve made is that setting certain pins to a high state for a few nanoseconds after startup or reboot switches the UART to raw mode, unleashing a wealth of data.

The resources available are comprehensive, including source codes, datasheets, advanced calibration/configuration tools, and exhaustive documentation. However, purchasing these tools is a major challenge. My attempts to use Iclegend’s tools on Hi-Link sensors have been unsuccessful, likely due to some customized internal configurations that prevent compatibility with the original tools.

Another obstacle is the cost and availability of Iclegend’s development board, priced steeply at $250 and seemingly impossible to purchase. Access to most tools and documentation is restricted, requiring proof of purchase.

Additionally, I’ve learned that this radar system can operate in a master/slave mode, akin to the LD2461.

In summary, there’s a wealth of information and potential here. The optimal route seems to be purchasing the 1T2R sensor from Iclegend, complete with their support, and possibly designing custom PCBs for managing raw data transmission. The less desirable option involves hacking the LD2450 and modifying it to enable raw data mode through specific pin settings.

But the possibilities of their tools and firmware options are crazy.

A few links:

7 Likes

Your updates to the map code sound great.
Unfortunately my internet connection, home server, and network switch were taken out by a lightning strike, so I haven’t been able to test them. I’m currently sourcing parts to get my Home Assistant environment back up again.

1 Like

I’m sorry to hear that, and I hope you’ll be back soon.

That’s a lot of very hardcore information, and yes, it looks like they are all a family of products based on one core chip. That OP looks interesting.

I was looking at the LD2461 as a new “fun project”, but it seems it’s probably not yet time to trust these radar devices… it’s also quite expensive (15€ on AliExpress is a lot :D)
Did I get that right?

1 Like

Good morning @SummerRain

Great box and nice work.
Is it possible for you to share your STP files for 3D printing?

Thanks in advance

Sorry for the stupid question, but how could I use it? Through HACS?
Thanks

Just add this to your YAML file:

esphome:
  name: ...
  friendly_name: ...

external_components:
  - source:
      type: git
      url: https://github.com/uncle-yura/esphome-ld2450
      ref: master
    components: [ ld2450 ]

like this

esphome:
  name: hlk-presence
  friendly_name: hlk-presence

external_components:
  - source:
      type: git
      url: https://github.com/uncle-yura/esphome-ld2450
      ref: master
    components: [ ld2450 ]

esp32:
  board: esp32dev
  framework:
    type: arduino

# Enable logging
logger:
  baud_rate: 0

api:
  encryption:
    key: "....."
ota:
  password: "........"

wifi:
  ssid: !secret wifi_ssid
  password: !secret wifi_password

captive_portal:   
web_server:
  port: 80
uart:
  id: uart_bus
  tx_pin: 
    number: GPIO17
    mode:
      input: true
      pullup: true
  rx_pin: 
    number: GPIO16
    mode:
      input: true
      pullup: true
  baud_rate: 256000
  parity: NONE
  stop_bits: 1
ld2450:
  uart_id: uart_bus
  update_interval: 1s
  invert_y: false
  invert_x: false

binary_sensor:
- platform: ld2450
  has_target:
    name: Presence
  has_moving_target:
    name: Moving Target
  has_still_target:
    name: Still Target
  presence_regions:
    - name: "Custom Presence Region 0"
      region_id: presence_region_0

text_sensor:
  - platform: ld2450
    version:
      name: "FW"
    mac_address:
      name: "MAC address"

button:
  - platform: restart
    name: "ESP Restart"
  - platform: ld2450
    factory_reset:
      name: "Factory reset"
    reboot:
      name: "Reboot"

switch:
  - platform: ld2450
    single_target:
      name: "Single target"
    bluetooth:
      name: "Bluetooth"

number:
- platform: ld2450
  rotate:
    restore_value: true
    initial_value: 0
    name: "Rotate angle"
  presence_timeout:
    name: "Presence timeout"
  presence_regions:
    - x0: 100
      y0: 100
      x1: 200
      y1: 200
      id: presence_region_0
  entry_points:
    - x: 0
      y: 0
  region_0:
    x0:
      name: R0X0
    y0:
      name: R0Y0
    x1:
      name: R0X1
    y1:
      name: R0Y1

  region_1:
    x0:
      name: R1X0
    y0:
      name: R1Y0
    x1:
      name: R1X1
    y1:
      name: R1Y1

  region_2:
    x0:
      name: R2X0
    y0:
      name: R2Y0
    x1:
      name: R2X1
    y1:
      name: R2Y1

select:
  - platform: ld2450
    baud_rate:
      name: "Baud rate"
    regions_type:
      name: "Regions type"

sensor:
- platform: ld2450
  target_count:
    name: Target count

  person_0:
    position_x:
      name: "P0X"  

    position_y:
      name: "P0Y"  

    speed:
      name: "S0"  

    resolution:
      name: "R0"  

  person_1:
    position_x:
      name: "P1X"  

    position_y:
      name: "P1Y"  

    speed:
      name: "S1"  

    resolution:
      name: "R1"  

  person_2:
    position_x:
      name: "P2X"  

    position_y:
      name: "P2Y"  

    speed:
      name: "S2"  

    resolution:
      name: "R2"

But where can I find more instructions on how to properly set up and visualise the zones?

I was able to do it in Python with the Plotly library, but I lack the knowledge of the “$ex” variables and how to fill in the data properly from the sensor.

import plotly.graph_objects as go
import random

# Function to generate random positions simulating a person moving in a room (5 sample points)
def generate_random_positions():
    x_positions = [random.uniform(-2000, 2000) for _ in range(5)]
    y_positions = [random.uniform(2000, 4000) for _ in range(5)]
    return x_positions, y_positions

# Coordinates for the first person
x_data_osoba1, y_data_osoba1 = generate_random_positions()

# Coordinates for the second person
x_data_osoba2, y_data_osoba2 = generate_random_positions()

# Current position of the first person (last point in data for the first person)
aktualni_x_osoba1 = x_data_osoba1[-1]
aktualni_y_osoba1 = y_data_osoba1[-1]

# Current position of the second person (last point in data for the second person)
aktualni_x_osoba2 = x_data_osoba2[-1]
aktualni_y_osoba2 = y_data_osoba2[-1]

# Coordinates for the polygon area
area_x = [0, 6000 * (3**0.5) / 2, 4500, 4000, 3000, 2000, 1000, 0, -1000, -2000, -3000, -4000, -4500, -6000 * (3**0.5) / 2, 0]
area_y = [0, 6000 / 2, (6000**2 - 4500**2)**0.5, (6000**2 - 4000**2)**0.5, (6000**2 - 3000**2)**0.5, (6000**2 - 2000**2)**0.5,
          (6000**2 - 1000**2)**0.5, 6000, (6000**2 - 1000**2)**0.5, (6000**2 - 2000**2)**0.5, (6000**2 - 3000**2)**0.5,
          (6000**2 - 4000**2)**0.5, (6000**2 - 4500**2)**0.5, 6000 / 2, 0]

# Create a trace for the first person's movement
trace_osoba1 = go.Scatter(x=x_data_osoba1, y=y_data_osoba1, mode='lines+markers', marker=dict(size=10, color='blue'), line=dict(width=2, color='gray'))

# Create a point for the current position of the first person
trace_aktualni_osoba1 = go.Scatter(x=[aktualni_x_osoba1], y=[aktualni_y_osoba1], mode='markers', marker=dict(size=15, color='red'))

# Create a trace for the second person's movement
trace_osoba2 = go.Scatter(x=x_data_osoba2, y=y_data_osoba2, mode='lines+markers', marker=dict(size=10, color='green'), line=dict(width=2, color='gray'))

# Create a point for the current position of the second person
trace_aktualni_osoba2 = go.Scatter(x=[aktualni_x_osoba2], y=[aktualni_y_osoba2], mode='markers', marker=dict(size=15, color='orange'))

# Create a trace for the filled area
trace_area = go.Scatter(x=area_x, y=area_y, fill='toself', fillcolor='rgba(168, 216, 234, 0.15)', line=dict(shape='linear', width=1, dash='dot'))

# Create a layout for the graph
layout = go.Layout(title='Movement of people in the room over the last 5 minutes', xaxis=dict(title='X axis'), yaxis=dict(title='Y axis'))

# Create a figure object combining all traces and layout
fig = go.Figure(data=[trace_osoba1, trace_aktualni_osoba1, trace_osoba2, trace_aktualni_osoba2, trace_area], layout=layout)

# Show the graph
fig.show()

You can run this in the Python console (“>>>”).

Looks like this:

@athua Do you think it would be possible?

Just store the history for each target in an array, then display this history.

//some pseudocode

  const tracks = [[],[],[]]
  const numberOfFramesToRemember = 10

  mqtt_client.on('update', ([targetCoords1,targetCoords2,targetCoords3]) => {
    tracks[0].push(targetCoords1)
    tracks[1].push(targetCoords2)
    tracks[2].push(targetCoords3)

    tracks[0] = tracks[0].slice(-numberOfFramesToRemember)
    tracks[1] = tracks[1].slice(-numberOfFramesToRemember)
    tracks[2] = tracks[2].slice(-numberOfFramesToRemember)
  })

and to display something like that:


  for (const track of tracks) {
    // build an SVG path string like "Mx,y Lx,y Lx,y" from the stored points
    const d = track.map((c, i) => (i === 0 ? `M${c}` : `L${c}`)).join(' ')
    path.setAttribute('d', d)
  }

or if you want to make it fancy:

  //some pseudocode

  for (const track of tracks) {
    const [
        [_na,[x,y]],
        ...segments
      ] = track
        .reduceRight(
        (p,c) => [
          ...p,
          [ p.at(-1)?.at(-1), c ]
        ],
        []
      )

    svg.drawCircle(x, y, radius: 10)
    let iter = 0

    for (const [[x1,y1],[x2,y2]] of segments) {
      svg.line(
        x1, y1,
        x2, y2,
        strokeWidth: 2,
        strokeOpacity: iter/segments.length
      )
      iter++
    }
  }

Hi, I have used your code but I cannot see my position in the chart below.

This is the YAML:

type: custom:plotly-graph
title: Target Positions
refresh_interval: 1
hours_to_show: current_day
layout:
  height: 230
  margin:
    l: 50
    r: 20
    t: 20
    b: 40
  showlegend: true
  xaxis:
    dtick: 1000
    gridcolor: RGBA(200,200,200,0.15)
    zerolinecolor: RGBA(200,200,200,0.15)
    type: number
    fixedrange: true
    range:
      - 4000
      - -4000
  yaxis:
    dtick: 1000
    gridcolor: RGBA(200,200,200,0.15)
    zerolinecolor: RGBA(200,200,200,0.15)
    scaleanchor: x
    scaleratio: 1
    fixedrange: true
    range:
      - 7500
      - 0
entities:
  - entity: ''
    name: Target1
    marker:
      size: 12
    line:
      shape: spline
      width: 5
    x:
      - $ex hass.states["sensor.hlk_presence_p0x"].state
    'y':
      - $ex hass.states["sensor.hlk_presence_p0y"].state
  - entity: ''
    name: Target2
    marker:
      size: 12
    line:
      shape: spline
      width: 5
    x:
      - $ex hass.states["sensor.hlk_presence_p1x"].state
    'y':
      - $ex hass.states["sensor.hlk_presence_p1y"].state
  - entity: ''
    name: Target3
    marker:
      size: 12
    line:
      shape: spline
      width: 5
    x:
      - $ex hass.states["sensor.hlk_presence_p2x"].state
    'y':
      - $ex hass.states["sensor.hlk_presence_p2y"].state
  - entity: ''
    name: Zone1
    mode: lines
    fill: toself
    fillcolor: RGBA(20,200,0,0.06)
    line:
      color: RGBA(20,200,0,0.2)
      shape: line
      width: 2
    x:
      - $ex hass.states["number.hlk_presence_r0x0"].state
      - $ex hass.states["number.hlk_presence_r0x0"].state
      - $ex hass.states["number.hlk_presence_r0x1"].state
      - $ex hass.states["number.hlk_presence_r0x1"].state
      - $ex hass.states["number.hlk_presence_r0x0"].state
    'y':
      - $ex hass.states["number.hlk_presence_r0y0"].state
      - $ex hass.states["number.hlk_presence_r0y0"].state
      - $ex hass.states["number.hlk_presence_r0y1"].state
      - $ex hass.states["number.hlk_presence_r0y1"].state
      - $ex hass.states["number.hlk_presence_r0y0"].state
  - entity: ''
    name: Zone2
    mode: lines
    fill: toself
    fillcolor: RGBA(200,0,255,0.06)
    line:
      color: RGBA(200,0,255,0.2)
      shape: line
      width: 2
    x:
      - $ex hass.states["number.hlk_presence_r1x0"].state
      - $ex hass.states["number.hlk_presence_r1x0"].state
      - $ex hass.states["number.hlk_presence_r1x1"].state
      - $ex hass.states["number.hlk_presence_r1x1"].state
      - $ex hass.states["number.hlk_presence_r1x0"].state
    'y':
      - $ex hass.states["number.hlk_presence_r1y0"].state
      - $ex hass.states["number.hlk_presence_r1y0"].state
      - $ex hass.states["number.hlk_presence_r1y1"].state
      - $ex hass.states["number.hlk_presence_r1y1"].state
      - $ex hass.states["number.hlk_presence_r1y0"].state
  - entity: ''
    name: Zone3
    mode: lines
    fill: toself
    fillcolor: RGBA(200,120,55,0.06)
    line:
      color: RGBA(200,120,55,0.2)
      shape: line
      width: 2
    x:
      - $ex hass.states["number.hlk_presence_r2x0"].state
      - $ex hass.states["number.hlk_presence_r2x0"].state
      - $ex hass.states["number.hlk_presence_r2x1"].state
      - $ex hass.states["number.hlk_presence_r2x1"].state
      - $ex hass.states["number.hlk_presence_r2x0"].state
    'y':
      - $ex hass.states["number.hlk_presence_r2y0"].state
      - $ex hass.states["number.hlk_presence_r2y0"].state
      - $ex hass.states["number.hlk_presence_r2y1"].state
      - $ex hass.states["number.hlk_presence_r2y1"].state
      - $ex hass.states["number.hlk_presence_r2y0"].state
  - entity: ''
    name: Coverage
    mode: lines
    fill: tonexty
    fillcolor: rgba(168, 216, 234, 0.15)
    line:
      shape: line
      width: 1
      dash: dot
    x:
      - 0
      - $ex 7500 * Math.sin((2 * Math.PI)/360 * 60)
      - 4500
      - 4000
      - 3000
      - 2000
      - 1000
      - 0
      - -1000
      - -2000
      - -3000
      - -4000
      - -4500
      - $ex -7500 * Math.sin((2 * Math.PI)/360 * 60)
      - 0
    'y':
      - 0
      - $ex 7500 * Math.cos((2 * Math.PI)/360 * 60)
      - $ex Math.sqrt( 7500**2 - 4500**2 )
      - $ex Math.sqrt( 7500**2 - 4000**2 )
      - $ex Math.sqrt( 7500**2 - 3000**2 )
      - $ex Math.sqrt( 7500**2 - 2000**2 )
      - $ex Math.sqrt( 7500**2 - 1000**2 )
      - 7500
      - $ex Math.sqrt( 7500**2 - 1000**2 )
      - $ex Math.sqrt( 7500**2 - 2000**2 )
      - $ex Math.sqrt( 7500**2 - 3000**2 )
      - $ex Math.sqrt( 7500**2 - 4000**2 )
      - $ex Math.sqrt( 7500**2 - 4500**2 )
      - $ex 7500 * Math.cos((2 * Math.PI)/360 * 60)
      - 0
raw_plotly_config: true

What did I do wrong?

Home Assistant already has the “history”.
If you create a 3D graph with x:y:time, where time runs from -10 min to now,
and then look at it from a “top” perspective, it will look the same as the code I shared.

1 Like

Can you share files for 3D printing?

1 Like

I believe I have understood the reason, but I don’t know how to fix it.
The LD2450 is providing data in cm while the Plotly graph has been designed in mm, so all movements end up too close to zero.

What is the easy way to change the Plotly YAML?

Solved with sensor_templates.yaml
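For anyone hitting the same unit mismatch, a template along these lines could do the rescaling. This is only a sketch: the entity IDs follow the earlier ESPHome config in this thread, and the ×10 cm→mm factor is an assumption.

```yaml
# sensor_templates.yaml (included via "template: !include sensor_templates.yaml")
# Entity IDs and the x10 cm -> mm factor are assumptions; adjust to your setup.
- sensor:
    - name: "P0X mm"
      unit_of_measurement: "mm"
      state: "{{ states('sensor.hlk_presence_p0x') | float(0) * 10 }}"
    - name: "P0Y mm"
      unit_of_measurement: "mm"
      state: "{{ states('sensor.hlk_presence_p0y') | float(0) * 10 }}"
```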