We've released our sensor on eBay now:
Welcome to join our group to talk about it: Screek Workshop | Facebook
Can anybody shed some light on this error? I've just received a device, and when powered up it's successfully found in the (Android) app. When I select it, I'm immediately taken to the firmware upgrade screen and get this error:
The device is powered with 5V (I've tried several sources just in case) and consistently reports a good RSSI value. I haven't yet connected via UART; that's a job for the morning.
I have exactly the same issue when I want to upgrade the firmware.
I've tried different approaches, but haven't found a solution.
I've reported it to the LD2450 manufacturer, hoping they reply or act on it.
I have also sent a message; let's hope we get a solution.
Here's what I've found from this topic and from conversation with HLK: the only way HLK currently provides to communicate with the device is via a serial link, not via Bluetooth. There is a Windows application that will "plot" the detected movement, but it communicates over the serial port ONLY.
While the LD2450 is discovered over Bluetooth by the phone app, no actual communication with the device is possible that way.
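For anyone going the serial route, here is a minimal, hypothetical C++ sketch of decoding one LD2450 report frame. It assumes the layout the ESPHome configs in this thread appear to work with: 4 header bytes, then 3 targets of 8 bytes each (x, y, speed, distance resolution, all little-endian 16-bit), then a 2-byte tail. The coordinate sign convention (high bit set means positive) is an assumption; verify it against HLK's serial protocol document before relying on it.

```cpp
#include <cstdint>
#include <cstddef>

// One decoded target, in the units the sensor appears to report.
struct Target {
  int16_t x_mm;       // lateral offset from the sensor
  int16_t y_mm;       // distance in front of the sensor
  int16_t speed_cms;  // radial speed
  uint16_t res_mm;    // distance resolution
};

// Assumed sign convention: if the most significant bit is set, the value
// is positive (raw - 0x8000); otherwise it is negative. Check HLK's docs.
static int16_t decode_coord(uint8_t lo, uint8_t hi) {
  uint16_t raw = uint16_t(uint16_t(hi) << 8 | lo);
  return (raw & 0x8000) ? int16_t(raw - 0x8000) : int16_t(-int16_t(raw));
}

// Parse a 30-byte frame: AA FF 03 00, 3 x 8 target bytes, 55 CC tail.
static bool parse_frame(const uint8_t *bytes, size_t len, Target out[3]) {
  if (len < 30 || bytes[0] != 0xAA || bytes[1] != 0xFF) return false;
  for (int i = 0; i < 3; i++) {
    const uint8_t *p = bytes + 4 + i * 8;
    out[i].x_mm = decode_coord(p[0], p[1]);
    out[i].y_mm = decode_coord(p[2], p[3]);
    out[i].speed_cms = decode_coord(p[4], p[5]);
    out[i].res_mm = uint16_t(uint16_t(p[7]) << 8 | p[6]);
  }
  return true;
}
```

With this layout, target 3's distance resolution lands in bytes 26 and 27 of the frame, which lines up with the byte indices discussed later in this thread.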
Using https://github.com/tsunglung/esphome-ld2450
from @screek-workshop, I've been able to get movement/distance info from my LD2450. I used the 8-pin header on the device rather than the connector.
So if you want to experiment with the device, I’d recommend getting the full kit rather than the bare device as I did!
Also worth noting that the device gets quite warm to the touch!
I hope this helps people!
This usually happens because the firmware is already up to date.
Regarding this firmware mechanism, the response we got from HLK was this:
This update software is really quite misleading.
Hello friend, this is Stella from Shenzhen Hi-Link Electronic Co., Ltd. Glad to see you are using our products, and it's lucky for me to see the usage process you've shared; it is really great. Thank you very much for your support and trust in our products. If you need help with any questions, please feel free to contact me.
Email: [email protected]
Whatsapp: +86 15986863266
Sorry for not noticing last time. We're doing some experimenting, and we'd like to be able to simply set some coordinates in the ESPHome config for things like this.
Hi @screek-workshop ,
No problem. Would it be possible to make the settings inside Home Assistant? That would be easier than changing the config each time. Or even better: visualise the zones on a map in Home Assistant where the user can set them.
Hope you can get it working soon.
All the best!
It's pretty much zero-configuration, since HLK doesn't provide an open interface for setting it up at the moment. Visualising a map is a great idea, but I'm afraid it's beyond our programming capabilities: it would require HTML-related skills, and that kind of implementation would probably need custom card support. A Lovelace card on the dashboard seems very well suited to rendering that kind of data.
A basic use for it at the moment could be to take any of the Y-axis values and derive presence over a range of distances, as an alternative to the LD2410. It has better immunity to interference than the LD2410, can filter out energy signatures like fans and small animals, and it seems to have some rough understanding of the human form.
We hope to help popularize it in the open-source community through inexpensive, cost-effective hardware, so that everyone's combined efforts can make it more practical.
How precisely can it measure distance? Can I use it to track a swing gate's angle when it's positioned close to the centre of rotation?
Overall we think it's more stable and accurate, but we don't quite understand the swing gate use case, sorry.
I just started experimenting with the LD2450 and your ESPHome YAML from https://github.com/screekworkshop/screek-human-sensor/blob/82ad16d8420d6bacdba63df08bbc4633f1185a79/2a/yaml/screek-humen-sensor-2a.yaml
That YAML is working very well, thank you! What happens under the hood of the sensor is still a bit of a jungle, unfortunately. It seems to work reasonably well with one target, but it can get quite messy when multiple people are in the room. Also, just sitting still sometimes gives quite a variable location on the X axis; Y seems better. Perhaps future firmware updates will improve this.
Anyway, I'm looking forward to experimenting with this sensor more (I've also ordered your version from eBay). Maybe when I have some more free time in the coming weeks I will try to create some sort of custom map/location card for the UI to visualize the targets in the room.
Thanks again!
Edit:
Just now I'm testing the sensor placed beneath my TV. My kid is watching TV on the couch and the sensor displays this nicely, but every now and then it suddenly sees another target at the same Y distance but at a different X (about 1 to 1.5 metres further to the right). I have no idea what happens there. However, I still think it can be quite a useful sensor to determine whether a certain location is occupied, by checking if any of the three targets is between certain X and Y values.
(Lastly, the lambda code in the YAML file is incorrect for p3_distance_resolution, which should be using bytes 27 and 26, I think?)
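That "any of the three targets inside an X/Y box" idea can be sketched in a few lines of C++. This is an illustrative fragment, not code from the YAML in this thread; the zone bounds are made-up examples, and the assumption that an absent target reports (0, 0) should be verified against the sensor's actual behaviour.

```cpp
#include <cstdint>

// A rectangular occupancy zone in sensor coordinates, in millimetres:
// x is the lateral offset, y the distance in front of the sensor.
struct Zone {
  int16_t x_min, x_max;
  int16_t y_min, y_max;
};

static bool target_in_zone(int16_t x, int16_t y, const Zone &z) {
  return x >= z.x_min && x <= z.x_max && y >= z.y_min && y <= z.y_max;
}

// xs/ys hold the three reported targets. A target at exactly (0, 0) is
// treated as "no target" -- an assumption about how absence is signalled.
static bool zone_occupied(const int16_t xs[3], const int16_t ys[3],
                          const Zone &z) {
  for (int i = 0; i < 3; i++) {
    if (xs[i] == 0 && ys[i] == 0) continue;  // skip empty slots
    if (target_in_zone(xs[i], ys[i], z)) return true;
  }
  return false;
}
```

In Home Assistant this could equally be done with a template binary sensor over the three X/Y entities; the logic is the same.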
Regarding 2A's multi-target jumping issue:
Some users have reported that the 2A radar's multi-target detection jumps suddenly: recognition is very good with a single target, but there is some error with multiple targets.
We talked to Hi-Link's development department and got this feedback:
Thanks for the feedback on the YAML; we'll check it out. Since it's rare to have all three targets active at the same time, our testing may have been insufficient.
We released a new beta firmware today that, for scenarios like this, adds a quick global indication of whether anyone is present at all.
Beta firmware for 2A
We're really looking forward to your custom UI visualization component; I think that's where it really starts to get cool.
I’m working on some kind of UI to show the LD2450 X and Y coordinates in Home Assistant,
This is how it looks at the moment, with live data from Home Assistant:
That’s pretty cool! We are looking forward to using it soon.
The distance resolution of p3 was indeed a bug; we have fixed it. Thank you very much for your feedback!
Now it is:
int16_t p3_distance_resolution = int16_t((uint16_t(bytes[27]) << 8) | bytes[26]);
You can fix it in your yaml.
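As a standalone sanity check of the corrected byte order (byte 26 is the low byte, byte 27 the high byte), the same extraction can be written outside the lambda. The function name and the sample values below are made up for illustration.

```cpp
#include <cstdint>
#include <vector>

// Corrected little-endian extraction of target 3's distance resolution:
// byte 26 is the low byte, byte 27 the high byte of the 16-bit value.
uint16_t p3_distance_resolution(const std::vector<uint8_t> &bytes) {
  return uint16_t((uint16_t(bytes[27]) << 8) | bytes[26]);
}
```

For example, bytes 26 and 27 holding 0x68 and 0x01 decode to 0x0168, i.e. 360 mm.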