I just bought two black/white/red e-ink Bluetooth LE displays from Gicisky, sold as ESLs (electronic shelf labels) for stores to display price information. There are several solutions for using such ESL displays for your own purposes, which all center around the OpenEPaperLink project. However, almost no other ESLs use Bluetooth, so they all need a special AP, typically built on an ESP32.
The Gicisky labels are different. They use BLE and have a very simple protocol. Aaron Christophel (atc1441) has done a tremendous job reverse-engineering this protocol, and offers a simple site that can be used to update the display: ATC1441 BLE E-Paper Uploader
I would like to use my Gicisky displays directly from Home Assistant, without going through the ESP32/OpenEPaperLink route. Since I have Bluetooth support on my HA host, this should be doable.
In principle, this is not difficult: The protocol, as implemented in javascript by atc1441, needs to be rewritten in Python, and added as a custom component. It just needs doing. But before I start hacking on this, I just wanted to check if anyone already has done this?
I am in the same situation - I have the 2.9" GICISKY here and hoped to bring it into HA directly (without going via OpenEPaperLink) - but did not get any further.
The GICISKY sales support I asked only replied with “it is just a display function” and “it cannot support” [receiving data directly from HA].
I saw your comment below the YouTube video about “fiddling with the screen size” to make it work with the browser updater. Which parameters did you land on? I seem to get “broken” images only.
I also have a 2.9" version, with BWR. What I did to finally get it to work was to select “296x128 BWR” in the drop-down menu. This will give you raw type “0032”. (I don’t know what the “Decode” button does; it seems ineffective.)
I then made sure that width was 128 (don’t recall if this was done automatically) and height 296. Press “Create Canvas” to get a new fresh image of that size. Turn on “Second color” (to see demo of red color), and turn off “Compression” (important! That turns on automatically when changing format, but seems to not work properly, at least for me). Turn off “Mirror” should it be on.
Enter some text, press “Enter text”. I also press “Get pixel data” but I’m not sure it’s needed.
Now you should be able to press first “Connect”, selecting your ESL, and then press “Upload image”.
And, with some luck, you should see an unscrambled image! Note that black and red are reversed when compared to the image on the web page.
The raw type is apparently something you should be able to get from the “advertising data”, which likely describes what capabilities your display has. My guess is that the value for our (or at least my) display is wrong; if I tracked everything correctly, I believe it is just the “compression” flag that is incorrect. (But weirdly, the size of the canvas and the stated display size do not match.)
I have been trying to figure out how to get that “advertising data”; it seems to be deeply hidden in the Bluetooth stack.
If you want to match service data with a 16 bit uuid, you will have to convert it to a 128 bit uuid first, by replacing the 3rd and 4th byte in 00000000-0000-1000-8000-00805f9b34fb with the 16 bit uuid. For example, for Switchbot sensor devices, the 16 bit uuid is 0xfd3d, the corresponding 128 bit uuid becomes 0000fd3d-0000-1000-8000-00805f9b34fb.
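In Python, that byte replacement boils down to formatting the 16-bit value into the Bluetooth base UUID:

```python
def uuid16_to_uuid128(uuid16: int) -> str:
    """Expand a 16-bit Bluetooth SIG UUID into its full 128-bit form
    by substituting it into the standard base UUID."""
    return f"0000{uuid16:04x}-0000-1000-8000-00805f9b34fb"

print(uuid16_to_uuid128(0xFD3D))  # 0000fd3d-0000-1000-8000-00805f9b34fb
```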
So if we create a component with a `manifest.json` like below, I think that means that Home Assistant will automatically detect a Gicisky ESL and suggest that component:
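A minimal sketch of such a manifest. The `domain`, name, and `documentation` URL are placeholders, and `0000fef0-…` assumes the service UUID the ATC1441 uploader connects to; verify it against your own device's advertisement before relying on it:

```json
{
  "domain": "gicisky_esl",
  "name": "Gicisky ESL",
  "version": "0.1.0",
  "config_flow": true,
  "dependencies": ["bluetooth_adapters"],
  "bluetooth": [
    { "service_uuid": "0000fef0-0000-1000-8000-00805f9b34fb" }
  ],
  "codeowners": [],
  "documentation": "https://example.com",
  "iot_class": "local_push",
  "requirements": []
}
```

With a `bluetooth` matcher like this, Home Assistant's Bluetooth integration should trigger the component's discovery flow whenever it sees a device advertising that service.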
I did the same before - except the “Get Pixel Data” and that seems to do the trick. Now the picture is (almost) as in the preview. Only difference is that red and black are switched (Top text is red, bottom text is black).
In the OpenEPaperLink documentation there is a “BLE only AP” mentioned. But I did not have time to dig into that. At least we already know that it can be done with only one ESP32-S3 according to this video:
My understanding is that people have already successfully used these ESLs with OpenEPaperLink and the corresponding HA integration. If you have, or are willing to build, the corresponding ESP AP, then this is by all means a likely good path to follow.
I, on the other hand, am a software guy who dislikes soldering irons, so I want to make it work directly from my HA box, without any additional hardware (especially of the DIY kind).
I now have a working HA development environment, but I have not succeeded in getting Bluetooth access in it… It’s a bit of a journey, apparently – at this point I’m guessing it will actually take more work just to get to the point where I can write a custom Bluetooth integration than to write the actual Python code to publish the image.
I’m actually getting some good progress here. My development environment is up and running, and I’ve written a ConfigFlow that detects the ESL tag based on its service. So my guess was correct that this should work.
I’m currently trying to figure out how to detect the display size and capabilities from the advertisement data.
I am interested in buying one of these to use it with home assistant without needing to solder anything. Did you get it working?
I can code in python but never done anything in home assistant.
I was thinking that a bad but functional solution could be to use the code in javascript from the page and send the data to an esp32 that would handle the BLE connection. Home assistant → Javascript code → ESP32 → Display
I know this is a terrible solution, but I was overwhelmed with home assistant development documentation
I am nowhere done with a custom component, but I am making progress, just slowly. (Mostly hindered by my lack of spare time.)
And yeah, the Home Assistant development is overwhelming. I’ve spent like 99% of the time in getting like a single line of actual code in the right place in a working development environment.
Also, if you want to go the ESP route, do check out https://openepaperlink.de/. That’s a complete solution, including a HASS custom component, for doing this, and it should work fine with the Gicisky.
I’m doing this just because I’m a software guy who’s allergic to using more hardware than strictly necessary.
I am nowhere done with a custom component, but I am making progress, just slowly. (Mostly hindered by my lack of spare time.)
I know spare time is not easy to get. I appreciate your work! (Keep the thread updated if you achieve anything.)
And yeah, the Home Assistant development is overwhelming. I’ve spent like 99% of the time in getting like a single line of actual code in the right place in a working development environment.
I am not the only one thinking it’s complicated.
Also, if you want to go the ESP route, do check out https://openepaperlink.de/ . That’s a complete solution, including a HASS custom component, for doing this, and it should work fine with the Gicisky.
I saw that, but if I understood correctly, you must flash some new firmware to the label. This means opening the device and soldering (In my case that would end in a destroyed label in 90% of cases).
That’s why I am thinking of the ESP solution. It would be a Frankenstein on the software side, but there’s no need to open the label.
I’m actually making quite some progress. At the current state, the custom integration will detect the ESL using BLE discovery, figure out display size, other model aspects (number of colors etc), battery level (I think…), and firmware/hardware version.
I can start talking to the device and have initiated an image transfer. I understand how to transfer the actual data, but will need to stop working for tonight.
Once that is done, I have a proof of concept that I can send images from HA to the ESL.
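I don’t have the exact command bytes written up here (those come from atc1441’s reverse engineering), but the data phase essentially boils down to slicing the raw image into write-sized packets. A sketch of that chunking; the 4-byte little-endian part-index prefix and the 240-byte packet size are assumptions for illustration, not the confirmed framing:

```python
def chunk_image(data: bytes, packet_size: int = 240) -> list[bytes]:
    """Split raw image bytes into BLE write payloads.

    Each packet gets a 4-byte little-endian part index prefix
    (assumed framing for illustration; the real framing is defined
    by atc1441's reverse-engineered protocol).
    """
    body = packet_size - 4  # room left after the index prefix
    return [
        (i // body).to_bytes(4, "little") + data[i : i + body]
        for i in range(0, len(data), body)
    ]
```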
While in a way that is the tricky part, the component is not done after that. I still need to convert an input image to the raw format used by the ESL, and from my reverse engineering of the app I see that basically every model has its own way of encoding the image data.
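For reference, most of these formats look like variations on one bit per pixel per color plane (one plane for black, one for red). A generic sketch of packing one such plane, assuming MSB-first bit order and rows padded to whole bytes; the actual bit order, padding, and inversion vary per model:

```python
def pack_plane(pixels: list[int], width: int) -> bytes:
    """Pack row-major 0/1 pixel values into one e-paper color plane.

    Assumes MSB-first bit order and each row padded to a whole byte;
    real Gicisky models may differ (e.g. inverted bits, column order).
    """
    out = bytearray()
    row_bytes = (width + 7) // 8  # bytes per padded row
    for r in range(0, len(pixels), width):
        row = pixels[r : r + width]
        buf = bytearray(row_bytes)
        for x, p in enumerate(row):
            if p:
                buf[x // 8] |= 0x80 >> (x % 8)  # set bit, MSB first
        out += buf
    return bytes(out)
```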
And finally, to be practically useful, there needs to be some kind of way to generate an image to send, like “Write this string with this font” or whatever. Possibly/hopefully, there are already other HA integrations that can do that. I think there is some functionality in the OpenEPaperLink HA integration that can be re-used, for instance. So this is the part I’m least concerned about. In the end, if the component can send over a PNG image, I consider it done for release 1.0. The PNG can, after all, be generated in other ways, so making it from inside the component is just convenience.
I don’t think that is necessary. OpenEPaperLink has gotten support for Gicisky integrated. You need an AP with BLE, but if you have that, it will talk to the Gicisky in just about the same way as the javascript demo. At least that is my understanding.
That is really good news! Thanks for keeping this up!
During Black Friday I checked the availability of BLE ePaper devices, but somehow it seems to be getting harder to order them. The prices for the 2.9" version now exceed 25 EUR, and I can’t order the 7.5" or 10.2" versions at all.
Searching for Gicisky on AliExpress, I am not being shown any results.
I even wrote to Gicisky (seller) and they just told me they don’t ship to my location.
On their website they mention new Wi-Fi based devices, but it is hard to find details on the BLE versions.
You seem to be working with various sizes. Where did you get them?
I volunteer if you want to do any PoC tests.
(currently HA on Raspi 4 with external BT device; MiniPC with HA on Proxmox in preparation). But right now I only have one 2.9" red/black variant.
There is a brand “Picksmart” being thrown around in the code base etc, and they have a website at http://www.picksmart.cn/, but it’s only in Chinese and I can’t make heads or tails of it.
After reverse-engineering their app, it seems easy to support multiple sizes. Most are trivial, a few are a bit more tricky. My plan is to add the trivial ones up-front, and then add the tricky ones if someone enrolls as a volunteer test subject with such a device.