Sometimes it does not work. Rather than guessing the timing, the best solution would be to disable deep sleep right after the device wakes up and re-enable it once the last sensor has loaded and the display has been updated.
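For illustration, here is a rough sketch of that ordering in MicroPython (with ESPHome you would use the deep_sleep.prevent and deep_sleep.enter actions instead). read_sensors() and update_display() are hypothetical placeholders for whatever your project does; the point is simply that deep sleep is only entered after the display refresh has finished, not on a guessed timer.

```python
# MicroPython-style sketch, not the ESPHome configuration from the thread.
# read_sensors() and update_display() are hypothetical placeholders.
import machine

SLEEP_MS = 15 * 60 * 1000  # wake again in 15 minutes


def read_sensors():
    # hypothetical: fetch values from Home Assistant or local sensors
    return {"temperature": 21.3, "commute_minutes": 25}


def update_display(values):
    # hypothetical: render and push the frame to the e-paper driver
    print("refreshing display with", values)


def main():
    values = read_sensors()      # while this runs, the device stays awake
    update_display(values)       # the refresh finishes before we ever sleep
    machine.deepsleep(SLEEP_MS)  # only now re-enter deep sleep


main()
```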
This page only has a schematic, but no code.
I didn’t really understand in which direction I should read this. It looks like the “newer” and “older” links at the bottom of the page lead to different articles. In them I found material about MQTT and writing ESP code from scratch, but nothing about ESPHome or integrating the “platform: homeassistant” sensors into it.
Home automation is all about taking action based on one or more events and being able to remotely control a device. Is Home Assistant capturing any data that would be useful to display in a meaningful way?
estimated commute time in the morning, to see if I need to hurry up (was useful before corona…)
in the evening, see what the lawn robot did (and that it isn’t stuck somewhere)
in the evening it just shows some pictures, for peace of mind
Thanks for sharing this. I have the new LilyGo 4.7" screens and I am struggling to reuse your code with them. It looks like the routines you are using for BMP display do not work with larger BMPs on such a screen. I am faced with a very simple issue that seemingly nobody has solved yet (except you, for a 7.5-inch display and 2-colour BMPs): how do you simply download a BMP file from a given website and display it on an e-paper screen…
Another option I have been working on is grabbing an image from any website to display on an e-ink screen. At the moment I am using the Inkplate 6, but the concept should work with any e-ink screen. It’s not really HA-specific, but as it’s been mentioned in a few threads I have put a tutorial online at:
The Inkplate 6 also supports HA in the latest update. It’s a lovely screen, has an ESP32 built in and deep-sleeps; a new 10-inch one is available via their crowdfunding page and ships in May. Mine has been running on a battery for 14 days now, updating every 15 minutes.
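Not the script from that tutorial, but the server-side half can be sketched in a few lines of Python: fetch an image from a URL, scale it to the panel, and dither it down to 1-bit so a two-colour e-paper screen can show it. The URL is a placeholder and the 800x600 size matches the Inkplate 6; adjust both for your panel.

```python
# Sketch of the server-side step: fetch, resize, dither to 1-bit.
# Requires: pip install requests pillow
from io import BytesIO

import requests
from PIL import Image

IMAGE_URL = "https://example.com/some-image.png"  # placeholder URL
PANEL_SIZE = (800, 600)                           # Inkplate 6 resolution

resp = requests.get(IMAGE_URL, timeout=10)
resp.raise_for_status()

img = Image.open(BytesIO(resp.content)).convert("L").resize(PANEL_SIZE)
bw = img.convert("1")  # Pillow applies Floyd-Steinberg dithering by default
bw.save("panel.bmp")   # serve this file for the ESP32 to download
```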
Thanks for sharing, @digitalurban, this is really useful. I have now figured out how to use my LilyGo 4.7-inch screen and I am essentially doing the same thing you are doing here, although a bit more complicated: I grab the images I want to display here and there with a RbPi, transform them into 16-level greyscale on the Pi, and then display them on the LilyGo. The LilyGo library doesn’t allow direct display of images (neither JPG nor BMP), so I had to create a file format to feed the ESP32 the right information. But it works.
I will look at your webpage grab script as it may become useful if I want to display something else!
Thanks again for your contribution…
how do you simply download from a given website a BMP file and display it on an e-paper screen…
I guess the downloading part is not the issue, but you’re stuck somewhere in the displaying part?
@nickrout thank you for the kind words! @goosst the downloading part indeed wasn’t the issue. The 4.7" EPD uses a new kind of library, different from the Waveshare screens, where the image is stored as 4-bit grey levels with two pixels packed per byte. That gave me a hard time decoding the BMP into the right format, and since I needed a few image manipulations before displaying anyway, I figured I’d be better off doing them on my server (RbPi) with a Python script and then feeding the transformed file directly to the EPD, which is what I ended up doing! But your code was a great starting point and inspiration.
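Not the poster’s actual script, but the Pi-side conversion can be sketched with Pillow: reduce the image to 16 grey levels and pack two 4-bit pixels into each byte, which is the raw layout described above. The 960x540 size matches the 4.7" panel; which pixel ends up in the high nibble is an assumption and may need swapping depending on the EPD library.

```python
# Sketch of the Pi-side conversion: 4-bit grey, two pixels per byte.
# Requires: pip install pillow
from PIL import Image

EPD_SIZE = (960, 540)  # LilyGo 4.7" panel resolution


def to_epd_raw(path_in, path_out):
    img = Image.open(path_in).convert("L").resize(EPD_SIZE)
    pixels = img.tobytes()

    packed = bytearray(len(pixels) // 2)
    for i in range(0, len(pixels), 2):
        hi = pixels[i] >> 4      # first pixel -> high nibble (assumption)
        lo = pixels[i + 1] >> 4  # second pixel -> low nibble
        packed[i // 2] = (hi << 4) | lo

    with open(path_out, "wb") as f:
        f.write(packed)


to_epd_raw("input.jpg", "frame.raw")
```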
I followed a different approach. I have a Docker container running that turns any Lovelace dashboard into a PNG, and the PNG is then loaded onto the ESP/e-ink display. That way everyone can easily build whatever Lovelace dashboard they like and have it appear on the e-ink display.
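Not that Docker image, but the “Lovelace to PNG” step can be sketched with Playwright: render the dashboard in a headless browser and save a panel-sized screenshot. The URL and size below are placeholders, and the sketch assumes the dashboard is reachable without logging in (e.g. a kiosk or trusted-network setup); a real setup also has to handle Home Assistant authentication.

```python
# Sketch of rendering a Lovelace dashboard to a PNG with a headless browser.
# Requires: pip install playwright && playwright install chromium
from playwright.sync_api import sync_playwright

DASHBOARD_URL = "http://homeassistant.local:8123/lovelace/eink"  # placeholder
PANEL_SIZE = {"width": 960, "height": 540}                       # placeholder

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page(viewport=PANEL_SIZE)
    page.goto(DASHBOARD_URL, wait_until="networkidle")
    page.screenshot(path="dashboard.png")
    browser.close()
```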
Yes, it’s my Landroid lawnmower; I’m getting the data from the Home Assistant integration. More to come, but I think I really got the picture right on the display for that.