E-paper displays are not great for interaction, since a full refresh takes ~4 seconds. So they are useful for displaying information that does not change often. The most common example is, of course, temperature. You can update it once every 10 minutes, and your e-paper display will not blink often enough to give you a seizure.
If you need interaction, it is better to use a tablet, whose display can update normally in response to your input.
Regarding my old question: the solution is pretty simple. The ESP board needs some time to connect to Home Assistant. If it stays awake for 20-30 seconds, it will have time to connect to Home Assistant, and you don't need to use anything except platform: homeassistant. No MQTT needed.
Also, it is a good practice to use this setting:
wifi:
  fast_connect: true
It makes the ESP connect to the specified network directly, without scanning for all available networks first.
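For context, a minimal ESPHome sketch combining fast_connect with a platform: homeassistant sensor might look like this (the SSID, password, and entity_id are placeholders; adapt them to your setup):

```yaml
# Sketch only: credentials and entity_id are placeholders.
wifi:
  ssid: "MyNetwork"
  password: "MyPassword"
  fast_connect: true

# Native Home Assistant API connection, so no MQTT is needed.
api:

sensor:
  - platform: homeassistant
    id: outside_temp
    entity_id: sensor.outside_temperature
```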
Some things can be done interactively on e-paper quite well: light switches and other things where the state can be set with a single press (triggering a scene, for example).
Displays that support partial refresh are good for this…
I'd hate to see data entry on one, though.
Sometimes it does not work. Rather than guessing the timing, the best solution is to disable deep sleep right after the device wakes up, and re-enable it once the last sensor has loaded and the display has been updated.
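That pattern can be sketched in ESPHome roughly like this; the ids, entity name, and sleep times are made up for illustration, and a real config would also need the board and display sections:

```yaml
# Sketch only: ids, entity_id, and durations are placeholders.
deep_sleep:
  id: sleeper
  sleep_duration: 10min

esphome:
  name: epaper_display
  on_boot:
    then:
      # Stay awake until the data has actually arrived.
      - deep_sleep.prevent: sleeper

sensor:
  - platform: homeassistant
    id: outside_temp
    entity_id: sensor.outside_temperature
    on_value:
      then:
        - component.update: epaper   # redraw the screen
        - deep_sleep.enter: sleeper  # then go back to sleep
```

With several sensors you would trigger deep_sleep.enter only after the last one has reported, e.g. from a script that checks all of them.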
This page only has a schematic, no code.
I didn't really understand in which direction I should read this. It looks like the "newer" and "older" links at the bottom of the page lead to different articles. In them I found information about MQTT and writing ESP code from scratch, but nothing about ESPHome and integrating platform: homeassistant into it.
Home Automation is all about taking action based on one or more events and being able to remotely control a device. Is Home Assistant capturing any data that would be useful to display in a meaningful way?
estimated commute time in the morning, to see if I need to hurry up (was useful before corona … )
see in the evening what the lawn robot did (and that it isn't stuck somewhere)
in the evening it just shows some pictures, for peace of mind
Thanks for sharing this. I have the new LilyGo 4.7″ screens and I am struggling to reuse your code with them. It looks like the routines you are using for BMP display do not work with larger BMPs and such a screen. I am facing a very simple issue that seemingly nobody has solved yet, except you for a 7.5-inch display and 2-color BMPs: how do you simply download a BMP file from a given website and display it on an e-paper screen…
Another option I have been working on is grabbing an image from any website to display on an e-ink screen. At the moment I am using the Inkplate 6, but the concept should work with any e-ink screen. It's not really HA-specific, but as it has been mentioned in a few threads, I have put a tutorial online at:
The Inkplate 6 also supports HA in the latest update. It's a lovely screen, has an ESP32 built in, and deep sleeps. A new 10-inch one is available via their crowdfunding page and ships in May. Mine has been running on a battery for 14 days now, updating every 15 minutes.
Thanks for sharing @digitalurban, this is really useful. I have now figured out how to use my LilyGo 4.7-inch screen, and I am essentially doing the same thing you are doing here, although a bit more complicated: I grab the images I want to display here and there with a Raspberry Pi, transform those images into 16-level grayscale on the Pi, and then display them on the LilyGo. The LilyGo library doesn't allow direct display of images (neither JPG nor BMP), so I had to create a file format to feed the ESP32 the right information. But it works.
I will look at your webpage grab script as it may become useful if I want to display something else!
Thanks again for your contribution…
how do you simply download a BMP file from a given website and display it on an e-paper screen…
I guess the downloading part is not the issue, but you're stuck somewhere in the displaying part?
@nickrout thank you for the kind words! @goosst the downloading part indeed wasn't the issue. The 4.7″ EPD uses a new kind of library, different from the Waveshare screens, where the image is stored as 4-bit gray levels with 2 pixels packed per byte. That gave me a hard time decoding the BMP into the right format, and since I needed a few image manipulations before displaying anyway, I figured I'd be better off doing them on my server (a Raspberry Pi) with a Python script and then feeding the transformed file directly to the EPD, which is what I ended up doing! But your code was a great starting point and inspiration.
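For anyone facing the same format, the packing described above (4-bit gray, two pixels per byte) can be sketched in Python like this. The function name is made up, and the nibble order (high nibble first) is an assumption; check what your EPD library expects:

```python
def pack_4bit(pixels):
    """Pack 8-bit grayscale values (0-255) into 4-bit nibbles,
    two pixels per byte, first pixel in the high nibble.

    Nibble order is an assumption; some libraries want low nibble first.
    """
    # Quantize 0-255 down to 16 levels (0-15).
    nibbles = [p >> 4 for p in pixels]
    if len(nibbles) % 2:
        nibbles.append(0)  # pad odd-length data
    return bytes((hi << 4) | lo
                 for hi, lo in zip(nibbles[0::2], nibbles[1::2]))

# e.g. pack_4bit([255, 0]) == b'\xf0'
```

In practice you would get the pixel list from Pillow (img.convert("L").getdata()) and write the packed bytes to the file you feed the ESP32.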
I followed a different approach. I have a Docker container running that turns any Lovelace dashboard into a PNG. The PNG is then loaded onto the ESP/e-ink display. This way, anyone can easily create any Lovelace dashboard and have it appear on the e-ink display.