Custom component - avoid duplicate web request

Hello,

I’m developing a custom component, and it is starting to work. But I have a few questions for optimisation purposes.

My custom component has multiple binary_sensors, but with a single web request I can retrieve all the data I’m interested in.

How can I avoid having each binary_sensor run the same request, get the same result, and parse it?
I’d like to make the request once and then update all the binary_sensors.

Is it possible?

One way to achieve this is to introduce a central manager component that handles the connection to the web service, fetches and parses the data on a regular basis, and stores it in memory in a suitable format. You pass this manager into each of your binary sensors, and when the manager updates its data you can use signals to inform each binary sensor to update its own state from the manager’s data.
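A minimal sketch of that pattern might look like this (the DataManager class, the URL and the signal name are hypothetical, not from any existing integration):

    from datetime import timedelta

    from homeassistant.helpers.aiohttp_client import async_get_clientsession
    from homeassistant.helpers.dispatcher import async_dispatcher_send
    from homeassistant.helpers.event import async_track_time_interval

    SIGNAL_UPDATE = "my_component_data_updated"  # hypothetical signal name


    class DataManager:
        """Runs the single web request and shares the parsed result."""

        def __init__(self, hass, url, scan_interval=timedelta(minutes=5)):
            self._hass = hass
            self._url = url  # hypothetical endpoint
            self._scan_interval = scan_interval
            self.data = {}  # parsed result, read by every binary_sensor

        async def async_init(self):
            """Fetch once, then keep polling at regular intervals."""
            await self.async_update()

            async def update(event_time):
                await self.async_update()

            async_track_time_interval(self._hass, update, self._scan_interval)

        async def async_update(self):
            """Perform the one web request and notify all sensors."""
            session = async_get_clientsession(self._hass)
            async with session.get(self._url) as response:
                self.data = await response.json()
            # Tell every listening entity that fresh data is available.
            async_dispatcher_send(self._hass, SIGNAL_UPDATE)

Each binary sensor then holds a reference to the manager, reads its state from manager.data, and subscribes to SIGNAL_UPDATE instead of making its own request.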

That’s what I had in mind, but technically how can it be done?
What is a manager from a Home Assistant perspective? Is it a component type?

There is nothing special in HA as far as I know.
If you like you could have a look at an integration I have recently built that follows the described pattern: geonetnz_volcano. The actual access to the web resource is encapsulated in a separate library, but maybe this gives you an idea.

Is your manager directly integrated into __init__.py?
I had a look, but I don’t see how you manage the scheduled update. How often is the data retrieved? And where is that set/defined?

Thanks

Yes, GeonetnzVolcanoFeedEntityManager is instantiated in the async_setup_entry function, and then initialised by calling manager.async_init().
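Condensed, that setup looks roughly like this (the DOMAIN handling and the constructor arguments are simplified here, not the integration’s exact code):

    async def async_setup_entry(hass, entry):
        """Set up the integration from a config entry (simplified)."""
        # GeonetnzVolcanoFeedEntityManager is this integration's own class.
        manager = GeonetnzVolcanoFeedEntityManager(hass, entry)
        hass.data.setdefault(DOMAIN, {})[entry.entry_id] = manager
        await manager.async_init()
        return True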

As part of the initialisation I start tracking time intervals:

        async def update(event_time):
            """Update."""
            await self.async_update()

        # Trigger updates at regular intervals.
        # (async_track_time_interval comes from homeassistant.helpers.event.)
        self._track_time_remove_callback = async_track_time_interval(
            self._hass, update, self._scan_interval
        )

By default, the async_update function is called every 5 minutes, and it in turn calls the update function of the GeonetnzVolcanoFeedManager from the external library. That manager was initialised with three callbacks (_generate_entity, _update_entity and _remove_entity); with each update it compares the latest data from the external web feed with the previously fetched data and makes the corresponding calls to those three callbacks.
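To illustrate the shape of that wiring, here is a simplified sketch; the external library’s real constructor takes more arguments (session, coordinates, filter radius), and the signal names here are made up:

    from homeassistant.helpers.dispatcher import async_dispatcher_send


    class FeedEntityManager:
        """Simplified sketch of the three-callback wiring."""

        def __init__(self, hass, websession):
            self._hass = hass
            # The external library's feed manager is handed the callbacks
            # (constructor arguments simplified for this sketch).
            self._feed_manager = GeonetnzVolcanoFeedManager(
                websession,
                self._generate_entity,  # feed entry is new
                self._update_entity,    # feed entry changed
                self._remove_entity,    # feed entry disappeared
            )

        async def _generate_entity(self, external_id):
            """Signal the sensor platform to create a new entity."""
            async_dispatcher_send(self._hass, "mydomain_new_entity", external_id)

        async def _update_entity(self, external_id):
            """Signal the matching entity to refresh from the latest data."""
            async_dispatcher_send(self._hass, f"mydomain_update_{external_id}")

        async def _remove_entity(self, external_id):
            """Signal the matching entity that its feed entry is gone."""
            async_dispatcher_send(self._hass, f"mydomain_remove_{external_id}")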

Now, for example, when the external library tells me that a new entity should be created, I use async_dispatcher_send to signal the sensor platform of this integration: in its async_setup_entry function you can see how I connect to the dispatcher signals (async_dispatcher_connect), and in the internal async_add_sensor function I generate a new instance of a GeonetnzVolcanoSensor.
Each new sensor then also starts listening to signals (see async_added_to_hass) for updates from the central manager.
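Put together, the platform side looks something like this (the signal names, the DOMAIN constant and the is_on logic are placeholders, with a generic FeedBinarySensor standing in for GeonetnzVolcanoSensor):

    from homeassistant.components.binary_sensor import BinarySensorEntity
    from homeassistant.core import callback
    from homeassistant.helpers.dispatcher import async_dispatcher_connect

    DOMAIN = "mydomain"  # placeholder


    async def async_setup_entry(hass, entry, async_add_entities):
        """Create sensors whenever the manager signals a new feed entry."""
        manager = hass.data[DOMAIN][entry.entry_id]

        @callback
        def async_add_sensor(external_id):
            """Instantiate a sensor for a newly discovered entry."""
            async_add_entities([FeedBinarySensor(manager, external_id)])

        async_dispatcher_connect(hass, "mydomain_new_entity", async_add_sensor)


    class FeedBinarySensor(BinarySensorEntity):
        """Reads its state from the shared manager; never fetches itself."""

        def __init__(self, manager, external_id):
            self._manager = manager
            self._external_id = external_id

        @property
        def should_poll(self):
            """Updates are pushed via dispatcher signals, so no polling."""
            return False

        async def async_added_to_hass(self):
            """Start listening to update signals from the central manager."""
            self.async_on_remove(
                async_dispatcher_connect(
                    self.hass,
                    f"mydomain_update_{self._external_id}",
                    self._update_callback,
                )
            )

        @callback
        def _update_callback(self):
            """Re-read the manager's data and push the new state to HA."""
            self.async_write_ha_state()

        @property
        def is_on(self):
            # Placeholder: derive the state from the manager's parsed data.
            return bool(self._manager.data.get(self._external_id))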

Thanks for all the advice!

But it’s too complicated for me (I’m a dev noob).

Still, I think I managed to get it working, using some kind of trick.
I use one binary_sensor to do the request and retrieve all the data, and then I store (or update) the result in hass.data[MYDOMAIN][“results”].

Then, in my other sensors, I parse hass.data[MYDOMAIN][“results”] directly instead of making a web request.
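As a sketch, that trick could look like this (the DOMAIN constant, the URL and the result keys are placeholders):

    from datetime import timedelta

    from homeassistant.components.binary_sensor import BinarySensorEntity
    from homeassistant.helpers.aiohttp_client import async_get_clientsession

    DOMAIN = "mydomain"  # placeholder
    SCAN_INTERVAL = timedelta(minutes=5)  # polling interval used by HA


    class FetcherBinarySensor(BinarySensorEntity):
        """The one sensor that actually performs the web request."""

        def __init__(self, hass, url):
            self._hass = hass
            self._url = url
            hass.data.setdefault(DOMAIN, {})["results"] = {}

        async def async_update(self):
            """Fetch once and share the parsed payload via hass.data."""
            session = async_get_clientsession(self._hass)
            async with session.get(self._url) as response:
                self._hass.data[DOMAIN]["results"] = await response.json()

        @property
        def is_on(self):
            return bool(self._hass.data[DOMAIN]["results"])


    class ReaderBinarySensor(BinarySensorEntity):
        """The other sensors only parse the shared results."""

        def __init__(self, hass, key):
            self._hass = hass
            self._key = key

        @property
        def is_on(self):
            # Read the value stored by the fetcher sensor; no web request.
            return bool(self._hass.data[DOMAIN]["results"].get(self._key))

One caveat with this approach: Home Assistant doesn’t guarantee the order in which entities are polled, so the reader sensors may briefly show values from the previous fetch until the fetcher sensor has updated again.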