# [help] Circuit to choose between two power supplies

I’m a power systems guy who usually doesn’t like electronic circuits, but right now only they can save me, and I’m pretty sure someone else has had to solve this puzzle while putting their own system to work.

I’m finishing my new place, where I’ll be using a lot (to me) of 12V and 24V power supplies for lighting. I’m using some Zigbee drivers (Gledopto GL-C-202P), and to make them usable through Home Assistant I have to keep their power supplies on 24/7. That brings me two problems, because 95% of the time the PSU will be drawing power just to run its own circuitry and the driver even though the lights are off: higher bills and a shorter lifetime for the PSUs.
I’ve been thinking about it for quite some time and have yet to figure out how to improve it. For the last few days my idea has been to use some kind of circuit that switches between a bigger and a smaller DC source based on the current drawn by the driver. If the current is under some threshold, only the driver and maybe a heavily dimmed strip are being powered, which an AC-DC power module of 5W or less (Hi-Link HLK-5M12 or similar) could handle. But if more power is being drawn, it switches to a bigger supply with enough power to run the lights at full brightness.
Is that the best way to do this? Is there some kind of circuitry I could use? Or is this just a really dumb idea?

With your idea, every time you swap PSUs it would also reboot your controller. You’d also need a current sensor and a microcontroller to keep an eye on that sensor and trigger your relays.
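For what it’s worth, the sensor-plus-relay logic above could be sketched roughly like this. This is a minimal illustration, not a working design: `read_current_amps()` and the threshold values are stand-ins for whatever current sensor and load you actually have. The one real design point it shows is hysteresis (two thresholds), so the relay doesn’t chatter when the load sits near a single threshold.

```python
# Hypothetical thresholds; tune to your driver's idle draw and strip load.
HIGH_THRESHOLD_A = 0.4   # above this, switch to the big PSU
LOW_THRESHOLD_A = 0.2    # below this, switch back to the small PSU

def choose_psu(current_a, on_big_psu):
    """Return True for the big PSU, False for the small one.

    Two separate thresholds (hysteresis) keep the relay from
    toggling rapidly when the load hovers near a single cutoff.
    """
    if not on_big_psu and current_a > HIGH_THRESHOLD_A:
        return True
    if on_big_psu and current_a < LOW_THRESHOLD_A:
        return False
    return on_big_psu  # inside the hysteresis band: keep current PSU
```

In a real build this would run in a loop on the microcontroller, with `choose_psu()` fed by the sensor and its result driving the relay, and with some settling delay so the controller isn’t power-cycled on every transition.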

It would be better to try to power the Zigbee controller circuit separately from the LED power supply. You’d need to mod that circuit, if possible.

But the first thing to do is measure how much the bigger PSU is really consuming when the LEDs are off.

Maybe you’re proposing multiple-sized PSUs because of the efficiency drop at low output? My understanding is that US and EU devices at idle should draw under 1 W, so even a lousy 50% efficiency at idle only wastes ~4 kWh per year, which is hardly worth saving. Max load wastes far more power, so it’s definitely worth looking for PSUs with >90% efficiency at rated output. Engineering a complex switching mechanism and introducing additional points of failure to save a dollar a year doesn’t seem worthwhile, unless I’m missing something?
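The ~4 kWh/year figure follows from a quick back-of-the-envelope, assuming roughly 1 W drawn at the wall at idle with half of it wasted:

```python
# Idle-waste estimate (assumed numbers, not measurements).
idle_draw_w = 1.0          # assumed wall draw at idle
efficiency = 0.5           # pessimistic idle efficiency: half is wasted
hours_per_year = 24 * 365

wasted_w = idle_draw_w * (1 - efficiency)
wasted_kwh_per_year = wasted_w * hours_per_year / 1000
print(round(wasted_kwh_per_year, 1))  # 4.4
```

At typical residential rates that’s on the order of a dollar a year, which is the point: measure first, because the savings may not justify any added circuitry.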

It might improve efficiency to eliminate the Zigbee driver as a power component and instead use a dimmable driver with a 0-10V control input; this in turn requires a smart device to send the 0-10V signal, but since it’s not part of the power path it won’t steal another 5-20% of the rated watts. I know there are Z-Wave and Zigbee devices that should be pretty low power compared to WiFi equivalents (e.g., Shelly).
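A rough comparison of the two topologies, with illustrative numbers I’ve assumed (the 10% in-path loss is the middle of the 5-20% range mentioned above; the LED load and controller draw are made up):

```python
# In-path smart driver vs. out-of-path 0-10V controller (assumed numbers).
led_load_w = 60.0          # assumed strip power when fully on
in_path_loss_frac = 0.10   # assumed loss of a smart driver in the power path
control_device_w = 0.5     # assumed draw of a 0-10V smart controller

in_path_loss_w = led_load_w * in_path_loss_frac   # lost whenever lights are on
out_of_path_w = control_device_w                  # small, constant overhead
print(in_path_loss_w, out_of_path_w)  # 6.0 0.5
```

The takeaway: an in-path driver’s loss scales with the lighting load, while a control-only device adds a small fixed overhead regardless of brightness.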

So, that’s just a dumb idea. Thank you for your response.

No. If you measure that your larger PSUs are consuming a lot (they might) when powering only the controller circuit, it’s a good idea to optimize the power distribution. It’s not necessarily difficult.

I’m far from an electrical expert, but (PSU inefficiencies aside) that’s not how wattage works.

A 100W PSU will only draw up to 100W when the connected load requires that much. If it’s on standby (LEDs off) and only powering the driver, it’ll draw whatever wattage the driver requires - 5W in your example.
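Put as arithmetic, with the 5W standby figure from the example above and an assumed electricity price:

```python
# A PSU's rating is a ceiling, not a constant draw: input power tracks
# the load (plus conversion losses). Standby cost of a 5 W load:
standby_load_w = 5.0       # driver-only draw from the example
price_per_kwh = 0.20       # assumed electricity price

standby_kwh_per_year = standby_load_w * 24 * 365 / 1000
print(standby_kwh_per_year)  # 43.8
print(round(standby_kwh_per_year * price_per_kwh, 2))  # 8.76
```

So the PSU’s rated wattage barely matters at standby; what matters is the actual standby load and the supply’s efficiency at that low load.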

It’s not a dumb idea. Uninformed or misguided, maybe, but not dumb. We’re all here to learn, myself included.

That would be an ideal transformer, with 100% efficiency. But real power supplies can have really bad efficiency when the load is low.
Anyway, it’s so easy to measure that guessing here is a waste of time.

I would just do a whole-home UPS system at that point instead of per-device supply switching; not only does it provide backup power for the whole home, it’s also a more effective setup.