Hi.
I have a single 600 W, 24 V power supply.
I have two LED strips connected to it in parallel.
LED strip A (living room) draws 75 W when fully on while strip B (iota) is fully off:
Strip B fully on with A fully off draws 53 W:
When I switch both on, the voltage (and therefore the brightness of both strips) drops significantly, and the total is only 100 W:
Is this behavior normal? I tried two different power supplies, and the result is the same. It's annoying that someone in the bedroom setting the brightness of their strip influences the living room strips. The power supply should have plenty of headroom for these loads.
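For a sanity check, here is a quick back-of-envelope model I put together (a minimal sketch: the 0.6 Ω wiring resistance is an invented guess, not a measurement, and treating the strips as plain resistors is a simplification). It shows that a fraction of an ohm of shared wiring or contact resistance is enough to turn the expected 75 + 53 = 128 W into roughly the 100 W I actually see:

```python
# Back-of-envelope check: two LED strips on one 24 V supply sharing a
# common feed. Each strip is approximated as a fixed resistor sized from
# its standalone draw (a simplification; real LED strips are not purely
# resistive, which tends to make the sag effect even stronger).

V_SUPPLY = 24.0   # supply voltage (V)
P_A_ALONE = 75.0  # measured: strip A alone (W)
P_B_ALONE = 53.0  # measured: strip B alone (W)
R_WIRE = 0.6      # ASSUMED shared wiring/contact resistance (ohms), a guess

r_a = V_SUPPLY**2 / P_A_ALONE    # R = V^2 / P
r_b = V_SUPPLY**2 / P_B_ALONE
r_par = r_a * r_b / (r_a + r_b)  # both strips on at once, in parallel

print(f"ideal combined draw: {V_SUPPLY**2 / r_par:.0f} W")  # 128 W

# With R_WIRE in series, the voltage actually reaching the strips sags:
v_load = V_SUPPLY * r_par / (r_par + R_WIRE)
print(f"voltage at strips:   {v_load:.1f} V")               # about 21 V
print(f"combined strip draw: {v_load**2 / r_par:.0f} W")    # about 100 W
```

If something like this is happening, the coupling would come from voltage sag across shared wiring (or the supply's own regulation under load), not from the supply running out of watts.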
To provide more detail on my setup:
- the living room ramp (A) has 4 white strips (in parallel)
- the iota ramp (B) has 3 white strips connected (in parallel)
- they are both controlled by Shelly RGBW2 controllers (but AFAIK the behavior is the same even without these controllers in the circuit). The Shelly RGBW2 should have plenty of throughput for my use case, that is 45 W/channel, 280 W total.
- all the strips have the same rating, which should be 10 W/m, but what I observe is that the longer the strip, the less power per metre it draws (to my regret, as I believe it makes them less bright?). Specifically, a single channel in ramp B (iota) should draw about 35 W (because it is 3.5 m long), but it draws either 15 W or 22 W (another inexplicable behavior; see the screenshots below and the sketch that follows them).
- I observe the same on a single controller as well (which makes sense given the previous point), i.e. having just one channel on makes the strip brighter than turning on three channels, though the difference is slightly smaller:
single channel: 15 W:
single channel, sometimes showing 22 W:
three channels showing 40 W (instead of 3 × 15 ≈ 45 W, or 60 W)
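To illustrate what I suspect about the length effect, here is a toy model (all numbers in it are invented for illustration: the 10 cm segment pitch, the 19 V combined LED forward voltage, and the 1.5 Ω/m trace resistance are assumptions, not datasheet values for my strips). Because each segment is a few LEDs in series plus a small ballast resistor, its current falls off steeply once the local voltage sags, so a long end-fed strip can draw far less than its rated watts per metre:

```python
# Toy model of a 24 V constant-voltage LED strip fed from one end.
# Segments far from the feed see a lower voltage because the copper
# traces have nonzero resistance, so they draw less current and are
# dimmer. ALL NUMBERS ARE ASSUMPTIONS for illustration, not datasheet
# values for my actual strips.

V_FEED = 24.0         # volts at the feed end
LENGTH_M = 3.5        # strip length in metres (one ramp B channel)
SEG_PER_M = 10        # assumed: one cuttable segment per 10 cm
P_SEG_RATED = 1.0     # assumed: 1 W per segment at 24 V (=> 10 W/m rating)
V_FORWARD = 19.0      # assumed: combined LED forward voltage per segment
R_TRACE_PER_M = 1.5   # assumed: trace resistance (+ and - combined), ohms/m

n = int(LENGTH_M * SEG_PER_M)
i_rated = P_SEG_RATED / V_FEED              # segment current at a full 24 V
r_ballast = (V_FEED - V_FORWARD) / i_rated  # ballast sized to hit the rating
r_link = R_TRACE_PER_M / SEG_PER_M          # trace resistance between segments

def seg_current(v):
    """Segment current at local voltage v; LEDs only conduct above V_FORWARD."""
    return max(0.0, (v - V_FORWARD) / r_ballast)

# Damped fixed-point iteration: currents -> trace drops -> voltages -> repeat.
v = [V_FEED] * n
for _ in range(500):
    i_seg = [seg_current(x) for x in v]
    # current in the k-th trace section = sum of all segment currents from k on
    suffix = list(i_seg)
    for k in range(n - 2, -1, -1):
        suffix[k] += suffix[k + 1]
    volts, new_v = V_FEED, []
    for k in range(n):
        volts -= suffix[k] * r_link
        new_v.append(volts)
    v = [(a + b) / 2 for a, b in zip(v, new_v)]  # damping for stability

i_seg = [seg_current(x) for x in v]
print(f"rated draw:         {n * P_SEG_RATED:.0f} W")
print(f"draw at the supply: {V_FEED * sum(i_seg):.1f} W")
print(f"far-end voltage:    {v[-1]:.1f} V")
```

With these particular guesses the modelled draw comes out well below the 35 W rating. The same mechanism would also explain why three channels together are dimmer than one: three times the current flows through whatever wiring is shared before the channels split, so every channel sees a lower voltage.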
Is any of this normal? Is this simply how power supplies and LED strips behave, or can I do something about it? How do e.g. bigger installations or whole-home setups handle this? Thanks!