Make the LED dance

Hello all.
I'd like to implement some kind of integration between an RGB light and any kind of sound source.
Since it is just an idea, before trying to reinvent the wheel, I would like to know if someone is aware of an existing SOFTWARE solution.
Since I have a Linux/RPi system, I was thinking of fetching the audio values via SoX and converting them to values for the lamp.
I was thinking of mapping the loudness of the music to the brightness, and somehow connecting the RGB values to the frequency content of the music.
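A minimal sketch of that idea in Python, assuming SoX's `rec` command and `numpy` are available; `set_lamp_brightness` is just a placeholder for whatever API the lamp actually exposes:

```python
# Stream mic/line-in audio with SoX's `rec`, compute RMS loudness per chunk,
# and map it to a 0-255 brightness value.
import subprocess
import numpy as np

RATE = 44100          # samples per second
CHUNK = 2048          # samples per update (~46 ms)

def set_lamp_brightness(value: int) -> None:
    # Placeholder: replace with your lamp's actual API call (MQTT, HTTP, ...).
    print(f"brightness -> {value}")

# `rec` is part of SoX; it records from the default input device and writes
# raw 16-bit signed mono PCM to stdout.
proc = subprocess.Popen(
    ["rec", "-q", "-t", "raw", "-r", str(RATE), "-e", "signed",
     "-b", "16", "-c", "1", "-"],
    stdout=subprocess.PIPE,
)

while True:
    data = proc.stdout.read(CHUNK * 2)        # 2 bytes per 16-bit sample
    if len(data) < CHUNK * 2:
        break
    samples = np.frombuffer(data, dtype=np.int16).astype(np.float32)
    rms = np.sqrt(np.mean(samples ** 2))              # loudness of this chunk
    brightness = int(min(rms / 32768.0 * 4, 1.0) * 255)  # crude gain scaling
    set_lamp_brightness(brightness)
```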

Has anybody thought about something like this?

Nanoleaf devices do this; they appear to use multiple methods depending on the ‘scene’, including frequency-domain mapping, overall amplitude, and beat detection.

Frequency-domain mapping is surprisingly easy; since this is how MP3 files are encoded, you can borrow ideas from that code. You split the sound into frequency ranges, measure the amplitude of each, and convert that into a color or brightness level.
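A rough sketch of that mapping in Python (the band edges and the band-to-color assignment below are arbitrary choices of mine, not anything Nanoleaf publishes):

```python
# FFT one chunk of samples, sum the energy in low/mid/high bands,
# and use the relative energies as R, G, B.
import numpy as np

RATE = 44100

def chunk_to_rgb(samples: np.ndarray) -> tuple:
    """samples: mono float32 array holding one audio chunk."""
    windowed = samples * np.hanning(len(samples))      # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / RATE)

    def band_energy(lo, hi):
        mask = (freqs >= lo) & (freqs < hi)
        return spectrum[mask].sum()

    bass = band_energy(20, 250)        # -> red
    mids = band_energy(250, 2000)      # -> green
    treble = band_energy(2000, 8000)   # -> blue

    total = (bass + mids + treble) or 1.0
    return tuple(int(255 * b / total) for b in (bass, mids, treble))
```

Feed it the same audio chunks as in the loudness sketch and it returns an (R, G, B) tuple you can push to the lamp.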

Overall-amplitude scenes make the most sense when you have multiple LEDs in a linear arrangement.

Thanks @richieframe for the suggestion, but I would like to implement this in software, without buying extra hardware :wink:

That was not my suggestion; I was just pointing out the methods they appear to use, and where one might start looking (an MP3 encoder).

In fact, an ESP32 is fast enough to perform those operations onboard and could then drive the LEDs directly.

Here’s my solution to make all the lights in my house dance…

I use LedFx on a headless RPi.

I split the audio output from my stereo into the RPi.

I have 7 light bulbs flashed with E1.31-capable firmware and 5 LED strips running WLED.

I use Home Assistant to make calls to the LedFx API to start the light shows.
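For reference, this is roughly the kind of call involved. The `/api/scenes` endpoint and payload below reflect my reading of the LedFx REST API and may differ between versions; the host, port, and scene name are made up:

```python
# Sketch: ask a running LedFx instance to activate a saved scene ("lightshow").
import requests

LEDFX_URL = "http://ledfx-pi.local:8888"   # hypothetical host/port

def activate_scene(scene_id: str) -> None:
    # Endpoint and payload are assumptions; check the LedFx docs for your version.
    resp = requests.put(
        f"{LEDFX_URL}/api/scenes",
        json={"id": scene_id, "action": "activate"},
        timeout=5,
    )
    resp.raise_for_status()

activate_scene("party_mode")   # hypothetical scene name
```

In Home Assistant, a call like this can be wrapped in a `rest_command` so an automation can kick off the show.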

They are kick ass lightshows :sunglasses:


Thank you, I will take a look at it :slight_smile: