Hello all.
I’d like to implement some kind of integration between an RGB light and any kind of sound source.
Since it is just an idea, before trying to reinvent the wheel, I would like to know if someone is aware of an existing SOFTWARE solution.
Since I have a Linux/Raspberry Pi system, I was thinking of fetching the levels of the music via sox and converting them into values for the lamp.
I was thinking of mapping the loudness of the music to the brightness, and somehow tying the RGB values to the frequency content of the music.
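A minimal sketch of the loudness-to-brightness half of that idea, in pure Python. It assumes the samples arrive as floats in [-1.0, 1.0] (e.g. decoded from raw sox output); the 440 Hz test tones are just placeholders:

```python
import math

def rms(samples):
    """Root-mean-square amplitude of a block of samples in [-1.0, 1.0]."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def loudness_to_brightness(samples, max_level=255):
    """Map RMS loudness (roughly 0..1) linearly onto a lamp brightness value."""
    return min(max_level, int(rms(samples) * max_level))

# Two 1024-sample demo blocks: a quiet and a loud 440 Hz tone at 44.1 kHz.
quiet = [0.05 * math.sin(2 * math.pi * 440 * n / 44100) for n in range(1024)]
loud = [0.90 * math.sin(2 * math.pi * 440 * n / 44100) for n in range(1024)]
print(loudness_to_brightness(quiet), loudness_to_brightness(loud))
```

In practice you would run this per block of captured audio (say 20–50 ms worth of samples) and push each brightness value to the lamp.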
Nanoleaf devices do this; they appear to use multiple methods depending on the ‘scene’, including frequency-domain mapping, overall amplitude, and beat detection.
Frequency-domain mapping is surprisingly easy; it is essentially what an MP3 encoder does internally, so you could borrow that approach (in practice an FFT, or one filter per band, does the job). You split the sound into frequency bands, measure the amplitude of each band, and convert that into a color or brightness level.
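One way to sketch the band-to-color idea without any DSP library is the Goertzel algorithm, which measures the power at a single frequency. The three band frequencies and the bass→red, mid→green, treble→blue mapping below are my own arbitrary choices, not anything Nanoleaf documents:

```python
import math

def goertzel_power(samples, freq, rate):
    """Power of one frequency bin, computed with the Goertzel algorithm."""
    coeff = 2.0 * math.cos(2.0 * math.pi * freq / rate)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def bands_to_rgb(samples, rate=44100):
    """Map bass/mid/treble band power to R/G/B, normalized so the peak is 255."""
    bands = (100.0, 1000.0, 8000.0)  # representative bass, mid, treble frequencies
    powers = [goertzel_power(samples, f, rate) for f in bands]
    peak = max(powers) or 1.0
    return tuple(int(255 * p / peak) for p in powers)

# A pure 100 Hz tone should light mostly the red (bass) channel.
rate = 44100
bass_tone = [math.sin(2 * math.pi * 100 * n / rate) for n in range(4096)]
print(bands_to_rgb(bass_tone, rate))
```

For a real-time version you would probably switch to an FFT over each audio block instead of three Goertzel passes, but the mapping step stays the same.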
Overall-amplitude scenes make the most sense when you have multiple LEDs in a linear arrangement.
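For the linear-arrangement case, the simplest sketch is a VU-meter mapping, where the overall level decides how many LEDs along the strip are lit (the 8-LED strip length here is just an assumption):

```python
def led_bar(level, num_leds=8):
    """Light the first N of num_leds LEDs in proportion to level in [0.0, 1.0]."""
    level = max(0.0, min(1.0, level))   # clamp out-of-range input
    lit = round(level * num_leds)
    return [i < lit for i in range(num_leds)]

print(led_bar(0.5))   # half the strip lit
```

Feeding this with the RMS loudness of each audio block gives the classic bouncing-bar effect.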