I have always been a huge music nerd, and a coding nerd on top of that. I recently learned Processing and saw the potential it has to create immersive experiences, especially for designers and artists like me.
I decided to combine my love for music and my recent Processing knowledge to create an immersive music visualiser that could help me enjoy my music more.
It's in 3D, so you feel completely immersed in the music as you move through a starfield, and I hope to make it VR compatible in the future.
The input music is first separated into lows, mids, and highs by applying a Fourier Transform to the audio track and grouping the resulting spectrum by frequency.
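The post doesn't name the audio library, but here is a minimal sketch of how that band split could look in Processing using the Minim library. The file name and the band cut-off frequencies are my own assumptions, not the project's actual values.

```java
import ddf.minim.*;
import ddf.minim.analysis.*;

Minim minim;
AudioPlayer track;
FFT fft;

float lows, mids, highs;

void setup() {
  size(800, 600, P3D);
  minim = new Minim(this);
  track = minim.loadFile("song.mp3");   // hypothetical file name
  track.loop();
  fft = new FFT(track.bufferSize(), track.sampleRate());
}

void draw() {
  fft.forward(track.mix);
  // Average the spectrum energy over three rough frequency ranges (Hz).
  // These band edges are assumptions, not the project's exact values.
  lows  = fft.calcAvg(20, 250);
  mids  = fft.calcAvg(250, 2000);
  highs = fft.calcAvg(2000, 12000);
}
```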
The background color is an RGB value that draws equally on the lows, mids, and highs.
The color of the spheres (or planets, as I like to call them here) follows the same principle, except it is more pronounced and more intense than the background color.
The color tends towards red in bass-heavy parts and towards blue in treble-dominant parts. When it turns white, all three bands are at full force. The planets disappear when the track goes quiet, as all frequency bands are dormant.
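As a rough illustration of that mapping (the scale factors and normalisation here are assumptions; the project's actual values aren't given), the band averages from the sketch above could be turned into colours like this, inside draw():

```java
// Inside draw(), after computing lows/mids/highs as above.
// The scale factors are assumptions; they depend on how the band
// energies are normalised.
float bgScale = 8.0;
color bgColor = color(
  constrain(lows  * bgScale, 0, 255),   // red follows the bass
  constrain(mids  * bgScale, 0, 255),   // green follows the mids
  constrain(highs * bgScale, 0, 255)    // blue follows the treble
);
background(bgColor);

// Planet colour: same mapping, pushed harder so it reads as more intense.
float planetScale = bgScale * 2.0;
color planetColor = color(
  constrain(lows  * planetScale, 0, 255),
  constrain(mids  * planetScale, 0, 255),
  constrain(highs * planetScale, 0, 255)
);
```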
The planets also enlarge and contract with the beat of the track, so you can see them throbbing in time as they come towards you.
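One way to approximate that pulsing is Minim's BeatDetect class (the imports from the sketch above cover it); the resting radius, the pop size, and the easing rate below are placeholder values I chose, not the project's.

```java
// Extends the sketch above: declare alongside the other globals.
BeatDetect beat;
float planetRadius = 20;           // resting radius (placeholder value)

// In setup(), after loading the track:
//   beat = new BeatDetect();

// In draw(), before drawing a planet:
beat.detect(track.mix);
if (beat.isOnset()) {
  planetRadius = 35;               // pop outward on a detected beat
}
planetRadius = lerp(planetRadius, 20, 0.1);  // ease back between beats
sphere(planetRadius);
```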
The speed of travel is tied to all three frequency bands, but the weights are skewed so the highs have the most influence: when the highs dominate, you move through the starfield faster.
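Again as a sketch only, the weighting could look like the snippet below; the post only says the highs carry the most weight, so the exact coefficients are illustrative.

```java
float zOffset = 0;  // accumulated travel through the starfield

// In draw(), after computing lows/mids/highs:
float speed = 0.2 * lows + 0.3 * mids + 0.5 * highs;  // highs weighted heaviest
zOffset += speed;
translate(0, 0, zOffset);  // push the world towards the camera each frame
```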
This project was a 2-day exploration to learn more about Processing and audio visualisation. If you have any questions, suggestions, or feedback, do let me know by email at [email protected].