In late 2009, I was asked to contribute to a group show called Decode being held at the prestigious Victoria & Albert Museum in London. Among the contributors are Aaron Koblin, Jonathan Harris and Sep Kamvar, Golan Levin, Daniel Brown, Daniel Rozin, and many others. According to the site, Decode “will show the latest developments in digital and interactive design, from small screen based graphics to large-scale installations.”
They asked me if I could make a real-time, audio-reactive version of Solar. I created Solar in 2007. It originally came about because I was working on a demo for a talk I was giving at UCLA, and it quickly became one of my more popular works. It features a magnetism and gravity sandbox where particles react to the intensity of the audio. Creating a compelling real-time version was definitely a challenge. The original Solar render probably ran at about 0.5 to 1.0 frames per second, and getting that up to between 30 and 60 fps was not going to happen easily.
Thanks to some great advice from Andrew Bell, I was able to create a robust version that kept a nice visual density but still ran at a good clip. It involved using instances of the parent sphere and child particles instead of calculating a unique set of particle positions for each sphere. Since I was definitely CPU bound (the magnetic repulsion calculations can get really heavy really quickly), this turned out to be a great way to show tens of thousands of particles while only doing repulsion calculations on a set of a few hundred.
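To make the idea concrete, here's a minimal NumPy sketch of that trick, not the actual code: the pairwise O(n²) repulsion runs only on a few hundred parent particles, while a shared child-offset pattern is stamped onto every parent to get the drawn particle count into the tens of thousands. The particle counts, force model, and 2D setup are all illustrative assumptions.

```python
import numpy as np

def repulsion_step(pos, vel, dt=0.016, strength=0.5, eps=1e-3):
    # O(n^2) pairwise repulsion, applied only to the small parent set.
    diff = pos[:, None, :] - pos[None, :, :]      # (n, n, 2) separations
    dist2 = (diff ** 2).sum(-1) + eps             # squared distances
    np.fill_diagonal(dist2, np.inf)               # no self-repulsion
    force = (diff / dist2[..., None]).sum(axis=1) * strength
    vel += force * dt
    pos += vel * dt
    return pos, vel

def instanced_positions(parents, offsets):
    # Stamp one shared child-offset pattern onto every parent:
    # tens of thousands drawn, only a few hundred simulated.
    return (parents[:, None, :] + offsets[None, :, :]).reshape(-1, 2)

rng = np.random.default_rng(0)
parents = rng.standard_normal((300, 2))           # a few hundred parents
vel = np.zeros_like(parents)
offsets = rng.standard_normal((100, 2)) * 0.1     # shared child pattern

parents, vel = repulsion_step(parents, vel)
drawn = instanced_positions(parents, offsets)
print(drawn.shape)  # (30000, 2): 30k particles drawn, 300 simulated
```

On the GPU this maps naturally to instanced rendering (one draw call per pattern), which is why it moves the bottleneck off the CPU.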
I did a render to some audio from an episode of Radiolab to show how it performs.