Make Art Now – Generative Art Challenge

Download a zip of the full 60fps version to see what it should really look like.

I recently joined a new group which hosts Realtime / Generative Art Challenges. The first challenge has the theme ‘Sound Reactive Composition’, and runs through November 15, 2013. The organizers provided a link to a song – Fucertc by Notuv, and challenged all comers to make something realtime that worked alongside it.

[Image: make-art-now-banner]

…..

My Process

…..

I decided to use Touch Designer for the visuals, as it’s come to be my go-to suite for realtime work. I used to use Quartz Composer all the time, but needed to upgrade to a portable tower while doing shows for Deadmau5, and Apple had no reasonable offerings at the time, so I made the jump to Windows.

Touch Designer can take most any input you can throw at it, so my next step was to figure out how I wanted to interact with the music.

…..

Audio Analysis Attempt

…..

[Image: analysis]

[Image: td-bar-extract]

Ethno Tekh, one of the sponsors, has released a rad tool called ‘Data Racket‘, which will analyze a sound stream and send all kinds of info about it over OSC (seen above on the left). I loaded it up and ran the audio through it, but found that the song style wasn’t a good fit – there’s so much glitch and so many layered noises that the output wasn’t interesting enough for me. I turned to Sonic Visualizer, fantastic free software for analyzing audio in non-realtime, which uses the VAMP plugin system to enable many types of analysis – beat, bars, polyphonic pitch, onset, segmenting, etc. But the results were no better than those I got in realtime with Data Racket – while it could somewhat reliably pull information on where each beat was in the track, its bar detection was off, perhaps because there is no major hit on the downbeat, so it tried to start bars on the snare hits.
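
For anyone curious what that OSC stream looks like outside of Touch Designer, here’s a minimal Python sketch using the python-osc package. The port and the address patterns are placeholders – Data Racket’s actual namespace will differ – so treat it as an illustration of the idea rather than a drop-in receiver.

    # Minimal OSC listener sketch – port and addresses are assumptions, not Data Racket's real ones.
    from pythonosc import dispatcher, osc_server

    def on_analysis(address, *args):
        # e.g. an address like "/audio/level" with a float payload
        print(address, args)

    d = dispatcher.Dispatcher()
    d.set_default_handler(on_analysis)          # catch every incoming address

    server = osc_server.BlockingOSCUDPServer(("127.0.0.1", 9000), d)
    server.serve_forever()                      # print whatever the analyzer sends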

I did wind up writing some code in Touch Designer to pull in saved beat / bar information and use it to segment the audio into loops so I could easily jump around in the song. Maybe it will be useful somewhere down the line (pictured to the right).
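
The core of that loop-segmenting idea is simple enough to sketch outside Touch Designer. Assuming the beat times have been exported from Sonic Visualizer as a text file of timestamps (one per line, a made-up filename below) and that the track sits in 4/4, something like this groups them into bar-length loop regions:

    # Group exported beat timestamps into bar-length loop regions (assumes 4 beats per bar).
    def load_beats(path):
        with open(path) as f:
            return [float(line.split()[0]) for line in f if line.strip()]

    def beats_to_loops(beat_times, beats_per_bar=4):
        loops = []
        for i in range(0, len(beat_times) - beats_per_bar, beats_per_bar):
            loops.append((beat_times[i], beat_times[i + beats_per_bar]))   # (start, end) in seconds
        return loops

    loops = beats_to_loops(load_beats("fucertc_beats.txt"))   # hypothetical export file
    print(loops[:4])                                          # first few regions to jump between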

…..

The Manual Method & MEDS

…..

I was not happy with where I had gotten with automatic analysis. I’d probably spent a full work day trying all sorts of automatic magic on the audio and didn’t have much to show for it. I decided I would instead use my tried-and-true method of loading the song into Ableton Live and writing MIDI notes that match the audio. I would then take this MIDI and map it using my Music Event Description System (MEDS).
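
To give a rough idea of what the MIDI side looks like once it leaves the DAW, here’s a small Python sketch that reads an exported MIDI file with the mido package and builds a per-instrument event list. The file name, the one-track-per-instrument layout, and the fixed tempo are all assumptions for the example – the real mapping into MEDS happens inside Touch Designer.

    # Sketch: turn an exported MIDI file into per-instrument note lists (file name is hypothetical).
    import mido
    from collections import defaultdict

    events = defaultdict(list)                   # track name -> [(time_sec, note, velocity), ...]
    mid = mido.MidiFile("fucertc_tracking.mid")  # hypothetical export from Ableton Live

    for track in mid.tracks:
        t = 0.0
        for msg in track:
            # msg.time is a delta in ticks; 500000 microseconds per beat assumes 120 BPM
            t += mido.tick2second(msg.time, mid.ticks_per_beat, 500000)
            if msg.type == "note_on" and msg.velocity > 0:
                events[track.name or "untitled"].append((t, msg.note, msg.velocity))

    for name, notes in events.items():
        print(name, len(notes), "notes")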

I started with the drums, which are usually the most frenetic and fun items to work with for tracking. I loaded the track into Ableton Live, looped the first section I wanted to work on, and added a MIDI track with an Impulse instrument. I started by laying in the snare hits, using a MIDI snare drum to make sure things lined up – once my snare hit at the same time as the audio snare, I knew I was good and I could move on.

live-meds-pic

I used this method to track all the parts of the song I could make out. Drums are easy to break into individual sounds – melodic instruments are a bit tougher. There are some chords as well as some background pad-style sounds. I used Sonic Visualizer to attempt a pitch analysis of the chords, and it got me some notes, but they weren’t great because of the background noise. I used them as a starting point to break the chords up into three- and four-note chords. While I didn’t have the notes exactly right, the important part to me was getting the differences mapped out: if the chord sounds like it drops its root note, the data should have a lower root note. The idea here is not to exactly replicate the notes, but to approximate the feeling by tracking interval changes in the music. I wound up with several instruments plus a catch-all FX category for things that happened only once in a while and were hard to describe as an instrument (a record-rewinding sound, a grinding sound, etc.).
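
To make the interval idea concrete, here’s a tiny illustration – not the actual MEDS code – of describing a chord by its shape above the lowest note, so that a dropped root shows up as a change in the descriptor rather than as a set of exact pitches:

    # Describe chords by interval shape rather than exact pitches (illustrative only).
    def chord_shape(notes):
        root = min(notes)
        return (root, tuple(sorted(n - root for n in notes)))

    full = chord_shape([48, 55, 60, 64])   # hypothetical chord with its low root
    thin = chord_shape([55, 60, 64])       # same chord with the root dropped
    print(full)   # (48, (0, 7, 12, 16))
    print(thin)   # (55, (0, 5, 9)) – lowest note moved up, shape changed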

With this data in hand, I fired up Touch Designer and spent a day working on a generic MIDI-to-MEDS visualization system, using only built-in objects. Here’s what I wound up with:

…..

Visualizing The Data

…..

The video above shows the MIDI data flowing from Ableton Live into Touch Designer, then going through a note-to-MEDS-descriptor method. At that point, I captured the data into Touch Designer so I could work on it without Ableton Live running, and closed down Live. This makes development easier, and leaving a simple MIDI In CHOP with a Switch on it lets me quickly flip back to Ableton Live input if I want to update the data or work with a live performer.
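
The switch itself is trivial to drive from a script. As a rough sketch – the operator names here are made up, not the ones in my network – flipping between the live MIDI In CHOP and the captured data is just a matter of setting the Switch CHOP’s index from Touch Designer’s Python:

    # Touch Designer Python sketch: flip between live MIDI and captured data.
    # Operator names (switch1, etc.) are placeholders, not the real network's names.
    def use_live_midi(live=True):
        op('switch1').par.index = 0 if live else 1   # input 0 = MIDI In CHOP, input 1 = captured CHOP

    use_live_midi(False)   # develop against the captured data without Live running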

[Image: fucertc-touch-final]

From there, I built the visualizations one by one. I started with the snare and hi-hat instruments, since those were the sounds that jumped out at me first. Then I went through my available instruments, adding visuals to make a coherent scene – some active bits in the center, less-active items off to the side and in the background. I’d like to make a series of tutorials going in-depth on each visualization, since there’s so much to talk about there. For now, enjoy the final product, and feel free to leave me comments and ask questions!

 

 
