Synchronizing soundflies

I finally got the composition together, and recorded a video of the thing in action.

The video starts with the fireflies calibrating themselves to a dark room. Then they start blinking randomly and quickly find a common rhythm. When I flick the lights on, they become confused (red flashes) because they are “used” to a darker ambient light level. Finally darkness falls upon the room again and the flies synchronize, this time to a quite nice melody.
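I don’t know the exact rule the kits’ firmware uses, but the classic pulse-coupled oscillator model gives a feel for why the blinking locks up: each fly’s phase ramps toward a flash, and seeing a neighbor’s flash nudges it forward, more so the closer it already is to flashing. A minimal Python sketch, with all names and constants mine:

```python
def step(phases, dt=0.01, eps=0.5):
    """Advance every fly's phase by one tick and handle flashes.
    A fly flashes when its phase reaches 1.0, then resets to 0.
    Every other fly that sees the flash jumps forward in proportion
    to its own phase -- min(1, (1 + eps) * p) -- so flies near the
    threshold get pulled over it and absorbed into the same rhythm."""
    phases = [p + dt for p in phases]
    while any(p >= 1.0 for p in phases):
        for i, p in enumerate(phases):
            if p >= 1.0:
                phases[i] = 0.0          # this fly flashes and resets
        # everyone else sees the flash and jumps forward a bit;
        # a fly pushed to 1.0 flashes too on the next loop pass
        phases = [min(1.0, (1 + eps) * p) for p in phases]
    return phases

# two flies starting out of phase lock together within a few cycles
flies = [0.2, 0.9]
for _ in range(400):
    flies = step(flies)
```

The proportional (rather than fixed-size) nudge is what makes the common rhythm stable: a fixed nudge cancels out symmetrically, while this one lets a trailing fly get absorbed whenever a flash pushes it over the threshold.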

Midi out from Quartz Composer

After the episode with the color tracking patches, I started working on a composition that takes video input from a webcam and detects flashes of light at certain predefined points in the picture. I wanted a different midi note assigned to each of the points, so that when a flash is detected, the associated note is sent out to a synth or sampler.
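As a sketch of the idea (not the QC composition itself; the class name, threshold, and coordinates here are all made up): sample the brightness at each predefined point and trigger that point’s note on a dark-to-bright transition, so a steadily lit LED doesn’t retrigger.

```python
class FlashDetector:
    """Watches fixed points in the frame; on a dark->bright transition
    at a point, reports that point's midi note number."""

    def __init__(self, points_to_notes, threshold=0.6):
        self.points = points_to_notes       # {(x, y): midi note}
        self.threshold = threshold
        self.lit = dict.fromkeys(points_to_notes, False)

    def process(self, brightness_at):
        """brightness_at(x, y) -> 0.0..1.0; returns notes to trigger."""
        notes = []
        for (x, y), note in self.points.items():
            bright = brightness_at(x, y) >= self.threshold
            if bright and not self.lit[(x, y)]:
                notes.append(note)          # rising edge = a flash
            self.lit[(x, y)] = bright
        return notes

# hypothetical layout: three flies in a row, mapped to a C major triad
detector = FlashDetector({(120, 80): 60, (320, 80): 64, (520, 80): 67})
```

In the composition, the per-point brightness would come from the video input and the returned note numbers would be fed to the midi output.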

The only problem is that the midi patches that ship with QC are pretty lame and didn’t allow me to do what I wanted. After searching around for a bit, I found the kineme project. Their midi patches are apparently still quite beta, and you have to register on the site to access them. I actually had to try a few different builds before I found a version where the Global Output Midi Note patch wasn’t broken. Be sure to put the files in the right folder; the midi patches go in /Library/Graphics/Quartz Composer Patches, not in Quartz Composer Plug-Ins.

Once registered, you can download the midi patches through the link below:

Color tracking in Quartz Composer


I just built 6 of the synchronizing firefly kits, and when I saw them blink I immediately felt that it would be nice to hear the rhythms the lights were creating.

My first idea for putting together a quick demo of this was to track the flashes with a webcam and Quartz Composer. QC doesn’t have a built-in color tracking patch, so I spent some time googling for one. I found a thread on Apple’s Quartz-dev list about an example in the book GPU Gems 3.

Re: Core Image Color Tracking in GPU Gems 3

In the thread, a guy asks for help with using the patches in QC and attaches a screenshot of his composition. I found the example code, compiled it in Xcode, and duplicated the composition from the screenshot. He was right: the composition did not work. While I don’t have much experience with Objective-C, I still decided to try deciphering the example code in order to create a working composition.

The example code comes with two Xcode projects: one compiles a program that loops a video file and performs color tracking on it, and the other compiles an Image Unit that provides the color tracking patches. By looking through the source for the application, I found the correct way to patch up the stuff in QC.

Apple Developer Connection: CIColorTracking

The information can be found at the very end of the AppController.m file in the CIColorTracking project; the _renderFilterChain method sets up the patches. The patch names in the code are not the same as those shown in QC, though. To find the human-readable versions, look them up in the Description.strings (English) file in the TrackingImageUnit project.

I created a composition based on my findings, and to my surprise it worked: the guy in the video juggled a duck instead of an ugly colored ball. So far so good, but what had me scratching my head for a while was how to get the coordinates of the tracked object instead of just an image superimposed on a video. It turns out that the Color Tracking Centroid Filter outputs an image that is just 1×1 pixel, with the data encoded into the RGBA color channels of that single pixel. To use the position data, just connect the output to an Image Pixel patch: you get the x value from the red channel output and the y value from the green.
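Decoding that pixel is then trivial. A small Python sketch of the idea (the scaling up to pixel coordinates is my assumption; the source only establishes that red carries x and green carries y):

```python
def centroid(rgba, width, height):
    """Decode the 1x1 'centroid' image: the red channel holds the
    normalized x position and the green channel the normalized y.
    Scaling to pixel coordinates of a width x height frame is an
    assumption on my part; blue and alpha are ignored here."""
    r, g, _b, _a = rgba
    return r * width, g * height

# e.g. a pixel of (0.5, 0.25, 0.0, 1.0) on a 640x480 frame
# puts the tracked object at the horizontal center, a quarter down
```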


Now I’m just wondering if color tracking is the way to go at all for my demo. Maybe 6 Image Pixel patches will be enough to detect the flashes. Oh well, at least I got a little taste of Image Units, and I think I’ll have to read up a bit on that Objective-C stuff.