Synchronizing soundflies

I finally got the composition together, and recorded a video of the thing in action.

The video starts with the fireflies calibrating themselves to a dark room. Then they start blinking randomly and quickly find a common rhythm. When I flick the lights on, they become confused (red flashes) because they are “used” to a darker ambient light level. Finally darkness falls upon the room again and the flies synchronize, this time to quite a nice melody.


Color tracking in Quartz Composer

[Screenshot: quartz_color_track]

I just built 6 of the synchronizing firefly kits from tinkerlog.com, and when I saw them blink I immediately felt that it would be nice to hear the rhythms the lights were creating.

My first idea for putting together a quick demo of this was to track the flashes with a webcam and Quartz Composer. QC doesn’t have a built-in color tracking patch, so I spent some time googling for one. I found a thread on Apple’s Quartz-dev list about an example in the book GPU Gems 3.

Re: Core Image Color Tracking in GPU Gems 3

In the thread, a guy asks for help getting the patches working in QC and attaches a screenshot of his composition. I found the example code, compiled it in Xcode, and duplicated the composition from the screenshot. He was right: the composition did not work. While I don’t have much experience with Objective-C, I still decided to try deciphering the example code in order to create a working composition.

The example code comes with two Xcode projects: one compiles a program that loops a video file and performs color tracking on it, and the other compiles an Image Unit that provides the color tracking patches. By looking through the source for the application, I found the correct way to patch up the stuff in QC.

Apple Developer Connection: CIColorTracking

The information can be found at the very end of the AppController.m file in the CIColorTracking project. The _renderFilterChain method sets up the patches. The internal filter names are not the same as the names that appear in QC, though. To find the human-readable versions, look them up in the Description.strings (English) file in the TrackingImageUnit project.
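If you don’t feel like digging through Description.strings by hand, the same mapping can be printed from a tiny command-line tool. This is just a generic Core Image listing I put together, not code from the sample project:

```objc
// List every registered Core Image filter together with its human-readable
// name (the name Quartz Composer shows). Loading the Image Unit plug-ins
// first makes the tracking filters show up as well.
#import <Foundation/Foundation.h>
#import <QuartzCore/QuartzCore.h>

int main(void)
{
    @autoreleasepool {
        // Picks up Image Units from the standard locations,
        // e.g. /Library/Graphics/Image Units.
        [CIPlugIn loadAllPlugIns];

        for (NSString *name in [CIFilter filterNamesInCategories:nil]) {
            // localizedNameForFilterName: surfaces the strings from
            // Description.strings, i.e. what you see in the QC patch library.
            NSLog(@"%@ -> %@", name,
                  [CIFilter localizedNameForFilterName:name]);
        }
    }
    return 0;
}
```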

I created a composition based on my findings, and to my surprise the guy in the video juggled with a duck instead of an ugly colored ball. So far so good, but what had me scratching my head for a while was how to get the coordinates of the tracked object instead of just an image superimposed on a video. It turns out that the Color Tracking Centroid Filter outputs an image that is just 1×1 pixel; the position data is encoded in the RGBA color channels of that single pixel. To use it, just connect the output to an Image Pixel patch: the red channel output gives you the x value and the green channel gives you y.
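The same trick works outside of QC too. Here’s a minimal Objective-C sketch of the idea (the function name and the float render format are my own choices, not from the example code): render the 1×1 output into a tiny buffer and read the channels directly, which is exactly what the Image Pixel patch does for you.

```objc
// Read the tracked position out of the 1x1 image that the centroid filter
// produces: the red channel holds the normalized x, the green channel y.
#import <QuartzCore/QuartzCore.h>

static CGPoint centroidFromImage(CIImage *centroidImage, CIContext *context)
{
    float pixel[4] = {0.0f, 0.0f, 0.0f, 0.0f};

    // Render just the single pixel into a float buffer; passing a nil color
    // space skips color matching so the encoded values come through untouched.
    [context render:centroidImage
           toBitmap:pixel
           rowBytes:sizeof(pixel)
             bounds:CGRectMake(0, 0, 1, 1)
             format:kCIFormatRGBAf
         colorSpace:nil];

    return CGPointMake(pixel[0], pixel[1]);
}
```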

[Screenshot: quartz_color_track2]

Now I’m just wondering if color tracking is the way to go at all for my demo. Maybe 6 Image Pixel patches will be enough to detect the flashes. Oh well, at least I got a little taste of Image Units, and I think I’ll have to read up a bit on that Objective-C stuff.
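If I go the Image Pixel route, the sketch would be more or less the same as the centroid one above: sample one fixed pixel per firefly and compare it against a threshold. The coordinates and the threshold here are made up for illustration.

```objc
// Naive flash detector: sample one pixel of the camera frame at a known
// firefly position and treat anything clearly brighter than the dark room
// as "LED on". The 0.5 threshold is a placeholder.
#import <QuartzCore/QuartzCore.h>

static BOOL isFlashing(CIImage *frame, CIContext *context, CGPoint p)
{
    float pixel[4] = {0.0f, 0.0f, 0.0f, 0.0f};

    [context render:frame
           toBitmap:pixel
           rowBytes:sizeof(pixel)
             bounds:CGRectMake(p.x, p.y, 1.0, 1.0)
             format:kCIFormatRGBAf
         colorSpace:nil];

    // Rough luminance from the RGB channels.
    float luma = 0.30f * pixel[0] + 0.59f * pixel[1] + 0.11f * pixel[2];
    return luma > 0.5f;
}
```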