Drone Lab in Pd

I came across the Drone Lab V2 by Casper Electronics the other day. To put it simply, it's an Atari Punk Console on steroids. While the APC uses a 556 timer chip as its oscillator, the Drone Lab has a 40106 hex Schmitt trigger configured as four audio oscillators. On top of that it has tremolo for each of the oscillators, plus a low-pass filter, distortion, and two band-pass filters for the whole mix.
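To get a feel for the architecture before patching it up, here's a rough sketch in Python of the same idea: four square-wave oscillators, each amplitude-modulated by its own tremolo LFO, summed into one mix. The frequencies, tremolo rates, and depth are my own arbitrary picks, not values from the actual Drone Lab.

```python
import math

SR = 44100  # sample rate in Hz

def square(phase):
    """Naive square wave: +1 for the first half of the cycle, -1 for the second."""
    return 1.0 if (phase % 1.0) < 0.5 else -1.0

def drone(freqs, trem_rates, n_samples, trem_depth=0.5):
    """Mix several square-wave oscillators, each with its own tremolo LFO."""
    out = []
    for i in range(n_samples):
        t = i / SR
        sample = 0.0
        for f, r in zip(freqs, trem_rates):
            # tremolo: a slow sine scales the oscillator's amplitude
            lfo = 1.0 - trem_depth * (0.5 + 0.5 * math.sin(2 * math.pi * r * t))
            sample += square(f * t) * lfo
        out.append(sample / len(freqs))  # normalize the mix
    return out

# a tenth of a second of four detuned low oscillators
samples = drone([55.0, 82.5, 110.0, 165.0], [0.5, 1.0, 1.5, 2.0], SR // 10)
```

In Pd the same structure is just four [phasor~]/[expr~] style oscillator chains, each multiplied by a sine LFO, summed into the filter section.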

Not having the patience to wait for a kit to arrive, I set out to make something similar in Pure Data. I made the patch to be used with the M-Audio X-Session MIDI controller. The mapping, .pd file and a sound clip (headphones are highly recommended) are posted below.

Drone Lab (Pd)
EDIT: I created a git repository over at GitHub. All past and future Pd patches can be found there.



Synchronizing soundflies

I finally got the composition together, and recorded a video of the thing in action.

The video starts with the fireflies calibrating themselves to a dark room. Then they start blinking randomly and find a common rhythm quite quickly. When I flick the lights on, they become confused (red flashes) because they are "used" to a darker ambient light level. Finally darkness falls upon the room again and the flies synchronize, this time with a quite nice melody.
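The way the flies find a common rhythm resembles the classic pulse-coupled oscillator model: each firefly's "phase" climbs toward a threshold, it flashes on reaching it, and seeing someone else flash nudges its own phase forward. Here's a toy simulation of that idea in Python (the kick size, time step, and count of six flies are my own assumptions, not the firmware's actual algorithm):

```python
import random

def advance(phases, dt, eps):
    """One time step of pulse-coupled oscillators. Phases climb from 0 to 1;
    reaching 1 means "flash" and reset to 0. Seeing a flash multiplies every
    other phase by (1 + eps), so flies late in their cycle get pulled over
    the edge and absorbed into the flashing group."""
    phases = [p + dt for p in phases]
    fired = [p >= 1.0 for p in phases]
    if any(fired):
        # a kick may push not-yet-fired flies past the threshold too
        fired = [f or p * (1 + eps) >= 1.0 for f, p in zip(fired, phases)]
        phases = [0.0 if f else p * (1 + eps) for f, p in zip(fired, phases)]
    return phases, fired

def steps_to_sync(n=6, dt=0.01, eps=0.1, max_steps=200000, seed=7):
    """Run until every oscillator flashes in the same instant."""
    rng = random.Random(seed)
    phases = [rng.random() for _ in range(n)]
    for step in range(max_steps):
        phases, fired = advance(phases, dt, eps)
        if all(fired):
            return step
    return None
```

Once two flies flash together they stay locked forever (identical phases get identical kicks), which is why the groups only ever merge until the whole room blinks as one.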

Midi out from Quartz Composer

After the episode with color tracking patches, I started working on a composition that would take video input from a webcam and detect flashes of light at certain predefined points in the picture. I wanted a different MIDI note assigned to each of the points, so that when a flash is detected, the associated MIDI note is sent out to a synth or sampler.
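The mapping itself is simple: watch the brightness at each point, and on a rising edge past a threshold emit a note-on. A sketch in Python of that logic (the note numbers, threshold, channel, and velocity are placeholder choices of mine):

```python
NOTES = [60, 62, 64, 65, 67, 69]  # one MIDI note per watched point (hypothetical)
THRESHOLD = 0.8                    # normalized brightness, 0.0-1.0

def flashes_to_midi(brightness, prev_brightness, channel=0, velocity=100):
    """Return raw MIDI note-on messages for every point whose brightness just
    crossed the threshold (rising edge only, so a long flash sends one note)."""
    messages = []
    for note, b, prev in zip(NOTES, brightness, prev_brightness):
        if b >= THRESHOLD and prev < THRESHOLD:
            # 0x90 | channel = note-on status byte, then note number and velocity
            messages.append(bytes([0x90 | channel, note, velocity]))
    return messages

# points 0 and 5 just lit up; point 2 was already bright last frame
msgs = flashes_to_midi([0.9, 0.1, 0.85, 0.0, 0.0, 0.95],
                       [0.2, 0.0, 0.90, 0.0, 0.0, 0.10])
```

In QC the per-point brightness would come from Image Pixel patches, and the note-on itself is what the kineme MIDI output patch handles.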

The only problem is that the MIDI patches that ship with QC are pretty lame, and didn't allow me to do what I wanted. After searching around for a bit, I found the kineme project. Their MIDI patches are apparently quite beta, and you have to register on the site to be able to access them. I actually had to try a few different builds before I found one where the Global Output Midi Note patch wasn't broken. Be sure to put the files in the right folder; the MIDI patches go in /Library/Graphics/Quartz Composer Patches, not -Plug-Ins.

Once registered, the MIDI patches can be found through the link below:

Color tracking in Quartz Composer


I just built 6 of the synchronizing firefly kits from tinkerlog.com, and when I saw them blink I immediately felt that it would be nice to hear the rhythms the lights were creating.

My first idea for putting together a quick demo of this was to track the flashes with a webcam and Quartz Composer. QC doesn’t have a built in color tracking patch, so I spent some time googling for one. I found a thread on Apple’s Quartz-dev list about an example in the book GPU Gems 3.

Re: Core Image Color Tracking in GPU Gems 3

In the thread a guy asks for help using the patches in QC, and attaches a screenshot of his composition. I found the example code, compiled it in Xcode, and duplicated the composition from the screenshot. He was right: the composition did not work. While I don't have much experience with Objective-C, I still decided to try deciphering the example code in order to create a working composition.

The example code comes with two Xcode projects: one compiles a program that loops a video file and performs color tracking on it, and the other compiles an Image Unit that provides the color tracking patches. By looking through the source for the application, I found the correct way to patch up the stuff in QC.

Apple Developer Connection: CIColorTracking

The information can be found at the very end of the AppController.m file in the CIColorTracking project: the _renderFilterChain method sets up the patches. The names of the patches are not the same as they appear in QC, though. To find the human-readable versions, look them up in the Description.strings (English) file in the TrackingImageUnit project.

I created a composition based on my findings, and to my surprise the guy in the video juggled with a duck instead of an ugly colored ball. So far so good, but what had me scratching my head for a while was how to get the coordinates for the tracked object instead of just an image superimposed on a video. Turns out that the Color Tracking Centroid Filter outputs an image which is just 1×1 pixel. The data is encoded into the rgba color channels of that single pixel. To use the position data, just connect the output to an Image Pixel patch and you get the x value from the red channel output and y from the green.
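Decoding that 1×1 pixel looks something like this in Python (assuming, as the Image Pixel patch reports them, that the channel values are normalized to 0..1, so you scale by the frame size yourself; the 640×480 frame is just an example):

```python
def centroid_from_pixel(rgba, width, height):
    """Decode a tracked object's position from the 1x1 output image of the
    Color Tracking Centroid Filter: the red channel carries x and the green
    channel carries y, assumed normalized to the 0..1 range."""
    r, g, b, a = rgba
    return r * width, g * height

# e.g. red=0.5, green=0.25 on a 640x480 frame puts the object at (320, 120)
x, y = centroid_from_pixel((0.5, 0.25, 0.0, 1.0), 640, 480)
```

In QC itself this is just the Image Pixel patch plus two Math patches doing the multiplication.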


Now I'm just wondering if color tracking is the way to go at all for my demo. Maybe 6 plain Image Pixel patches will be enough to detect the flashes. Oh well, at least I got a little taste of Image Units, and I think I'll have to read up a bit on that Objective-C stuff.


Breadboarding the Arduinoboy

I've been planning to build the Arduinoboy MIDI interface since sometime last November. I finally got around to breadboarding a version yesterday using a 4N35 optocoupler. Works like a charm!

The original "schematic" (which uses a 6N138 optocoupler) and the Arduino code are located here

I used this Instructable as a basis for the MIDI in part

When I ordered my Arduino a couple of years ago, I also bought two ATmega8 chips that came preloaded with the Arduino bootloader. So I figured the next logical step would be to freeform (i.e. solder the components together without a circuit board) a minimal Arduinoboy with one of those extra microcontrollers. This would let me comfortably install the device into the battery compartment of my DMG.
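The optocoupler only handles the electrical side; the firmware still has to parse the 31250-baud byte stream. As a sketch of the logic (in Python rather than the actual Arduinoboy C code, and stripped down to just the note messages it would care about):

```python
def parse_midi(stream):
    """Minimal MIDI parser: handles running status, decodes note-on/note-off,
    and ignores all other messages. Yields (is_on, channel, note, velocity)."""
    events, status, data = [], None, []
    for byte in stream:
        if byte & 0x80:            # high bit set -> status byte
            status, data = byte, []
        elif status is not None:   # data byte under the current status
            data.append(byte)
            if len(data) == 2 and (status & 0xF0) in (0x80, 0x90):
                # note-on with velocity 0 is note-off by convention
                on = (status & 0xF0) == 0x90 and data[1] > 0
                events.append((on, status & 0x0F, data[0], data[1]))
                data = []          # keep `status` for running status
    return events

# note-on C4, running-status note-on E4, explicit note-off C4
events = parse_midi(bytes([0x90, 60, 100, 64, 100, 0x80, 60, 0]))
```

On the AVR the same state machine runs byte by byte from the UART interrupt instead of over a buffer.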

There's some info on the minimal Arduino setup here
http://www.arduino.cc/playground/Learni … Standalone

The only thing I didn't have at home was the crystal, so when it arrives I'll write up an Instructable or something.

The next step from there would be to power the Game Boy, MIDI keyboard and a small active speaker from the same rechargeable battery. Slap it all together with some tape and firewood, quit my job, and play music in the street.