Had another Skype link-up with Deo, working on a tabla VJ rig. We’ve been using Tristan Jehan’s analyzer~ to split out bayan and dayan hits on the tabla to drive separate video animations. At the moment we’re just hand-tuning thresholds on certain Bark coefficients to distinguish between hits, but we also found some literature on automatic tabla stroke recognition that includes characteristic frequency plots, which might be useful. The next stage is perhaps to look at whether we can use a machine learning system to improve the recognition process, but we don’t want to get too complex. Watch this space for patches and samples soon.
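The hand-tuned threshold idea can be sketched roughly like this — a minimal, hypothetical classifier that takes the Bark-band magnitudes analyzer~ reports and compares low-band energy (where the bayan, the bass drum, dominates) against the rest. The band split and the ratio here are placeholders for whatever values we end up hand-tuning, not our actual settings:

```cpp
#include <array>
#include <numeric>
#include <string>

// Hypothetical sketch: classify a tabla hit from 25 Bark-band magnitudes
// by comparing energy in the lowest bands (bayan, the bass drum) against
// the mid/high bands (dayan). The band split (first 5 bands) and the
// ratio kRatio are placeholder values standing in for hand-tuned ones.
std::string classifyHit(const std::array<float, 25>& bark) {
    float low  = std::accumulate(bark.begin(), bark.begin() + 5, 0.0f);
    float high = std::accumulate(bark.begin() + 5, bark.end(), 0.0f);
    const float kRatio = 2.0f;  // bayan if low-band energy clearly dominates
    return (low > kRatio * high) ? "bayan" : "dayan";
}
```

In the actual patch the equivalent comparison happens in Max with the list output of analyzer~, but the logic is the same: sum a few bands, compare against a threshold, route the bang.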
Speaking of complex, I’m quite stuck working out how to make use of fields in COSM. I want to create icosahedral fields to contain granular particles of sound, which the visitor will then encounter as they move around. I have just worked out how to put platonic solids in the cosm window, so that’s a start, thanks to this blog entry: http://www.jahya.net/blog/?tag:Reflections, but I still don’t quite “get” fields. Maybe a spherical field will be ok – it doesn’t have to be visible after all.
A quick post to share a little bit of code I tweaked on the weekend. I’ve been experimenting with a Kinect sensor for gestural input into Max/MSP, and especially need skeleton tracking. I got OSCeleton working, but found it a little unstable, and I didn’t really like having to run X11 just to get a visual feedback display. Ferhat Sen’s OpenNI2OSC Processing patch was the next step, and by combining it with the SimpleOpenNI Processing wrapper library, I managed to get a full skeleton-tracking-to-OSC system in Processing that seems pretty stable and eliminates the need for X11.
The output is fully compatible with the standard OSC output of OSCeleton (at least, it worked identically with an existing Max/MSP patch I had running with OSCeleton). However, it only supports tracking one user at the moment, and it doesn’t support the Kitchen Studio or Quartz Composer OSC modes.
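For anyone wiring this up on the receiving end: OSCeleton’s default format sends each joint as a “/joint” message carrying the joint name, the user id, and x/y/z position floats, and that’s the layout the Processing sketch reproduces. Here’s a little sketch that just renders that message as text for illustration (in Processing the real thing would be built as an OscMessage and sent over UDP, not as a string):

```cpp
#include <string>
#include <sstream>

// Illustration of the OSCeleton-style message layout the sketch emits:
// address "/joint", then joint name (string), user id (int), and the
// x/y/z position as floats. This only formats the message as readable
// text; it is not an actual OSC encoder.
std::string jointMessage(const std::string& joint, int user,
                         float x, float y, float z) {
    std::ostringstream out;
    out << "/joint " << joint << " " << user << " "
        << x << " " << y << " " << z;
    return out.str();
}
```

So a Max patch listening with [udpreceive] and [route /joint] should see exactly what it saw from OSCeleton.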
Download the Processing file here: SimpleOpenNI_UserOSC Processing.pde
No doubt there’ll be updates, and if this doesn’t find a home somewhere else I guess I’ll put it up on GitHub, but here it is for now. Props to all who’ve forged this path – looking forward to where it leads us!
[vimeo 3053920 448 336]
Daedelus played a post-Laneway festival gig at Miss Libertine’s in Melbourne, Australia on 1/2/09, and after an hour and a half of monome-fueled sample mangling, had the crowd crying out for more. A maestro of MLR, replete in red tails and a dress shirt, not to mention a very physical approach to performing with the monome – there was something outstanding about watching him construct a whole set of electronica right in front of you in a way that was both entertaining and understandable. Props to Daedelus for being such a gent and having time to chat after his set, and props to Brian and Kelli for designing such amazingly simple yet powerful devices!
[vimeo 3067918 448 336]
Daedelus was supported by an excellent performance by Dorian Concept – fat and quirky beats and funked-up keyboard licks. I thoroughly enjoyed the flow of his set, and the very live nature of the performance. Curiously though, I found Daedelus’ performance with a monome more “transparent” than Dorian’s. Even though Dorian was using a more traditional instrument, it was actually hard to tell at times what he was doing in between fast keyboard licks – and there was a real feeling of all the beats coming off the laptop on cue, with just the keys being added on top. No disrespect to Dorian’s music, and I’d love to be half the keyboard player he is – I’m more just noticing the limitations of his chosen interface.
The monome performance allowed you to see how the whole song was being constructed and manipulated – not just one part. And it was nice to see a 64 down beside the 256 being used as a mixer/effects control (including an accelerometer controlled “slooow down” effect). Daedelus’ interaction with the laptop was pretty much reduced to selecting the next set of samples, and the occasional tempo slide.
So, perhaps I’m biased, but those are my humble reflections. I left the evening inspired – more monome playing and building await!
[vimeo 3077098 448 336]
first flight of the cigarduino synth! i was given this fantastic Cuban cigar box, and my first thought was: “I have to build a synth to put in this…” so my trusty arduino board began its journey from the workbench into the Cuban enclosure. the current configuration is inspired by Beavis Audio’s arduino punk console design. i’ve tweaked the code a bit to support portamento between notes in the sequencer.
i had a spectrasymbol softpot looking for a home (courtesy of stribe maker josh boughey) and was delighted that it fitted in just beneath the row of momentary buttons. however, along with four pots on the top row and two more beneath, that made seven analog inputs – one more than the arduino supports natively. so, i used a 4051 multiplexer to support extra inputs, although i’ve got a bit of work to do to get the multiplexing working reliably.
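for the curious, the 4051 trick works by driving its three select lines from digital pins so that one arduino analog pin can read any of eight channels in turn. a rough sketch of the idea below – the pin names are assumptions, not my actual wiring, and the settle delay is a guess at what “working reliably” will need:

```cpp
// Sketch of scanning a 4051 multiplexer from an arduino. The three select
// lines S0-S2 encode the channel number in binary; channelBit() extracts
// the state for one select line from a channel number 0-7.
int channelBit(int channel, int bit) {
    return (channel >> bit) & 1;
}

// On the arduino itself the read would look roughly like this
// (S0_PIN/S1_PIN/S2_PIN are hypothetical pin assignments):
//
// int readMux(int channel) {
//     digitalWrite(S0_PIN, channelBit(channel, 0));
//     digitalWrite(S1_PIN, channelBit(channel, 1));
//     digitalWrite(S2_PIN, channelBit(channel, 2));
//     delayMicroseconds(10);   // let the mux output settle before sampling
//     return analogRead(A0);   // 4051 common pin wired to analog 0
// }
```

my hunch is the reliability issue is settle time after switching channels – sampling too soon after flipping the select lines reads the previous channel’s charge.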
currently the cigarduino is running a variant of the Beavis Audio Arduino Punk Console code, with the top row of pots assigned to pitch, duration, portamento, and tempo. support for portamento is a twist on the straight arduino punk console, but the thing just oozed acid so much when i first heard it that portamento became essential. the two pots beneath are as yet unassigned, as is the softpot. the red momentary buttons are sequencer slots, while the right black switch is on/off. the left black switch is unassigned.
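the portamento tweak itself is simple – instead of jumping straight to each sequencer step’s pitch, slide the current frequency part of the way toward the target on every tick, with the glide pot setting how big that fraction is. a minimal sketch of the idea (not the actual punk console code – function and parameter names are mine):

```cpp
// Minimal portamento sketch: each tick, move the oscillator frequency a
// fraction of the remaining distance toward the target pitch. glideAmount
// runs 0..1; at 1.0 the note snaps instantly (no portamento), and smaller
// values give a longer, more acid-flavoured slide.
float glide(float currentHz, float targetHz, float glideAmount) {
    return currentHz + (targetHz - currentHz) * glideAmount;
}
```

in the sketch’s main loop this runs once per timer tick, with glideAmount mapped from the portamento pot’s analogRead value.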
lots of room for expansion and exploration – check back soon for more details.