Slub have a number of important livecoding transmissions coming up (including a performance at the Mozilla Festival!) so it’s time to work on fluxus/fluxa/scheme bricks. Here are some recording tests of a feature I’ve been wanting to use for a long time – temporal recursion.
These recordings were not changed by hand as they played, but started and left running in ‘generative audio’ mode in order to try and understand the technique. This method of sequencing is inspired by Impromptu, which uses a similar idea. In fluxa it’s all based around a single new function, “in”, which schedules a call to a function – and that function can be the one currently executing. This is different to the existing ‘timed tasks’ in fluxus, which are less precise for this kind of sequencing.
(define (tick time a)
  (play (+ time 3) (sample "ga.wav" (note (pick '(40 42 45) a))))
  (in time 0.22 tick (+ a 1)))

(in (time-now) 1 tick 0)
The “in” function takes the current time, the time to wait before the call, the function to call and its parameters. In the example above the argument “a” gets incremented each time, resulting in a sequence of notes being played. Recursion generally brings up thoughts of self similarity and fractal patterns – as in the graphical use of recursion in fluxus – but here it’s better to imagine a graph of function calls. Each function can branch to an arbitrary number of others, so limitations have to be put in place to stop the thing exploding with too many concurrent calls. What seems to happen (even with small function call graphs) is the appearance of high level time structures – state changes and shifts into different modes where different patterns of calls lock into sequence. You can hear this clearly in the second recording above, which alters itself halfway through.
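The core of the technique can be sketched outside fluxa. This is a minimal Python stand-in (the names `in_time`, `tick` and the note list are illustrative, not fluxa’s API): events sit in a queue ordered by logical time, and each callback can re-schedule itself relative to the time it was scheduled for, so timing never drifts the way it would with real-time sleeps.

```python
import heapq

# A sketch of temporal recursion: an event queue keyed on logical time,
# where each callback can re-schedule itself.

queue = []  # (time, name, function, args) entries, ordered by time
notes = []  # stand-in for audio output

def in_time(now, delay, fn, *args):
    """Schedule fn to run at logical time now + delay."""
    heapq.heappush(queue, (now + delay, fn.__name__, fn, args))

def tick(time, a):
    notes.append([40, 42, 45][a % 3])  # analogue of (pick '(40 42 45) a)
    if a < 8:                          # stop condition so the sketch terminates
        in_time(time, 0.22, tick, a + 1)

in_time(0.0, 1, tick, 0)
while queue:                           # drain the queue in time order
    time, _, fn, args = heapq.heappop(queue)
    fn(time, *args)

print(notes)
```

Because delays are added to the *scheduled* time rather than measured from “now”, the sequence stays metronomic however long each call takes – which is the property that makes this usable for audio sequencing.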
I’ve also experimented with visualising the call graph, with limited success, using this more complex example – the round nodes are functions, the boxes are parameter changes, and the labels on the connections are the branch conditions:
(define n '(23 30))
(set-scale '(1 1 2 1))

(define (za time a b)
  (play (+ time 3) (mul (mul (adsr 0 0.1 1 1) (pick '(0.4 0.1 1 0.4 1) a))
                        (sample "ga.wav" (note (- (modulo (- b a) 17)
                                                  (* (pick n a) 2))))) -0.5)
  (when (zero? (modulo a 16)) (in time 0.3 zm a b))
  (if (eq? (modulo a 14) 12)
      (in time 1.0 zoom a (- b 1))
      (in time (- 0.5 (* (modulo a 13) 0.03)) za (+ a 1) (+ b 2))))

(define (zm time a b)
  (play (+ time 3) (mul (mul (adsr 0 0.1 1 1) (pick '(0.1 0.4 1) a))
                        (sample "ga.wav" (note (+ (modulo b 5)
                                                  (/ (pick n a) 2))))) 0.5)
  (if (> a 12)
      (in time 1.0 za b a)
      (in time (pick '(1.3 1.5 0.5) a) zm (+ a 1) b)))

(define (zoom time a b)
  (play (+ time 3) (mul (mul (adsr 0 0.1 1 1) (pick '(0.1 0.2 0.3) a))
                        (sample "ga.wav" (note (+ (pick n a)
                                                  (* 3 (modulo a (+ 1 (modulo b 5)))))))))
  (if (> a 16)
      (in time 0.3 zm 0 (+ b 1))
      (in time (pick '(1.3 0.12) a) zoom (+ a 1) b)))

(in (time-now) 1 zoom 0 1)
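One way to get at such a visualisation is to log an edge every time one function schedules another, then hand the edges to Graphviz. Here is a hedged sketch of the idea in Python – the scheduler and the `za`/`zm` functions are simplified stand-ins with made-up branch conditions, not the fluxa patch above:

```python
import heapq
from collections import defaultdict

# Record the call graph of a temporally recursive patch: every time one
# function schedules another, note the edge, then dump Graphviz DOT text.

queue = []
edges = defaultdict(int)   # (caller, callee) -> number of times the branch was taken
current = [None]           # name of the function currently running

def in_time(now, delay, fn, *args):
    if current[0] is not None:
        edges[(current[0], fn.__name__)] += 1
    heapq.heappush(queue, (now + delay, fn.__name__, fn, args))

def za(time, a):
    if a % 4 == 3:
        in_time(time, 1.0, zm, a + 1)   # branch condition: a mod 4 = 3
    elif a < 12:
        in_time(time, 0.5, za, a + 1)   # self-recursion

def zm(time, a):
    if a < 16:
        in_time(time, 0.3, za, a + 1)

in_time(0.0, 1, za, 0)
while queue:
    time, name, fn, args = heapq.heappop(queue)
    current[0] = name
    fn(time, *args)

print("digraph calls {")
for (src, dst), count in sorted(edges.items()):
    print(f'  {src} -> {dst} [label="{count}"];')
print("}")
```

The resulting DOT graph shows functions as nodes and branch counts on the edges, which is roughly the structure the visualisation above is trying to capture.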
Thanks to Farrows Creative we have some great photos of the Live Notation performance with Alex McLean, Hester Reeve and me at the Arnolfini a few weeks ago. This was a completely unrehearsed combination of Hester Reeve’s live art and slub’s live coding. A score was made from rocks and stones, using their position and also drawing on them with brushes and water, made temporary with a heat gun. A selection of good branches from Alex’s garden provided a tripod for the camera, which allowed us to project the score along with a clock time marker; my code and Alex’s emacs were overlaid with a second projector for a multi-layered image of what we were all doing.
I could see the output from the camera (running using Gabor’s fluxus addon code) underneath a semi-transparent version of scheme bricks, and my original plan was to attempt to read the score in some symbolic way. Instead I found myself using more playful methods, dragging sections of code over particular stones – and switching to using them when Hester worked on the relevant one. Her movements also helped me break out from normal programming flow more than usual, reminding me of nearby unused bits of code, and I generally took a slower, more considered approach.
As I said in my previous post, this seems like an encouraging direction for livecoding to follow – it fits naturally and refreshingly with performance/live art. The impulse is to augment this kind of performance with further machine vision and tracking software, but perhaps, much like slub’s preference for listening to each other over complex networking, it’s interesting to concentrate on interpretation in a more open-ended manner, at least to begin with.
I was honoured to take part in the live notation unit’s event at the Arnolfini on Friday, and to perform with Alex McLean and Hester Reeve in the evening.
Live notation is a project exploring connections between Live Art and Live Coding, both art forms revolving around performance, but with very different cultures and backgrounds.
The day started with workshops. The first one, by Yuen Fong Ling, played with the power structures inherent in life drawing. We tried breaking some conventions: instead of everyone drawing the same model, one scenario involved arranging the easels in a line, where one person drew the model and everyone else copied the previous person in line. This ‘drawing machine’ resulted in an intriguing pictorial version of “Chinese whispers”. The second workshop involved programming choreography live via drawing and an overhead projector, firstly with workshop leader Kate Sicchio as the dancer, and then more and more livecoders joining in until the roles were reversed.
The performances consisted of a mix of live art and livecoding, and also served to demonstrate the breadth of approaches that these art forms represent – Wrongheaded performed a spectacular livecoding invasion of religious ritual, while Kate Sicchio followed beautiful instructions she’d received a couple of hours before, interpreting Nicholas Poussin’s painting ‘The Triumph of David’ using brightly coloured silks. Thor Magnusson unleashed a sub-bass-rumbling, agent-driven visual approach to livecoding with a very considered minimal performance. As an audience member, I think livecoding needs a dose of cross-fertilisation with related areas, especially ones outside of the computer music sphere – we can think more about our roles and the situation, and less about the mechanics. As a performer, I’m still processing (and waiting for photos) and will write a bit more on our performance in a few days.