A new version of scheme bricks is under way, planned to be tested out with slub at the Mozilla Fest Party, then taken across the Atlantic for some more livecoding action in Mexico City! New things include blocks with depth (cosmetic for the moment, but I plan to prototype some new ideas based on this), separately zoomable code blocks and, most importantly, a complete rewrite into functional R5RS Scheme for portability. It should now be relatively simple to get it running on Android via Nomadic, which uses TinyScheme.
With the new temporal recursion, the code produced is much less monolithic. The massively tall structures that resulted from plugging sequences together during long performances were a bit of an issue before, but splitting the code into a multitude of functions (which can be shrunk and put in the “background”) has been a far easier way of working so far.
Slub have a number of important livecoding transmissions coming up (including a performance at the Mozilla Festival!) so it’s time to work on fluxus/fluxa/scheme bricks. Here are some recording tests of a feature I’ve been wanting to use for a long time – temporal recursion.
These recordings were not changed by hand as they played, but started and left running in ‘generative audio’ mode in order to try and understand the technique. This method of sequencing is inspired by Impromptu, which uses a similar idea. In fluxa it’s all based around a single new function, “in”, which schedules a call to a function – and that function can be the one currently executing. (This is different from the existing ‘timed tasks’ in fluxus, which are less precise for this kind of sequencing.)
(define (tick time a)
  (play (+ time 3) (sample "ga.wav" (note (pick '(40 42 45) a))))
  (in time 0.22 tick (+ a 1)))

(in (time-now) 1 tick 0)
The “in” function takes the current time, the time to wait before the call, the function to call and its parameters. In the example above the argument “a” gets incremented each time, resulting in a sequence of notes being played. Recursion generally brings up thoughts of self similarity and fractal patterns – as in the graphical use of recursion in fluxus – but here it’s better to imagine a graph of function calls. Each function can branch to an arbitrary number of others, so limitations have to be put in place to stop the thing exploding with too many concurrent calls. What seems to happen (even with small function call graphs) is the appearance of high level time structures – state changes and shifts into different modes where different patterns of calls lock into sequence. You can hear this clearly in the second recording above, which alters itself half way through.
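To make the scheduling model concrete, here is a minimal sketch of “in”-style temporal recursion as an event queue, in Python rather than Scheme so it can run outside fluxa. The Scheduler class is a hypothetical illustration of the semantics, not fluxa’s actual implementation; play/sample/note are replaced by recording the pitch that would sound, and fluxa’s pick is modelled as a simple cyclic choice.

```python
import heapq

class Scheduler:
    """Hypothetical model of "in"-style scheduling: an event queue keyed on
    logical time, where each event is a function call that may schedule
    further calls to itself or to other functions."""
    def __init__(self):
        self.queue = []
        self.counter = 0  # tie-breaker so heapq never compares functions

    def schedule(self, time, delay, fn, *args):
        # like (in time delay fn args...): call fn at (time + delay),
        # passing the new logical time as its first argument
        heapq.heappush(self.queue, (time + delay, self.counter, fn, args))
        self.counter += 1

    def run(self, until):
        # process events in time order; each call may push new events
        while self.queue and self.queue[0][0] <= until:
            when, _, fn, args = heapq.heappop(self.queue)
            fn(when, *args)

sched = Scheduler()
played = []

def pick(lst, n):
    # cyclic stand-in for fluxa's pick
    return lst[n % len(lst)]

def tick(time, a):
    # stands in for (play (+ time 3) (sample "ga.wav" (note ...)))
    played.append(pick([40, 42, 45], a))
    sched.schedule(time, 0.22, tick, a + 1)  # the temporal recursion

sched.schedule(0.0, 1.0, tick, 0)  # like (in (time-now) 1 tick 0)
sched.run(until=2.2)
print(played)  # → [40, 42, 45, 40, 42, 45]
```

Because each call schedules its successor relative to its own logical time rather than wall-clock time, timing drift doesn’t accumulate, which is what makes this more precise than generic timed tasks.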
I’ve also experimented with visualising the call graph, with limited success on this more complex example. The round nodes are functions, the boxes are parameter changes, and the labels on the connections are the branch conditions:
(define n '(23 30))
(set-scale '(1 1 2 1))

(define (za time a b)
  (play (+ time 3)
        (mul (mul (adsr 0 0.1 1 1) (pick '(0.4 0.1 1 0.4 1) a))
             (sample "ga.wav" (note (- (modulo (- b a) 17) (* (pick n a) 2)))))
        -0.5)
  (when (zero? (modulo a 16)) (in time 0.3 zm a b))
  (if (eq? (modulo a 14) 12)
      (in time 1.0 zoom a (- b 1))
      (in time (- 0.5 (* (modulo a 13) 0.03)) za (+ a 1) (+ b 2))))

(define (zm time a b)
  (play (+ time 3)
        (mul (mul (adsr 0 0.1 1 1) (pick '(0.1 0.4 1) a))
             (sample "ga.wav" (note (+ (modulo b 5) (/ (pick n a) 2)))))
        0.5)
  (if (> a 12)
      (in time 1.0 za b a)
      (in time (pick '(1.3 1.5 0.5) a) zm (+ a 1) b)))

(define (zoom time a b)
  (play (+ time 3)
        (mul (mul (adsr 0 0.1 1 1) (pick '(0.1 0.2 0.3) a))
             (sample "ga.wav" (note (+ (pick n a) (* 3 (modulo a (+ 1 (modulo b 5)))))))))
  (if (> a 16)
      (in time 0.3 zm 0 (+ b 1))
      (in time (pick '(1.3 0.12) a) zoom (+ a 1) b)))

(in (time-now) 1 zoom 0 1)
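Since the branch conditions alone decide which function gets scheduled next, the shape of the call graph can be traced without any audio at all. The sketch below is a hypothetical single-threaded trace in Python, not the fluxa code: timing, play calls and the concurrent zm fork in za’s (when ...) clause are left out, and a “depth” counter stands in for the length of the performance.

```python
# Trace only the branch logic of za/zm/zoom, recording each hand-over
# between functions as an edge in the call graph.
edges = []

def za(a, b, depth):
    if depth == 0:
        return
    if a % 14 == 12:                 # (eq? (modulo a 14) 12)
        edges.append(("za", "zoom"))
        zoom(a, b - 1, depth - 1)
    else:
        edges.append(("za", "za"))
        za(a + 1, b + 2, depth - 1)

def zm(a, b, depth):
    if depth == 0:
        return
    if a > 12:                       # (> a 12)
        edges.append(("zm", "za"))
        za(b, a, depth - 1)
    else:
        edges.append(("zm", "zm"))
        zm(a + 1, b, depth - 1)

def zoom(a, b, depth):
    if depth == 0:
        return
    if a > 16:                       # (> a 16)
        edges.append(("zoom", "zm"))
        zm(0, b + 1, depth - 1)
    else:
        edges.append(("zoom", "zoom"))
        zoom(a + 1, b, depth - 1)

zoom(0, 1, 60)                       # like (in (time-now) 1 zoom 0 1)
print(sorted(set(edges)))            # all six transitions occur
```

Even this stripped-down trace shows the mode shifts: long self-calling runs of one function punctuated by a hand-over to another, which is the kind of high level time structure audible in the recordings.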
Thanks to Farrows Creative we have some great photos of the livenotations performance with Alex McLean, Hester Reeve and me at the Arnolfini a few weeks ago. This was a completely unrehearsed combination of Hester Reeve’s live art and slub’s live coding. A score was made from rocks and stones, using their position and also drawing on them with brushes and water, made temporary with a heat gun. A selection of good branches from Alex’s garden provided a tripod for the camera, which allowed us to project the score along with a clock time marker; my code and Alex’s emacs were overlaid with a second projector for a multi-layer image of what we were all doing.
I could see the output from the camera (running via Gabor’s fluxus addon code) underneath a semi-transparent version of scheme bricks, and my original plan was to attempt to read the score in some symbolic way. Instead I found myself using more playful methods, dragging sections of code over particular stones and switching to using them when Hester worked on the relevant one. Her movements also helped me break out of my normal programming flow more than usual, reminding me of nearby unused bits of code, and I generally took a slower, more considered approach.
As I said in my previous post, this seems like an encouraging direction for livecoding to follow; it fits naturally with performance and live art, and feels refreshing. The impulse is to augment this kind of performance with further machine vision and tracking software, but perhaps, much like slub’s preference for listening to each other over complex networking, it’s more interesting to keep the interpretation open ended, at least to begin with.