Tag Archives: Livecoding

Dagstuhl – Collaboration and learning through live coding

Dagstuhl seminars are week-long, free-form meetings between different disciplines centred around computer science. The location is a specially designed complex in the German countryside; activities include long walks in the surrounding hills, and the facilities include a well-equipped and beautiful music room and a well-stocked wine cellar.

Our seminar was called ‘Collaboration and learning through live coding’, organised by Alan Blackwell, Alex McLean, James Noble and Julian Rohrhuber, and included people from the fields of software engineering and computer science education, as well as plenty of practising livecoders and multidisciplinary researchers.


Discussion was wide-ranging and intense at times, and the first job was to sufficiently explain what livecoding actually was – which turned out to require performances in different settings:

1. Explanatory demo style livecoding: talking through it as you do it.
2. Meeting room coffee break gigs: with a closely attentive audience.
3. The music room: relaxed evening events with beer and wine.

So Dagstuhl’s music room was immediately useful in providing a more ‘normal’ livecoding situation. It was of course more stressful than usual, knowing that you were being critically appraised in this way by world experts in related fields! However it paid off hugely, as we had some wonderful interpretations from these different viewpoints.

One of the most important for me was the framing of livecoding in terms of the roots of software engineering. Robert Biddle, Professor of Human-Computer Interaction at Carleton University, put it into context for us. In 1968 NATO held a ‘Software Components Conference’ in order to tackle a perceived gap in programming expertise relative to the Soviet Union.

[photo: the ‘software components’ lecture]

This conference (attended by many of the ‘big names’ of programming in later years) led to many patterns of thought that still pervade the design of computers and software – a tendency towards deeply hierarchical command structures in order to keep control of the arising complexity, and a distrust of more ad hoc solutions or any hint of making things up as we go along. In more recent times we can see a fight against this in the rise of Agile programming methodologies, and it was interesting to look at livecoding as part of this story too. For example, it provides a way to accept and demonstrate the ‘power to think and feel’ that programming gives us as humans. The big question is accessibility: in a ubiquitously computational world, how can this reach wider groups of people?


Ellen Harlizius-Klück works with three different domains simultaneously – investigating the history of mathematics via weaving in ancient Greece. Her work includes livecoding, using weaving as a performance tool and demonstrating the algorithmic potential of looms and combinations of patterns. This practice exposes the hidden shared history of textiles and computation, and it made a lot of sense to me: at the lowest level, the operations of computers don’t deal in singular 0s and 1s, as is often assumed, but in transformations of whole patterns of bits.
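
To give my own small illustration of this (mine, not Ellen’s): even in Scheme, a single operation transforms a whole word of bits in one go, a bit like a loom operating on an entire row of threads:

; a single bitwise operation acts on the whole pattern at once,
; rather than on individual 0s and 1s
(bitwise-xor #b10101010 #b11111111) ; => 85 (#b01010101) - every bit flips together
(arithmetic-shift #b00001111 2)     ; => 60 (#b00111100) - the whole pattern shifts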

Mark Guzdial was examining livecoding through the lens of education, specifically teaching computer science. The fact that so many of us involved in the field are also teaching in schools – and already looking at ways of bringing livecoding into this area – is noteworthy, as is the educational potential of doing livecoding in nightclub-type environments. Here it works more on the level of showing people that humans make code – that it’s not a matter of pure mathematical black boxes – which can be the groundbreaking realisation for a lot of people.


It was interesting to me to concentrate on livecoding as a specifically musical practice (rather than also a visual one), as there is something important about perceiving the process with a different sense from the one you use to describe it. Julian Rohrhuber pointed out that “you can use sound in order to hear what you are doing” – the sound is the temporal execution of the code, and can be a close representation of what the computer is actually doing. This time-based approach is also part of how livecoding works against the notion that producing an ‘end result’ is important. Juan A. Romero said that “if you’re livecoding, you’re not just coding the final note” – i.e. the process of coding is the artform.


In a school teaching situation sound is also powerful, as described by Sam Aaron, livecoder and creator of Sonic Pi. A child getting a music program to work for the first time in a classroom is immediately obvious to everyone else, as it is broadcast as sound – inspiring a bit of competition and ending up as a naturally collaborative learning experience.

It’s impossible to cover all the discussions we had – these are just the ones I happened to get down in my notebook – but it was a great opportunity to examine what livecoding is about now in relation to other practices, where it came from, and where it might go in the future.


London Algorave at nnnnn

In order to get ourselves prepared for the Dagstuhl livecoding seminar (more on that later), we kicked off with a London Algorave at nnnnn, Ryan Jordan’s noise research laboratory in deepest Hackney. Slub had one of our better performances, which was recorded – watch this space.

*UPDATE*

[photo] Larger components make larger sounds.

[photo] Massive synth washes and brutal beats from the rock star livecoders Meta-Ex.

[photo] Meta-Ex close up.

[photo] Yee-King’s brand new visual acid generating machine reconfigured our minds.

Slub at the Deershed festival

Deershed is a music festival designed to accommodate families, with lots of activities for children. Part of this year’s festival was a Machines Tent, including Lego robot building, Meccano constructions, 3D printing and computer games.

Slub’s daily routine in the Machines Tent started with setting up the Al Jazari gamepad livecoding installation, followed by a couple of hours with Martyn Eggleton teaching Scratch programming on an amazing quad Raspberry Pi machine (screens, processors and keyboards all built into a welded cube).


At some point we would switch to Minecraft, trying some experiments livecoding the LAN game world with Martyn’s system for accessing the Minecraft API from Waterbear, a visual programming language that takes a similar blocks approach to Scratch and Scheme Bricks.

During the afternoons Alex and I could try some music livecoding experiments. This was a great environment for playful audience-participatory performances; with families continually passing through the tent, I could use a dancemat to trigger synths in fluxus while Alex livecoded music designed to encourage people to jump up and down.


One of the most interesting things for me was being able to see how lots of children (who mostly didn’t know each other) collaborate and self-organise in a LAN game; there was quite a pattern to it across all the groups:

  1. Mess around with Minecraft as usual (make some blocks, start building a house).
  2. Find something built by someone else, destroy a few bricks.
  3. Snap out of the game to notice that the other kids are complaining.
  4. Realise that there are other people in the world – and they are sat around them!
  5. Attempt to fix the damage.

At this point other people would join in to help fix things, after which some kind of understanding would be reached between them to respect each other’s creations. This has all really inspired me to work on Al Jazari 2, which combines a lot of these ideas.


Deershed Festival, Sonic Bike Lab, Fascinate Festival

Preparations for a busy summer – new Al Jazari installation gamepads on the production line:

[photo: gamepads on the production line]

This weekend Alex and I are off to the Deershed Festival in Yorkshire to bring slub technology to the younger generation. We’ll be livecoding algorave, teaching Scratch programming on Raspberry Pis and running an Al Jazari installation in between. Then it’s onwards to London for a Sonic Bike Lab with Kaffe Matthews, where we’re going to investigate the future of sonic bike technology and theory – including, possibly, bike-sensor-driven synthesis and on-the-road post-apocalyptic mesh networking.

At the end of August I’m participating in my local media arts festival – Fascinate in Falmouth – where I’ll be dispensing a dose of algorave and probably even more musical robot techno.

Visual livecoding environments: big screenshots

Some decent-sized screenshots of al jazari and scheme bricks, rendered with fluxus’s tiled frame dump command. This set includes some satisfyingly glitchy al jazari shots – I’m not sure what was causing this; I initially assumed it was the orthographic projection, but the same artefacts occurred on the perspective first-person robot views, so it needs further investigation.

[screenshots: al jazari (ajbig, ajbig2, ajbig5, ajbig6) and scheme bricks (schemebricks, schemebricks2, schemebricks3)]

Teaching at the Düsseldorf Institute for Music and Media

Last week I was kindly invited by Julian Rohrhuber to give a couple of talks and teach a livecoding workshop alongside Jan-Kees van Kampen at the Düsseldorf Institute for Music and Media. Jan-Kees was demoing /mode +v noise, a SuperCollider chat bot installation using IRC, so it was the perfect opportunity to playtest the work-in-progress slubworld project, including the plutonian botzlang language. It also proved a good chance to try using a Raspberry Pi as a LAN game server.


There wasn’t enough time to get deeply into botzlang, but we were able to test, through a good sound system, the text-to-sound code that Alex has been working on, along with the projection of the game world that visualises what is happening, based on the Naked on Pluto library installation:

[screenshot: the game world projection]

The Raspberry Pi was useful as a dedicated server I could set up beforehand and easily plug into the institute’s wireless router. We didn’t need to worry about internet connectivity, and everyone could take part by pointing a browser at the right IP address. With access to the “superuser” commands from the Naked on Pluto game, the participants had quite a bit of fun making objects and dressing each other up in different items, later making and programming their own bots to say things that were sonified through the speakers.

Plutonian Botzlang

Plutonian Botzlang is a new language I’m working on for a commission we’ve had from Arnolfini and Kunsthal Aarhus. The idea is to make the Naked on Pluto game bots programmable in a way that allows them to be scripted from inside the game interface, able to inspect all the objects around them and carry out actions on the world like a normal player. We can then strip the game down and make it into an online multiplayer musical livecoding installation.

Bots can be fed code line by line by talking to them; they can be started, stopped, and pinged to check their status. I toyed with the idea of making a one-line programming language with lots of semi-cryptic punctuation, but opted instead for something a bit simpler and longer, which requires line numbers.

Here is an example program that looks in the current node (or room) for Bells, picks them up if found, then says their descriptions. Each time it loops it might drop the Bell and walk to a new location. This results in bots that walk around the game world playing bells.


10  for e in node.entities
20     if e.name is Bell
30        pickup e.name
40     end
50  end
60  for e in this.contents
70     say e.desc
80  end
90  if random lessthan 5
100    drop Bell
110    walk
120 end
130 goto 10

Here is a screenshot of the modified version of the game with a bot being programmed:

[screenshot: a bot being programmed in the game interface]

A fluxus workshop plan

I’ve been getting some emails asking for course notes for fluxus workshops. I don’t really have anything as structured as that, but I thought it would be good to document something here. I usually follow the first part of the fluxus manual pretty closely, trying to flip between visually playful parts and programming concepts. I’ve taught this to teenagers, unemployed people, masters students, professors and artists – it’s very much aimed at first-time programmers. I’m also less interested in churning out fluxus users, and more motivated by using it as an introduction to algorithms and programming in general. Generally it’s good to start with an introduction to livecoding – where fluxus comes from, who uses it and what for. I’ve also started discussing the political implications of software and algorithmic literacy too.

So first things first, an introduction to a few key bindings (ctrl-f fullscreen/ctrl-w windowed), then in the console:

  1. Scheme as calculator – parentheses and nesting simple expressions.
  2. Naming values with define.
  3. Naming processes with define to make procedures.
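
A first console session covering those three steps might look something like this (a sketch – the names and numbers are arbitrary):

(+ 1 (* 2 3))              ; scheme as calculator: nested expressions => 7
(define size 2)            ; naming a value
(define (twice x) (* x 2)) ; naming a process: a procedure
(twice size)               ; => 4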

Time to make some graphics, so switch to a workspace with ctrl-1:

  1. A new procedure to draw a cube.
  2. Calling this every frame.
  3. Mouse camera controls, move around the cube.
  4. Different built-in shapes: drawing a sphere, cylinder, torus.
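
In code, this part comes down to something close to the fluxus manual’s first example:

(define (render) ; a new procedure that draws a cube
    (draw-cube)) ; swap for draw-sphere, draw-cylinder or draw-torus

(every-frame (render)) ; call it every frame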

Then dive into changing the graphics state, so:

  1. Colours.
  2. Transforms.
  3. Textures.
  4. Multiple objects, graphics state persistent like changing a “pen colour”.
  5. Transform state is applicative (scale multiplies etc).
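
A sketch of how that persistent, applicative state comes across in practice (the particular shapes and colours are arbitrary):

(define (render)
    (colour (vector 1 0 0))      ; like changing the 'pen colour' - persists
    (draw-cube)
    (translate (vector 2 0 0))   ; transform state accumulates
    (colour (vector 0 1 0))
    (scale (vector 0.5 0.5 0.5)) ; scale multiplies: applicative state
    (draw-sphere))               ; a smaller, green sphere to the right

(every-frame (render))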

Then tackle recursion, in order to reduce the size of the code, and make much more complex objects possible.

  1. A row of cubes.
  2. Make it bend with small rotation.
  3. Animation with (time).
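
A sketch of the row, the bend and the animation together:

(define (row n)
    (when (not (zero? n))
        (draw-cube)
        (translate (vector 1.1 0 0))             ; move along for the next cube
        (rotate (vector 0 0 (* 5 (sin (time))))) ; a small rotation - the row bends and animates
        (row (- n 1))))                          ; recurse to draw the rest

(every-frame (row 10))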

At this point they know enough to be able to play with what they’ve learnt for a while, making procedural patterns and animated shapes.

After this it’s quite easy to explain how to add another recursive call to create tree recursion, scoping state using (with-state), and it all goes fractal crazy.
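
This is just the row example with a second recursive call, each branch wrapped in (with-state) so the siblings don’t affect each other – the Planet Fluxus post below uses the same shape of program:

(define (tree n)
    (when (not (zero? n))
        (draw-cube)
        (translate (vector 0 1 0))
        (scale (vector 0.8 0.8 0.8))
        (with-state                  ; scope the state for this branch
            (rotate (vector 0 0 30))
            (tree (- n 1)))
        (with-state                  ; the second branch starts from the same state
            (rotate (vector 0 0 -30))
            (tree (- n 1)))))

(every-frame (tree 6))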

This is generally enough for a 2 hour taster workshop. If there is more time, then I go into the scene graph and explain how primitives are built from points and faces, show how texture coordinates work, and so on. The physics system is also great to show, as it’s simple to get very different kinds of results.

Planet Fluxus

Fluxus now runs in a browser using WebGL. Not much is working yet – (draw-cube), basic transforms, colours and textures. I’ve also built a small site in Django so people can share (or perhaps more likely, corrupt) each other’s scripts. I was also much inspired by seeing a load of great livecoding at the algoraves by Davide Della Casa and Guy John using livecodelab.

[screenshots: fluxus scripts running in WebGL]

This is a spin-off from the work I did a few weeks ago on a silly Scheme to JavaScript compiler. It’s still pretty silly, but in order to explain better, first we take a Scheme program like this:

;; a tree
(define (render n)
    (when (not (zero? n))
        (translate (vector 0 1 0))
        (with-state
            (scale (vector 0.1 1 0.1))
            (draw-cube))
        (scale (vector 0.8 0.8 0.8))
        (with-state
            (rotate (vector 0 0 25))
            (render (- n 1)))
        (with-state
            (rotate (vector 0 0 -25))
            (render (- n 1)))))

(every-frame 
    (with-state
        (translate (vector 0 -3 0))
        (render 8)))

Then we parse it straight into JSON, so lists become JavaScript arrays and everything else is a string, also doing minor things like switching “-” to “_” (and “?” to “_q”) in identifiers:

[["define",["render","n"],
    ["when",["not",["zero_q","n"]],
        ["translate",["vector","0","1","0"]],
        ["with_state",
            ["scale",["vector","0.1","1","0.1"]],
            ["draw_cube"]],
        ["scale",["vector","0.8","0.8","0.8"]],
        ["with_state",
            ["rotate",["vector","0","0","25"]],
            ["render",["-","n","1"]]],
        ["with_state",
            ["rotate",["vector","0","0","-25"]],
            ["render",["-","n","1"]]]]],

["every_frame",
    ["with_state",
    ["translate",["vector","0","-3","0"]],
    ["render","8"]]]]

Next we do some syntax expansion, so functions become full lambda definitions, and custom fluxus syntax forms like (with-state) get turned into lets and begins wrapped with state (push) and (pop) calls. These transformations are actually written in Scheme (not quite as define-macros yet), and are compiled at an earlier stage. The code now starts to increase in size:

[["define","render",
    ["lambda",["n"],
        ["when",["not",["zero_q","n"]],
            ["translate",["vector","0","1","0"]],
            ["begin",
                ["push"],
                ["let",[["r",["begin",
                        ["scale",["vector","0.1","1","0.1"]],
                        ["draw_cube"]]]],
                    ["pop"],"r"]],
            ["scale",["vector","0.8","0.8","0.8"]],
            ["begin",
                ["push"],
                ["let",[["r",["begin",
                        ["rotate",["vector","0","0","25"]],
                        ["render",["-","n","1"]]]]],
                    ["pop"],"r"]],
            ["begin",
                ["push"],
                ["let",[["r",["begin",
                        ["rotate",["vector","0","0","-25"]],
                        ["render",["-","n","1"]]]]],
                ["pop"],"r"]]]]],

["every_frame_impl",
    ["lambda",[],
        [["begin",
            ["push"],
            ["let",[["r",["begin",
                    ["translate",["vector","0","-3","0"]],
                    ["render","8"]]]],
            ["pop"],"r"]]]]]

Then, finally, we convert this into a bunch of JavaScript closures. It’s pretty hard to unpick what’s going on at this point; I’m sure there is quite a bit of optimisation possible, though it does seem to work quite well:

var render = function (n) {
    if (!(zero_q(n))) {
        return (function () {
            translate(vector(0,1,0));
            (function () {
                push()
                return (function (r) {
                    pop()
                    return r
                }((function () {
                    scale(vector(0.1,1,0.1))
                    return draw_cube()
                })()))})();
        scale(vector(0.8,0.8,0.8));
        (function () {
            push()
            return (function (r) {
                pop()
                return r
            }((function () {
                rotate(vector(0,0,25))
                return render((n - 1))
            })()))})()
        return (function () {
            push()
            return (function (r) {
                pop()
                return r
            }((function () {
                rotate(vector(0,0,-25))
                return render((n - 1))
            })()))})()})()}};

every_frame_impl(function () {
    return (function () {
        push()
        return (function (r) {
            pop()
            return r
        }((function () {
            translate(vector(0,-3,0))
            return render(8)
        })()))})()})

Then all that’s needed are definitions for all the fluxus 3D graphics calls. The great thing is that these are also written in Scheme, right down to the low-level WebGL stuff, so the only JavaScript code needed is half of the compiler (eventually this can also be replaced). I was quite surprised at how easy this was, although it was greatly helped by the similarity of the two languages.