Programming with gamepads

Introduction

In recent years "live coding" has emerged as a practice of improvised musical and visual performance. Watching programming seems an unlikely thing to do whilst listening to music or watching a performance, but it is proving popular as a way of reconnecting the computer performer with their audience. Game interfaces can also provide flexible new ways of interacting with software, and the rich metaphors, techniques and hardware developed for game playing can make for more exciting visual performances. Two new live coding languages, "betablocker" and "al-jazari", have been developed to bring together these ideas as a way of making live coding more accessible and moving it away from the realms of traditional programming.

Livecoding

Live coding is generally understood to be the writing of code in front of an audience, and has partly come about as a reaction to existing laptop performance, which has a tendency to hide the performer's actions away, leading to a disconnection between performer and audience. Such a disconnection is readily apparent in contrast to performances with traditional instruments. For this reason, the projection of screens during performances is an important aspect of livecoding. It must be stressed that, although programming code is projected, it is considered enough that the audience understand that something is being created in front of them, and relate changes in the output to changes in the structure of the visible code - a literal understanding of the code text is not important.

The History of Livecoding

The essence of live coding is the writing and modification of rules at the same time as those rules are being carried out. This doesn't necessarily have to be realised using a computer, and some offshoots of live coding research are dealing with the possibilities of rule based choreography acted out by human participants capable of rewriting their own rules [Nick Collins, Evan...]. Some of the precedent for livecoding comes from experiments of this nature in the 60s and 70s (and probably long before that), but the first recorded live coding on a computer was done by Ron Kuivila in 1985 at STEIM. Another group livecoding at this time were the Hub [reference], who programmed Forth during live performances and encouraged audience members to look at what they were doing as part of the performance. TOPLAP (see below) is always looking for early references to live coding performance, so let them know if you know of any similar work.

We now have to jump nearly 20 years to find the next forms of live coding, as computer technology enabled the rise of portable laptops with CPUs powerful enough to process audio or video. The audio synthesiser software "SuperCollider" brought a high level language with sound processing capability to the fore, and gigs around 2000 by James McCartney, Julian Rohrhuber, Nick Collins and Fabrice Mogini started to incorporate elements of code improvisation. At the same time, London based group "slub" were introducing elements of live coding using home made, terminal based, networked software for collaborative music making. On the other side of the Atlantic, Ge Wang and Perry Cook were developing "ChucK", a highly malleable language and environment for "on the fly" audio and graphics programming.
TOPLAP

According to the official histories, at 1am on Sunday 15th February 2004, in a smoky Hamburg bar, some members of this embryonic live coding community formed TOPLAP (a Temporary Organisation for the Promotion of Live Algorithm Programming) in order to promote and cement the ideas of live coding. Later that day, on a Ryanair transit bus from Hamburg to Lübeck, the TOPLAP manifesto was born:

We the digitally oversigned demand:

* Give us access to the performer's mind, to the whole human instrument.
* Obscurantism is dangerous. Show us your screens.
* Programs are instruments that can change themselves.
* The program is to be transcended - Language is the way.
* Code should be seen as well as heard, underlying algorithms viewed as well as their visual outcome.
* Live coding is not about tools. Algorithms are thoughts. Chainsaws are tools. That's why algorithms are sometimes harder to notice than chainsaws.

We recognise continuums of interaction and profundity, but prefer:

* Insight into algorithms
* The skillful extemporisation of algorithm as an impressive display of mental dexterity
* No backup (minidisc, DVD, safety net computer)

We acknowledge that:

* It is not necessary for a lay audience to understand the code to appreciate it, much as it is not necessary to know how to play guitar in order to appreciate watching a guitar performance.
* Live coding may be accompanied by an impressive display of manual dexterity and the glorification of the typing interface.
* Performance involves continuums of interaction, covering perhaps the scope of controls with respect to the parameter space of the artwork, or gestural content, particularly directness of expressive detail. Whilst the traditional haptic rate timing deviations of expressivity in instrumental music are not approximated in code, why repeat the past? No doubt the writing of code and expression of thought will develop its own nuances and customs.

Performances and events closely meeting these manifesto conditions may apply for TOPLAP approval and seal. A website and a mailing list were founded, and the fledgling community of live coders began to grow. Live coding had now become more clearly defined, with some core values.

Programming as thought process

One of the interesting elements to have arisen from the TOPLAP manifesto is contained within the lines:

"The program is to be transcended - Language is the way."

and

"Live coding is not about tools. Algorithms are thoughts. Chainsaws are tools. That's why algorithms are sometimes harder to notice than chainsaws."

These draw interesting parallels with Abelson & Sussman's famous textbook, "The Structure and Interpretation of Computer Programs":

"Programs must be written for people to read, and only incidentally for machines to execute."

This highlights an often misunderstood facet of programming: that it doesn't directly concern computers at all, but is mainly used as a communication tool for the programmer to understand what they are constructing, in order to debug and reuse their work later - or for other programmers to pick up, use and understand. Programming languages also communicate something more subtle, which is a way of thinking, or a space for solving problems. Different programming languages offer different philosophies, or solutions to the problem of solving problems, with broad categories like "object oriented", "functional", "imperative" or "declarative", and many sub categories and styles making up the range of languages available today.
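To make this concrete, here is a small illustration in Scheme (the language used by fluxus, described below): the same task - summing a list of numbers - phrased first imperatively, as steps that modify state, then functionally, as a description of the result. The function names are invented for the example.

; imperative style: loop over the list, mutating a running total
(define (sum-imperative lst)
  (let ((total 0))
    (for-each (lambda (n) (set! total (+ total n))) lst)
    total))

; functional style: the sum is the list folded together with +
(define (sum-functional lst)
  (foldl + 0 lst))

(sum-imperative '(1 2 3)) ; => 6
(sum-functional '(1 2 3)) ; => 6

Both produce the same answer, but they embody different ways of thinking about the problem.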
Such differences go some way to explaining why programmers get very attached to languages they know well - after the many years required to learn a language thoroughly, they can feel threatened by being made to think about problems in a different way.

Livecoding languages are generally "high level languages". These languages closely exhibit Abelson & Sussman's observation, as they are designed to be easy for a human to use rather than for a computer to run. Due to the nature of livecoding, these languages also tend to be run in specially designed environments, or editing applications, which allow the code to be running at all times, with edits to the code being incorporated into the running program in various ways (so as not to interrupt its flow). Here are some example environments and the languages they use:

Environment    Medium             Language
SuperCollider  Primarily music    Specially designed language based on Smalltalk/C
Impromptu      Primarily music    Scheme
feedback.pl    Music              Perl
ChucK          Music and visuals  Specially designed language "ChucK"
Fluxus         Primarily visuals  Scheme
Pure Events    Music              Specially designed language

As shown above, most live coding environments are not wholly restricted to either audio or visuals, and this exposes an interesting feature of live coding - it tends to blur the distinctions between different media. A good musical livecoder will find it easy to transfer their ability to write music generating code to moving shapes and colours. In the same way, a visual live coder will be tempted to try their hand at triggering sounds instead of visual events. Indeed, the exact same code can be used to control the audio and the visuals simultaneously, leading to exciting new possibilities for integrating and controlling the two in an improvised manner during a performance.

Malleability

The Structure and Interpretation of Computer Programs also contains this quote from John Locke's "An Essay Concerning Human Understanding" (1690):

"The acts of the mind, wherein it exerts its power over simple ideas, are chiefly these three: 1. Combining several simple ideas into one compound one, and thus all complex ideas are made. 2. The second is bringing two ideas, whether simple or complex, together, and setting them by one another so as to take a view of them at once, without uniting them into one, by which it gets all its ideas of relations. 3. The third is separating them from all other ideas that accompany them in their real existence: this is called abstraction, and thus all its general ideas are made."

One of the core properties of live coding - really the thing which makes it dynamic, able to keep up with the speed at which the programmer is thinking, and therefore performative - is that it is easy to make sweeping changes to the output. This is about the expressivity of a language. Programming languages allow you to "abstract" ideas into components which are shared between different sections of the code. Changes to the core components are reflected everywhere simultaneously, amplifying the effect of the programmer in ways they can control. The simplest example of this is the use of a global variable. A variable is a way of naming a value: for instance, if a size is required, which may be 3, the programmer can in one place bind the value 3 to the name "size". The programmer has abstracted 3 into the concept of size and given it a meaning.
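As a minimal sketch in Scheme, using the fluxus commands seen later (the names "box" and "wall" are invented for illustration):

(define size 3)        ; one place where "size" is given its value

(define (box)          ; the abstraction used here...
  (scale size size size)
  (draw-cube))

(define (wall)         ; ...and again here
  (scale size size 0.1)
  (draw-cube))

; later, editing the single definition - (define size 5) -
; changes every place the abstraction is used, all at once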
The size variable may be referenced in many places, so changing its value at the one definition later on will be reflected everywhere it has been used. This kind of abstraction can be applied to processes too, where sequences of operations can be named and reused in many places. Once we start layering these abstractions on top of each other, we get a description of a process which is very easy to control and change in predictable and expressive ways. This is how a live coder can keep the pace needed for an engaging performance.

Fluxus

The livecoding languages/environments described later on are written in fluxus. This is potentially confusing, as fluxus is a livecoding environment in its own right. As mentioned above, fluxus is programmed in Scheme, a language dating back to 1975, when it was invented by Gerald J. Sussman and Guy L. Steele Jr. Scheme is a dialect of "Lisp", one of the first programming languages to be invented, dating back to the 1950s.

Fluxus is, in essence, a 3D graphics engine, similar to one you would find inside a 3D computer game. It "extends" the Scheme language with commands for describing three dimensional scenes to a computer's graphics card, and contains features normally found in modern computer games, such as a selection of geometry types, the ability to control surface appearance with textures or hardware shaders, and rigid body dynamics for physics simulations. Fluxus also includes functionality for audio input (for controlling animations with sound) and network IO using the popular Open Sound Control format. It is free software, available under the GPLv2 licence for Linux and OSX. Version 0.1 was released on Tuesday, August 5th 2003 at 17:29.

Fluxus also comes with a livecoding interpreter, utilising PLT Scheme's mzscheme, allowing you to see the code "float" above the graphics as you create them. This makes fluxus useful as a rapid prototyping tool for learning or playing with 3D animation, or just as a fast engine for making use of modern graphics card GPUs. The main focus of fluxus's interface is live coding though, and Scheme is a good candidate for a live coding language. It has a very concise and elegant style, meaning that it takes very little effort (in terms of key presses or amount of text) to come up with interesting results.

(define (render)                ; makes a function called "render"
  (scale (gh 1) (gh 3) (gh 9))  ; scale using harmonic levels from the sound input
  (draw-cube))                  ; draw a cube - which takes on the previous scale

(every-frame (render))          ; every frame, call render

An example fluxus script to squash and stretch a cube in time to the music.
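Because every-frame evaluates (render) afresh each frame, redefining render during a performance changes the output immediately. A sketch of how such a live edit might look (colour and draw-sphere are fluxus commands; this particular variation is invented for illustration):

(define (render)                ; re-evaluated mid-performance...
  (colour (vector (gh 1) 0 1))  ; ...now tinting with a harmonic level
  (draw-sphere))                ; ...and swapping the cube for a sphere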
Keyboards

The TOPLAP manifesto talks of the "glorification of the typing interface", and clearly, for a programmer, text editors and keyboards are of fundamental importance. However, keyboards and other peripheral devices commonly used by programmers are responsible for physical ailments such as RSI and carpal tunnel syndrome. Should we really be glorifying something which can do us harm in this way - or should we be looking for something different? Visually programmed, largely mouse controlled languages such as Pure Data and MAX/MSP are valuable to people who are not so attached to their text editor of choice. Some recent developments in visual programming languages are making them usable for live coding [desire data + Rob], but the mouse and keyboard remain the prevalent input methods.

The other side of this coin is that the languages we are using for live coding are often not designed for the purpose, and while employing languages intended for the careful construction of software to improvise ad hoc, while mildly intoxicated, adds to the spectacle of live coding - it seems we could do better.

Programming games

Computer game input devices provide readily accessible hardware to experiment with, and designing programming environments which use the visual language of computer games allows live coding to appeal to wider audiences. There are some precedents for programming forming part, or the entirety, of a gameplay mechanic; I will cover two such examples here.

Corewars

Corewars is a game in which player/programmers write programs which attempt to take control of as much of the memory of a virtual machine (the core) as possible. A virtual machine is a program which mimics another (real or fictional) device. Virtual machines are wholly contained within the program which "virtualises" them, and so are safe places to experiment. The player written core war programs take the form of assembly language code (one such language is called "Redcode"), and have the ability to write and modify themselves and other programs in the quest to copy themselves over as much memory as possible. The simplest Redcode "warrior program" is called the "Imp":

MOV 0, 1

This program simply copies itself to the next instruction (Redcode addresses relatively, so "1" means the address after the current one), which is then run, copying itself onwards until memory is used up. More sophisticated programs will "bomb" memory in calculated patterns, or hijack each other's code for parasitic strategies.
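A classic example of such a bombing program is the "Dwarf", shown here in a common Redcode dialect. It drops DAT "bombs" - executing a DAT kills a process - into every fourth memory cell:

ADD #4, 3     ; add 4 to the B-field of the DAT below
MOV 2, @2     ; copy the DAT to the cell its B-field points at
JMP -2        ; loop back to the ADD, forever
DAT #0, #0    ; the bomb, doubling as the target pointer

In the usual core sizes (a multiple of four), the bombs march right around memory without ever landing on the Dwarf itself.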
Carnage Heart

Carnage Heart is a Japanese PlayStation game developed in 1995. The player has to program robots for battle using a visual language akin to a flow diagram. The robots (known as "OverKill Engines") cannot be controlled once battle has started - the player has defined the behaviour, and ultimate success, of the robot entirely in their program.

[carnage heart pic]

Gamepad live coding

Taking these and other games as influences, Betablocker and Al Jazari are languages written in Scheme for fluxus which can be programmed using a gamepad alone - no other input device is required during performance. This in itself is an interesting restriction, and requires a specially designed language and a usable interface mechanic to allow code to be written. These projects also focus on the visual aspect of livecoding, and make more of an attempt to make the code interesting to the audience than a simple text display does.

Ring menus

These interface elements are key to making gamepad live coding possible. They allow items to be selected from a large range of options, and also exploit "muscle memory", as the player remembers the direction of different selections. They need only one analogue stick, which can be used both to activate and to operate the menu. Gamepad shoulder buttons are also used to select and switch between different menu types.

[betablocker gamepad diagram]

Betablocker

Betablocker was the first attempt at a gamepad programmable live coding system in fluxus. It was inspired by a discussion on the TOPLAP mailing list about virtual machines, and visually by games such as "Mr Driller", in terms of colourful blocks interacting with each other and setting up chain reactions as a game mechanic. The program is a virtual machine running inside fluxus. It visualises the simulation of a fictional CPU with 256 bytes of memory, and allows multiple threads of execution to be run sharing the same memory. The betablocker language is very much based on the ideas in Corewars, but betablocker also visualises the threads of execution. This is different to other live coding languages/environments, in that the process itself, as well as the code and the entire contents of memory, is visible.

Betablocker programs never crash - meaning simply that, while the virtual machine is running, they will never stop executing. Crashes and exceptions are important in most situations, as they stop unintended code from being run, which is usually a very good thing (preventing dangerous things from happening). Uncrashable languages are used in situations where this isn't important, usually safely inside virtual machines - for applications such as genetic programming. To make this possible, instructions return wrong but harmless values for invalid input (divide by zero, or referencing non-existent memory), so programs keep on running even if an error has been detected - see the sketch at the end of this section.

A performance of Betablocker consists of writing instructions and data to the memory cells with a gamepad, which is used to navigate memory with the direction buttons and select instructions from a ring menu with the analogue sticks. Instructions are included to trigger sounds, and to change the instrument for each thread. Small prewritten library code segments can also be loaded in and pasted over memory using more ring menus. As programs run, they can write over each other, or modify themselves. Although betablocker is deterministic (there are no calls to random functions), this generally results in a chaotic situation once a few threads of instructions have begun running. As they will never crash, these threads will run off the end of their programs if they do not loop, and carry on through memory, executing anything they find.
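As a minimal sketch of how a virtual machine can be made uncrashable (this is illustrative Scheme, not Betablocker's actual source - the opcodes and names are invented for the example), every memory access wraps around the 256 bytes and every instruction has a harmless fallback:

(define mem (make-vector 256 0))        ; 256 bytes of shared memory

(define (peek addr)                     ; reads wrap around the core,
  (vector-ref mem (modulo addr 256)))   ; so no address is invalid

(define (poke addr val)                 ; writes wrap too, and values
  (vector-set! mem (modulo addr 256) (modulo val 256)))

(define (safe-div a b)                  ; divide by zero returns a
  (if (zero? b) 0 (quotient a b)))      ; wrong but harmless value

(define (step pc)                       ; run one instruction of a thread:
  (let ((op (peek pc)))                 ; fetch whatever is under the
    (cond                               ; program counter and dispatch
      ((= op 1)                         ; opcode 1: copy between two cells
       (poke (peek (+ pc 1)) (peek (peek (+ pc 2))))
       (+ pc 3))
      ((= op 2)                         ; opcode 2: divide two cells safely
       (poke (peek (+ pc 1))
             (safe-div (peek (peek (+ pc 1))) (peek (peek (+ pc 2)))))
       (+ pc 3))
      (else (+ pc 1)))))                ; anything else is a no-op:
                                        ; execution never stops

A thread is then just a program counter repeatedly passed through step: if it runs off the end of its program, it simply carries on executing whatever it finds next - exactly the behaviour described above.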
Al Jazari

Ibn Ismail ibn al-Razzaz al-Jazari was an influential scholar and engineer who lived at the beginning of the 13th century. Along with inventing, and making detailed plans for, many mechanical devices still in use today (the crankshaft, mechanical clocks, combination locks, segmental gears and valves, to name a few), he also worked on plans for automatons and humanoid robots. The robots he designed were intended to be used for playing music at royal drinking parties.

This project was inspired by, and named after, Al Jazari for the idea of live coding robots for royal drinking parties. It also has elements of the computer game "The Sims" - the idea of instructions and states visibly floating above a character's head - and of Gullibloon's "Army of Darkness", a robotic installation piece in which robots chaotically explore an environment which happens to be populated by strategically positioned electric guitars.

Al Jazari was also developed for audiences who may not be expecting to witness live coding. It was premiered and developed during a series of events called "Ravage me Savagely" at a pub in London's New Cross, in which laptop performances were inserted between rock and punk bands, in front of audiences consisting of drunk art students.

Playing Al Jazari is similar to Betablocker, as it uses the direction buttons to navigate a grid, this time in isometric projection as well as flat grids. Again, ring menus are used for inserting instructions. New robots are placed on the isometric grid and then programmed by writing code in their "thought bubbles". The instructions command the robots to move, turn, trigger signals, or inspect their local space and conditionally execute further instructions. Musical events are generated in an indirect manner: triggers can be placed on the isometric grid, and are activated when a robot enters them. Robots cannot occupy the same grid position, and will block or move each other out of the way.

[Al Jazari gamepad diagram]
[Al Jazari instruction set]

Both these programs/languages are designed to be synced with other systems for network performances - usually other live coding languages, for collaborative performances as part of the "slub" live coding group. Betablocker and Al Jazari are simple languages designed for live use. Betablocker is heavily inspired by existing languages (it is more of a novel visualisation and interaction experiment), and although Al Jazari is designed to look different, the act of using it is very close to programming in other languages - robots can be made to follow or avoid each other, send each other messages and generally exhibit complex behaviour.

These projects do stretch the definition of a programming language, and people watching Al Jazari performances often don't associate it with programming - mainly due to the use of a gamepad. This has both positive and negative implications, as it can be a way of getting people interested who wouldn't normally want to see a traditional live coding performance. The use of a gamepad allows the performer to leave the confines of the laptop screen and keyboard, and means the computer is less of a central point than in other laptop performances. The only objects you need to interact with live are the gamepad and the projection.

Of course, the language is really the instrument in live coding, and a powerful aspect of developing a language specifically for live coding is that it can be influenced in response to performance practice. There is a lot of work still to do in finding the correct balance between the simplicity a language needs for effective live use, and the complexity that keeps the possibilities open - without becoming too confusing for expressive use under the pressures of live performance.