This weekend Alex and I are off to the Deershed Festival in Yorkshire to bring slub technology to the younger generation. We’ll be livecoding algorave, teaching Scratch programming on Raspberry Pis and running an Al Jazari installation in between. Then onwards to London for a Sonic Bike Lab with Kaffe Matthews, where we’re going to investigate the future of sonic bike technology and theory – including, possibly, bike-sensor-driven synthesis and on-the-road post-apocalyptic mesh networking.
At the end of August I’m participating in my local media arts festival – Fascinate in Falmouth, where I’ll be dispensing a dose of algorave and probably even more musical robot techno.
Plutonian Botzlang is a new language I’m working on for a commission we’ve had from Arnolfini and Kunsthal Aarhus. The idea is to make the Naked on Pluto game bots programmable in a way that allows them to be scripted from inside the game interface, able to inspect all the objects around them and carry out actions on the world like a normal player. We can then strip the game down and make it into an online multiplayer musical livecoding installation.
Bots can be fed code line by line by talking to them, started and stopped and pinged to check their status. I toyed with the idea of making a one-line programming language with lots of semi-cryptic punctuation but opted instead for something a bit simpler and longer, but requiring line numbers.
Here is an example program that looks in the current node (or room) for Bells, picks up any it finds, then says their descriptions. Each time it loops it might drop the Bell and walk to a new location. This results in bots that walk around a game world playing bells.
10 for e in node.entities
20 if e.name is Bell
30 pickup e.name
60 for e in this.contents
70 say e.desc
90 if random lessthan 5
100 drop Bell
130 goto 10
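As a sketch of how this kind of fed-line-by-line, line-numbered language might work internally – this is not the actual Plutonian Botzlang implementation, and the class name, command set and “a failed if skips the next line” semantics are all my assumptions – here is a minimal Python interpreter for a tiny subset:

```python
import random

class Bot:
    """Hypothetical sketch of a line-numbered bot interpreter,
    loosely modelled on the Botzlang example above."""

    def __init__(self):
        self.program = {}  # line number -> statement text

    def feed(self, line):
        """Feed one line of code at a time, e.g. '10 say hello'."""
        number, statement = line.split(" ", 1)
        self.program[int(number)] = statement

    def run(self, max_steps=100):
        """Execute in line-number order; max_steps guards against
        infinite goto loops."""
        lines = sorted(self.program)
        pc = 0
        output = []
        for _ in range(max_steps):
            if pc >= len(lines):
                break
            cmd, *args = self.program[lines[pc]].split()
            if cmd == "say":
                output.append(" ".join(args))
            elif cmd == "goto":
                pc = lines.index(int(args[0]))
                continue
            elif cmd == "if":
                # assumed form: 'if random lessthan 5' (0-9 roll);
                # on failure, skip the following line
                if args[0] == "random" and args[1] == "lessthan":
                    if random.randint(0, 9) >= int(args[2]):
                        pc += 1
            pc += 1
        return output
```

Feeding it `10 say hello` then `20 say world` and calling `run()` plays the lines back in order; a `goto` loops until `max_steps` runs out, much like a bot being pinged and stopped from the game interface.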
Here is a screenshot of the modified version of the game with a bot being programmed:
I’ve been doing more remote install work on Kaffe’s latest piece, which she’s been building while resident at Hai Art in Hailuoto, an island in the north of Finland. The zone building, site-specific sample composing and microscopic BeagleBoard log debugging is over, and two new GPS Opera bikes are born! Go to Hai Art or Kaffe’s site for more details.
Prepare your bicycle clips! Kaffe Matthews and I are starting work on a new Bicycle Opera piece for the city of Porto, I’m working on a new mapping tool and adding some new zone types to the audio system.
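As a guess at the simplest possible zone type – circular regions that trigger a sample when a GPS fix lands inside them – here’s a pure Python sketch. The zone names, radii and overall shape of the data are mine, not the real system’s, whose zone logic is surely richer:

```python
import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def zone_for(fix, zones):
    """Return the name of the first circular zone containing the fix,
    or None. `zones` is a list of (name, lat, lon, radius_m) tuples."""
    lat, lon = fix
    for name, zlat, zlon, radius in zones:
        if haversine_m(lat, lon, zlat, zlon) <= radius:
            return name
    return None
```

An audio layer on top of this would just watch `zone_for` change as fixes arrive and start or stop the corresponding samples.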
While working on a BeagleBoard from one of the bikes used in the Ghent installation of ‘The swamp that was…’, I found (in true Apple/Google style) 4Mb of GPS logs, taken every 10 seconds during the two-month festival – I had forgotten to turn the logging off. As it was part of a public installation (and therefore reasonably anonymised :) here is the first fifth of the data – about all it was possible to plot in high resolution on an online map:
It’s interesting to see the variability of the precision, as well as being able to identify locations and structures that break up the signal (such as the part underneath a large road bridge).
Jaye Louis Douce, Ruth Ross-Macdonald and I took to the ramps of Mount Hawke skate park in deepest darkest Cornwall to test the prototype tracker/projection mapper (now known as ‘The Cyber-Dog system‘) in its intended environment for the first time. Mount Hawke consists of 20,000 square feet of ramps of all shapes and sizes, an inspiring place for thinking about projections and tracing the flowing movements of skaters and BMX riders.
Finding a good place to mount the projector was the first problem: it was difficult to get it far enough away to cover more than a partial area of our chosen test ramp, even with some creative duct tape application. Meanwhile the Kinect camera was happily tracking the entire ramp, so we’ll be able to fix this by replacing my old battered projector with a better model in a more suitable location.
The next challenge is calibrating the projection mapping to align it with what the camera is looking at. As they are in different places this is quite fiddly and time consuming to get right; some improvements to the fluxus script will make it faster. Here is Jaye testing it once we had it lined up:
Next it was time to recruit some BMX test pilots to give it a go:
At higher speed it needs a bit of linear interpolation to ‘connect the dots’, as the visualisation is running at 60fps while the tracking is more like 20fps:
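That ‘connect the dots’ step is just standard linear interpolation – a sketch of how upsampling a ~20fps track for ~60fps playback might look (factor of 3, point format assumed):

```python
def lerp(a, b, t):
    """Linearly interpolate between 2D points a and b, t in [0,1]."""
    return (a[0] + (b[0] - a[0]) * t,
            a[1] + (b[1] - a[1]) * t)

def upsample(track, factor=3):
    """Insert interpolated points between successive tracked
    positions so a ~20fps track plays back smoothly at ~60fps."""
    out = []
    for a, b in zip(track, track[1:]):
        for i in range(factor):
            out.append(lerp(a, b, i / factor))
    out.append(track[-1])  # keep the final tracked point
    return out
```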
This test proved the fundamental idea, and opens up lots of possibilities, different types of visualisations, recording/replaying paths over time as well as the possibility of identifying individual skaters or BMX riders with computer vision. One great advantage this setup has is once it’s running it will work all the time, with no need for continuous calibration (as with RGB cameras) or the use of any additional tracking devices.
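As an aside, the camera-to-projector calibration described above is often done by fitting a transform from a few clicked point correspondences. A hypothetical pure-Python sketch – the actual fluxus script may well work differently – fitting a 2D affine transform from three camera/projector point pairs:

```python
def solve_affine(cam_pts, proj_pts):
    """Fit a 2D affine transform mapping three camera-space points
    onto the corresponding projector-space points."""
    def solve3(A, b):
        # Gaussian elimination with partial pivoting on a 3x3 system.
        M = [row[:] + [v] for row, v in zip(A, b)]
        for col in range(3):
            pivot = max(range(col, 3), key=lambda r: abs(M[r][col]))
            M[col], M[pivot] = M[pivot], M[col]
            for r in range(col + 1, 3):
                f = M[r][col] / M[col][col]
                for c in range(col, 4):
                    M[r][c] -= f * M[col][c]
        x = [0.0] * 3
        for r in range(2, -1, -1):
            x[r] = (M[r][3] - sum(M[r][c] * x[c]
                                  for c in range(r + 1, 3))) / M[r][r]
        return x

    A = [[x, y, 1.0] for x, y in cam_pts]
    row_x = solve3(A, [p[0] for p in proj_pts])
    row_y = solve3(A, [p[1] for p in proj_pts])
    return row_x, row_y

def apply_affine(transform, point):
    """Map a camera-space point into projector space."""
    rx, ry = transform
    x, y = point
    return (rx[0] * x + rx[1] * y + rx[2],
            ry[0] * x + ry[1] * y + ry[2])
```

With the transform solved once, every tracked Kinect position can be mapped into projector space per frame, which is why no continuous recalibration is needed once it’s running.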
As the coder for “The swamp that was…” bike opera, my view of things was from “inside” the bikes – listening to the GPS data and playing samples. So it was super (and somewhat surreal) to finally become a rider and take one of the bikes (called Nancy) for a spin through the streets of Ghent to experience it like everyone else at the Electrified festival.
I followed the different routes, tried some out backwards and got lost in the “garden” – the zone of mysterious ghost butterflies and wandering sounds. Towards the end of the final route, shelter had to be sought in Julius de Vigneplein during a gigantic thunderstorm, to the sound of looping saxophones, before retreating back to the Vooruit.
It didn’t crash (always my main preoccupation with testing something I’ve been involved with writing software for) and there seemed to be continuous audio from the routes. Once I had ascertained that the software seemed to be working properly I could actually start to pay attention to the sounds which were a very fluid mix, interspersed with sudden bursts of Flemish – recordings of local people.
The sounds are a widely varied mix ranging from digital glitch to ethereal sounds and processed ducks that accompany you as you cycle along the canals. The “garden” is not a route as such but occupies a maze of small streets in the Ledeberg area and populates the streets with many insects, birds and other surprises.
The custom bike/speaker arrangement designed and built by Timelab was satisfyingly loud – pulling up next to other innocent cyclists at junctions with blaring jazz is quite an intriguing social experience. It makes you want to say “I can’t turn it off” or “I am an art installation!” The BeagleBoards also seem fairly durable: the bikes have been running for a month now, and the cobbled streets and some areas with bumpy roadworks give them a lot of shocks to cope with.
The “click click” of car indicator relays tells you when you’ve reached a junction where you have to turn, and while our method of calculating direction (by comparing positions every 10 seconds) doesn’t really work well enough, it still has a useful role, saying “pay attention, you need to turn here!”. This installation, and the rest of the festival, will be running for another month, until the 4th November.
I spent last week working on various activities associated with the Electrified festival in Ghent, which included a mix of plant care, games dev, low-level Android audio hacking and BeagleBoard bike fixing. Here are some photos of the Borrowed Scenery installation/physical narrative, home of the mysterious patabotanists and temporary research laboratory for FoAM – excellent for getting into the spirit of the work while developing it. More details in further posts.
This exhibition consists of dot matrix printouts, large scale plotter prints, photographs, German VHS cassettes and a mass of technology – and clearly focuses on how it interferes with us physically through our bodies, and on a wider scale through our geography via mapping and recording experiments. His work is presented here without explanation, which means it must be taken at face value – quite a challenge as so much of the material he’s working with is invisible, or hidden inside both intricate custom devices and reclaimed/adapted circuitry from various sources.
It’s a challenge I like a lot, and equally intriguing is the difficulty in detecting ‘edges’ of the different exhibits on display. Extensive use of EM radiation (transmission and reception) means the whole thing seems to be alive, working as a whole – signals spilling over into each other, with surging, clicking and roaring. Moving close to a massive coil, the attached drill becomes activated by the circuitry on my camera when I take a picture, and doesn’t calm down till I move away. Video documentation shows how devices recorded signals from the surrounding landscape were used to generate the images on the walls of the gallery.
The devices are also communicating with the heavy industrial activity outside in the ship repair dockyard. The muted vibrations of hammering seemed to be communicating with the tattooing device controlled by process information from an attached Linux laptop.
Chordpunch was set up to explore the many and moving forms of algorithmic music. That might mean a computer program generating every note you hear, or new electronic music inspired by algorithms, or human beings following interesting rules with musical outcomes.
Still in Paris, and still concerning slub, we are also featured as part of the Form@ts virtual exhibition at Jeu de Paume curated by Christophe Bruno. The exhibition concerns artwork, such as livecoding, that crosses borders of format and convention.
A chance to unleash some participatory musical robot livecoding in Falmouth this weekend, with an Al Jazari installation at the relaunch and opening event of the Fish Factory Arts space. For the last couple of galleries it’s been running in I was unable to be physically present, so this was a good chance to get some feedback and pay careful attention to what people do.
While the robot programming is very simplified from the original version, there is still quite a steep learning curve. The learning process is audible and largely depends on whether a group of friends or an individual is having a go.
The programming seems to take several stages:
1. People initially experiment with single instructions, resulting in a simple, slow beat with a single robot.
2. Learning how to navigate around the program and place more instructions comes next, resulting in complex but disorganised sounds. At this point often more people are attracted to join in.
3. Making more structured behaviours, palindromic patterns, repeating drum beats – people who get this far tend to stay for a while, working together programming all robots to coordinate their sounds.