Monthly Archives: September 2012

Swamp bike opera impressions…


Photo thanks to zzkt

As the coder for “The swamp that was…” bike opera, my view of things was from “inside” the bikes – listening to the GPS data and playing samples. So it was super (and somewhat surreal) to finally become a rider and take one of the bikes (called Nancy) for a spin through the streets of Ghent to experience it like everyone else at the Electrified festival.

I followed the different routes, tried some out backwards, and got lost in the “garden” – the zone of mysterious ghost butterflies and wandering sounds. Towards the end of the final route I had to take shelter in Julius de Vigneplein during a gigantic thunderstorm, listening to looping saxophones before retreating back to the Vooruit.

It didn’t crash (always my main preoccupation when testing something I’ve been involved in writing the software for) and there seemed to be continuous audio along the routes. Once I had ascertained that the software seemed to be working properly, I could actually start to pay attention to the sounds, which were a very fluid mix interspersed with sudden bursts of Flemish – recordings of local people.

The sounds are a widely varied mix, ranging from digital glitch to ethereal textures and the processed ducks that accompany you as you cycle along the canals. The “garden” is not a route as such but occupies a maze of small streets in the Ledeberg area, populating them with many insects, birds and other surprises.

The custom bike/speaker arrangement designed and built by Timelab was satisfyingly loud – pulling up next to other innocent cyclists at junctions with blaring jazz is quite an intriguing social experience. It makes you want to say “I can’t turn it off” or “I am an art installation!” The beagleboards also seem fairly durable, as the bikes have been running for a month now, and the cobbled streets and some areas with bumpy roadworks give them a lot of shocks to cope with.

The “click click” of car indicator relays tells you when you’ve reached junctions where you have to turn, and while our method of calculating direction (comparing GPS positions every 10 seconds) doesn’t really work well enough, the clicks still have a useful role, saying “pay attention, you need to turn here!”. This installation, and the rest of the festival, will be running for another month, until the 4th of November.
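For the curious, that direction estimate boils down to computing a bearing between two GPS fixes taken some seconds apart. Here is a minimal JavaScript sketch of the idea (illustrative only, not the code that runs on the bikes):

// estimate heading from two GPS fixes, in degrees clockwise from north
function bearing(lat1, lon1, lat2, lon2) {
    var rad = Math.PI / 180;
    var dlon = (lon2 - lon1) * rad;
    var y = Math.sin(dlon) * Math.cos(lat2 * rad);
    var x = Math.cos(lat1 * rad) * Math.sin(lat2 * rad) -
            Math.sin(lat1 * rad) * Math.cos(lat2 * rad) * Math.cos(dlon);
    // atan2 gives radians, convert back to degrees and wrap to 0..360
    return (Math.atan2(y, x) / rad + 360) % 360;
}

With fixes 10 seconds apart at cycling speeds the result is fairly noisy, which is part of why it struggles at junctions.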

Borrowed Scenery

I spent last week working on various activities associated with the Electrified festival in Ghent, which included a mix of plant care, games dev, low-level Android audio hacking and beagleboard-bike fixing. Here are some photos of the Borrowed Scenery installation/physical narrative, home of the mysterious patabotanists and temporary research laboratory for FoAM – excellent for getting into the spirit of the work while developing it. More details in further posts.

Execution: a solo exhibition by Martin Howse

Some images of Martin Howse’s solo exhibition at the Fish Factory (Falmouth’s experimental gallery/reclaimed art space).

This exhibition consists of dot matrix printouts, large scale plotter prints, photographs, German VHS cassettes and a mass of technology – and clearly focuses on how that technology interferes with us physically, through our bodies, and on a wider scale through our geography, via mapping and recording experiments. His work is presented here without explanation, which means it must be taken at face value – quite a challenge, as so much of the material he’s working with is invisible, or hidden inside both intricate custom devices and reclaimed/adapted circuitry from various sources.

It’s a challenge I like a lot, and equally intriguing is the difficulty of detecting the ‘edges’ of the different exhibits on display. Extensive use of EM radiation (transmission and reception) means the whole thing seems to be alive, working as a whole – signals spilling over into each other, surging, clicking and roaring. When I moved close to a massive coil, the attached drill was activated by the circuitry in my camera as I took a picture, and didn’t calm down till I moved away. Video documentation shows how signals the devices recorded from the surrounding landscape were used to generate the images on the walls of the gallery.

The devices are also communicating with the heavy industrial activity outside in the ship repair dockyard. The muted vibrations of hammering seemed to be communicating with the tattooing device, controlled by process information from an attached Linux laptop.

Plasticine architecture

A patafungi building site for the Aniziz game. The shapes have been inspired by Siteless, an architectural book by François Blanciak that I absolutely love. It contains 1000 ideas for building forms inspired by time spent in different cities around the world. This could be a great starting point for all kinds of ideas for levels, worlds or objects in many types of games.

The forms, drawn freehand (to avoid software-specific shapes) but from a constant viewing angle, are presented twelve to a page, with no scale, order, or end to the series.

Touchscreen programming

As more and more people use touchscreens, it still irks me that we lack good ways of programming “on” the devices that rely on them (i.e. something that feels native, rather than a modified text editor). As a result they seem designed entirely around the consumption of software (see also “The coming war on general-purpose computing”).

So let’s make them programmable. Recent steps in this direction are based on Jellyfish – an idea to create a kind of locative livecoding virus game (more on that as it unfolds) – starting with fluxus on android (now called nomadic) and a good dose of Betablocker DS, mixed with some procedural 3D rendering inspired by the PlayStation 2’s mad hardware, and icons previously seen on the Supercollider 2012 flier!

This is a screenshot of its current early state, with the Linux and Android versions side by side (spot the inconsistency in wireframe colour, due to differences in colour material handling in OpenGL ES). The main additions to the previous android fluxus are texturing and text rendering primitive support. I’m glad to say that pinch-to-zoom and panning are already working on the code interface, although it doesn’t make too much sense to look at yet.
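For illustration, pinch-to-zoom essentially comes down to tracking the distance between the two touch points and scaling the view by its frame-to-frame ratio. A minimal sketch with made-up names (not the nomadic code):

// given the previous pinch distance and the two current touch points,
// work out a zoom factor for the code view
function pinch_scale(prev_dist, t0, t1) {
    var dx = t1.x - t0.x;
    var dy = t1.y - t0.y;
    var dist = Math.sqrt(dx * dx + dy * dy);
    // ratio > 1 means the fingers moved apart, so zoom in
    var scale = prev_dist > 0 ? dist / prev_dist : 1;
    return { scale: scale, dist: dist }; // keep dist for the next frame
}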

Aniziz: Keeping it local and independent

A large part of the Aniziz project involves porting the Germination X isometric engine from Haxe to Javascript, and trying to make things a bit more consistent and nicer to use as I go. One of the things I’ve tried to learn from experiments with functional reactive programming is to use a more declarative style, and broadly to keep all the code associated with a thing in one place.

A good real world example is the growing ripple effect for Aniziz (which you can just about make out in the screenshot above), which consists of a single function that creates, animates and destroys itself without the need for any references anywhere else:

game.prototype.trigger_ripple=function(x,y,z) {
    var that=this; // lexical scope in js annoyance

    var effect=new truffle.sprite_entity(
        this.world,
        new truffle.vec3(x,y,z),
        "images/grow.png");

    var len=1; // in seconds

    effect.spr.do_centre_middle_bottom=false;
    effect.finished_time=this.world.time+len;
    effect.needs_update=true; // will be updated every frame
    effect.spr.expand_bb=50; // expand the bbox as we're scaling up
    effect.spr.scale(new truffle.vec2(0.5,0.5)); // start small

    effect.every_frame=function() { 
        // fade out with time
        var a=(effect.finished_time-that.world.time);
        if (a>0) effect.spr.alpha=a; 
   
        // scale up with time
        var sc=1+that.world.delta*2; 
        effect.spr.scale(new truffle.vec2(sc,sc));

        // delete ourselves when done
        if (effect.finished_time<that.world.time) {
            effect.delete_me=true;
        }
    };
}

The other important lesson here is the use of time to make movement framerate independent. In this case it’s not critical, as it’s only an effect – but it’s good practice to always multiply time-varying values by the time elapsed since the last frame (called delta). This means you can specify movement in pixels per second, and more importantly that interaction will remain consistent (within reason) on slower machines.
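As a quick illustration (a minimal sketch with made-up names, not part of the truffle engine), declaring speed in pixels per second and scaling by delta each frame looks something like this:

var speed = 120; // pixels per second, whatever the framerate

function move_right(entity, delta) {
    // delta is the time elapsed since the last frame, in seconds,
    // so a slower machine takes bigger steps less often but covers
    // the same distance per second
    entity.pos.x += speed * delta;
}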

Livenotations gig at Arnolfini – The hair of the horse

Thanks to Farrows Creative we have some great photos of the livenotations performance with Alex McLean, Hester Reeve and me at the Arnolfini a few weeks ago. This was a completely unrehearsed combination of Hester Reeve’s live art and slub’s live coding. A score was made from rocks and stones, using their positions and also drawing on them with brushes and water, made temporary with a heat gun. A selection of good branches from Alex’s garden provided a tripod for the camera, which allowed us to project the score along with a clock time marker; my code and Alex’s emacs were overlaid with a second projector, giving a multi-layered image of what we were all doing.

I could see the output from the camera (running using Gabor’s fluxus addon code) underneath a semi-transparent version of scheme bricks, and my original plan was to attempt to read the score in some symbolic way. Instead I found myself using more playful methods, dragging sections of code over particular stones and switching to using them when Hester worked on the relevant one. Her movements also helped me break out of my normal programming flow more than usual, reminding me of nearby unused bits of code, and I generally took a slower, more considered approach.

As I said in my previous post, this seems like an encouraging and refreshing direction for livecoding to follow, given how naturally it fits with performance/live art. The impulse is to augment this kind of performance with further machine vision and tracking software, but perhaps, much like slub’s preference for listening to each other over complex networking, it’s interesting to concentrate on interpretation in a more open-ended manner, at least to begin with.

‘The swamp that was’ – a bicycle opera from the ground of Ganda (part 5)

The Bicycle Opera is now live – get your bikes from the Snoepwinkel, Sint-Pietersnieuwstraat 21:

A bicycle opera in Ghent! British sound artist Kaffe Matthews records urban sounds such as music, singing and street sounds. As she combines these with elements from the past, she creates an unseen urban opera. A mobile composition, written for cyclists. You can rent an “audio bike” to explore the streets of Ghent. As you ride past certain spots, sound recordings are played on the speakers of your bike, uncovering the soundtrack of the city piece by piece.

Here is a visualisation of the zones (including the moving ghost zones) across the city.