Category Archives: Livecoding

Livecoding UAVs for environmental research

Some screenshots of the UAV livecoding visual programming language. Weather being on our side, we’re planning some test flights later this week! The first program uses GPS to take photos with an overlap of 50% at 300 metres altitude, based on the vertical camera angle as reported by the device. It assumes that the flight orientation is level:

Screenshot_2015-01-20-23-17-49
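
The overlap calculation itself is simple trigonometry: the ground footprint of each photo follows from the altitude and the camera’s vertical field of view, and the trigger distance is whatever fraction of that footprint gives the required overlap. A minimal sketch in Python (the function and the example camera angle are mine, not the generated Scheme the app actually runs):

```python
import math

def trigger_distance(altitude_m, vertical_fov_deg, overlap=0.5):
    """Distance to travel between photos for a given forward overlap,
    assuming the camera points straight down and the flight is level."""
    # Ground footprint of one photo along the direction of travel.
    footprint = 2.0 * altitude_m * math.tan(math.radians(vertical_fov_deg) / 2.0)
    # Advance one footprint minus the overlapping portion each time.
    return footprint * (1.0 - overlap)

# 50% overlap at 300 metres with a (hypothetical) 60 degree camera angle:
print(trigger_distance(300, 60, 0.5))  # ~173 metres between photos
```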

The blocks are all drag and drop and get converted into Scheme code which is run by a modified tinyscheme interpreter. The code can be saved and loaded, and I’m planning to make it possible for people to share code via email.
This is a simpler program which takes a photo every 3 seconds and records a handful of sensor readings to the database:

Screenshot_2015-01-20-23-18-44

At the bottom you can see a squashed camera preview – I’ve tried various approaches (hiding it, scaling it to 0 pixels etc.) but Android requires that a preview exists somewhere in order to take a photo properly. You can also view the recorded data on the device, for checking. There is also a ‘flight mode’ which locks and turns off the screen, and ignores all button events. On some phones you need to take out the battery to stop the program running, but unfortunately on others you can still use the power button to close the program.

Screenshot_2015-01-21-01-09-56

Visual programming for environmental research with UAVs

ua3

I’ve recently begun a new project with Karen Anderson, who runs the UAV research group at the Exeter University Environment and Sustainability Institute. We’re looking at using commodity technology like Android phones for environmental research with drones. Ecology research groups and environmental agencies have started using drones as a replacement for expensive and risky light aircraft when gathering data on changes to landscapes due to climate change and erosion. How can we make tools that are simpler and cheaper for them to set up and use? Can our software also be relevant for children using kites in cities to make their own maps, or for farmers wishing to record changes to their own fields themselves?

Enumerating and displaying all the sensors on a phone

This is a more open ended project than our previous environmental and behavioural projects, so we’re able to approach it with an R&D perspective on the technology. One of the patterns I’ve noticed with this kind of work is that after providing scientists with something that meets their immediate needs, it inspires a ton of new ideas and directions – and I become a bottleneck. Ideally I need to provide something that allows them to build things themselves once they have an understanding of all the possibilities. Adapting to needs ‘in the field’ is also an important aspect of the kind of work they do – which can be in remote locations anywhere in the world.

Some time ago I had a go at porting my musical livecoding language Scheme Bricks to Android for the Open Sauces project. I’m now applying it as a way of configuring sensor data acquisition and recording by drag/dropping a visual programming language. It’s early days yet – I’m still debugging the (actually rather amazing) Android drag/drop API. Here are some initial screenshots.

Nested drag/droppable code which gets converted to Scheme for the tinyscheme interpreter
A block menu works much like Scratch, allowing you to pick new code blocks (this code is nonsense – just testing!)
The "Hello World" program displays every 3 seconds (even when the app is running in the background)

Flotsam: A prototype screenless livecoding language

Two languages are now working with Flotsam, the new name for the prototype screenless tangible programming language I’ve been building (the name comes from the fact that it’s largely made from driftwood). It’s somehow already been featured on the Adafruit blog!

The circuit seems to be fully debugged now, with short circuits fixed – which took a little while and more than a little frustration :) The Raspberry Pi Python code is currently on the weavingcodes repository (more on this project on the kairotic site), and the first language is a declarative-style L-system for describing weave structure and pattern with yarn width and colour. The LEDs indicate that the evaluation happens simultaneously, as this is a functional language. The blocks represent blue and pink yarn in two widths, with rules to produce the warp/weft sequence based on the rows the blocks are positioned on:
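
The details of the block language aren’t written up here, but the core idea – an L-system whose rules are rewritten in parallel to produce a warp/weft yarn sequence – can be sketched in a few lines of Python. The symbols and rules below are invented for illustration, not the ones on the actual blocks:

```python
# Each symbol stands for a yarn: a colour plus a width.
yarns = {
    "B": ("blue", 1), "b": ("blue", 2),  # blue: narrow, wide
    "P": ("pink", 1), "p": ("pink", 2),  # pink: narrow, wide
}

# L-system rules: every symbol in the string is rewritten
# simultaneously each generation - the parallel evaluation
# that the LEDs on the blocks reflect.
rules = {"B": "BP", "P": "bB", "b": "pB", "p": "PP"}

def expand(axiom, generations):
    for _ in range(generations):
        axiom = "".join(rules.get(symbol, symbol) for symbol in axiom)
    return axiom

warp = [yarns[s] for s in expand("B", 4)]  # warp yarn sequence
weft = [yarns[s] for s in expand("P", 4)]  # weft yarn sequence
print(warp, weft, sep="\n")
```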

The weaving simulation is written in pygame (which I’ve been using lately for teaching), and is deliberately designed to allow alternative weave structures to those possible with Jacquard looms by including yarn properties. The version in the video is plain weave, but more complex structures can be defined, as below – in the same way as Alex’s gibber software:

star
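
A stripped down version of the simulation logic: the visible colour at each crossing is simply whichever thread is on top, and the per-thread widths (the yarn property that takes this beyond what a Jacquard loom represents) set the size of each cell. The colours, widths and window size here are my own placeholders:

```python
import pygame

# Weave structure: weave[y][x] == 1 where the warp passes over the weft.
weave = [[(x + y) % 2 for x in range(8)] for y in range(8)]  # plain weave
warp = [((50, 50, 200), 10), ((200, 50, 150), 20)] * 4       # (colour, width)
weft = [((200, 50, 150), 10), ((50, 50, 200), 20)] * 4

pygame.init()
screen = pygame.display.set_mode((240, 240))
screen.fill((255, 255, 255))

ypos = 0
for y, (weft_colour, weft_width) in enumerate(weft):
    xpos = 0
    for x, (warp_colour, warp_width) in enumerate(warp):
        # Whichever thread is on top at this crossing is what you see.
        on_top = warp_colour if weave[y % 8][x % 8] else weft_colour
        pygame.draw.rect(screen, on_top, (xpos, ypos, warp_width, weft_width))
        xpos += warp_width
    ypos += weft_width

pygame.display.flip()
pygame.time.wait(3000)
```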

This is a completely different language for building shapes in Minecraft: an imperative, stack-based language for driving a turtle in 3D space. Eventually (when I’ve manufactured a few more programming blocks) it will be possible to change Minecraft block materials and react to player actions. Here the LEDs indicate the more sequential evaluation of this Forth-like language:
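
The language itself isn’t listed in this post, but a Forth-like turtle over 3D block space needs very little machinery: a stack, a position and direction, and a word dispatch loop. A hypothetical miniature version (the words and their names are my invention, and the blocks dictionary stands in for sending setBlock messages to Minecraft):

```python
# A tiny stack machine driving a turtle through 3D block space.
stack, pos, direction = [], [0, 64, 0], [1, 0, 0]
blocks = {}  # stand-in for sending setBlock messages to Minecraft

def run(program):
    for word in program.split():
        if word.isdigit():
            stack.append(int(word))       # numbers push themselves
        elif word == "forward":
            for _ in range(stack.pop()):  # step, leaving a block behind
                for axis in range(3):
                    pos[axis] += direction[axis]
                blocks[tuple(pos)] = 1
        elif word == "yaw":               # rotate 90 degrees about y
            direction[0], direction[2] = -direction[2], direction[0]
        elif word == "pitch":             # rotate 90 degrees about x
            direction[1], direction[2] = -direction[2], direction[1]

run("4 forward yaw 4 forward yaw 4 forward yaw 4 forward")  # a square
print(sorted(blocks))
```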

All that’s needed to switch languages is to redraw the symbols with chalk and run a different script. It won’t be truly screenless until I write a musical language for it, which is obviously coming soon…

IMG_20141016_142021

Learning to read, notate and compute textiles in Aarhus

Setting off from Copenhagen, the weaving codes tour continued as Emma Cocker, Alex McLean, Ellen Harlizius-Klück and I sped across the Danish countryside on the train. We were heading for Aarhus to spend some time working with people at the other end of the technology spectrum – the Center for Participatory IT at Aarhus University.

We were invited by Geoff Cox to run a workshop over two days, and given the nature of the faculty, and that time working together is at a premium on this project, we decided to run the workshop more as an invitation to join in our work (and therefore learn a bit about how artistic research is done), rather than having specific material to cover.

Workshop participants working in Aarhus
Lots of pieces of woven fabric

Alex and I needed to learn from Ellen how to ‘read’ a textile sample and notate its structure – a kind of reverse engineering process. Day one consisted of sharing out different types of weave and carefully figuring out the crossing points of warp and weft. The first challenge is to attempt to find the smallest repeat of the pattern, then record on graph paper a cross where the warp threads show over the weft threads. The big surprise is that the weave does not obviously relate to the visible pattern. It can also be hard to determine from a small sample which direction is warp and which is weft, so you can choose this arbitrarily if need be.
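
The ‘smallest repeat’ step translates quite directly into code: given a grid of crossing points, look for the smallest tile that reproduces the whole sample when repeated. A sketch (the grid encoding is mine – 1 where the warp shows over the weft):

```python
def smallest_repeat(grid):
    """Find the smallest (width, height) tile that tiles the whole
    sample, where grid[y][x] is 1 if the warp shows over the weft."""
    height, width = len(grid), len(grid[0])
    for h in range(1, height + 1):
        for w in range(1, width + 1):
            if all(grid[y][x] == grid[y % h][x % w]
                   for y in range(height) for x in range(width)):
                return w, h

# A small plain weave sample, as notated on graph paper:
sample = [[1, 0, 1, 0],
          [0, 1, 0, 1],
          [1, 0, 1, 0],
          [0, 1, 0, 1]]
print(smallest_repeat(sample))  # (2, 2)
```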

Alex recording warp crossings on graph paper

We could then compare different weaves, and also the different notation styles people used – little shorthand ways of describing large areas of plain weave, for example. Much of the fabric came from Ellen’s Pepita Virus exhibition, and contained many different sizes of the “dog tooth” pattern. By comparing the notation we had all recorded for our different samples, we could tell that to scale up a pattern it’s not the case that you can simply scale up the weave – the structure needs to change completely, across 4 or 5 different types.

Example fabric and notation

Day two consisted of testing Alex’s Javascript loom/computer, by entering the weave structure and colour sequences and checking if the resulting patterns matched. By livecoding, we could also play with the patterns without the need to physically weave them, in order to gain a better insight into how changes affected the resulting pattern – and a better overall understanding of the weaving process.

Ellen pointing at code
Two workshop participants programming patterns

To finish up, I talked about the patterns inherent in computation, based on my previous Z80 explorations, to show how the pattern-creating origins of computation are still present, and the role of programming languages as ways for people to come to shared understanding via notation, rather than simply telling a machine what to do.

Unravelling technology in Copenhagen

Last week the weavingcodes/codingweaves project started with a trip to Denmark; our first stop was the Centre for Textile Research in Copenhagen, where we presented the project and gathered as much feedback as possible right at the beginning. The CTR was introduced to us by Eva Andersson Strand, and is an interdisciplinary centre which focuses on the relationships between textiles, environment and society.

IMG_20141006_150755

This long-view perspective of technology is critical for us, as we are dealing with a combination of thinking in the moment via livecoding and a history of technology dating back to the neolithic. This is a warp weighted loom, the focus of much of Ellen Harlizius-Klück’s research and the technology we are going to be using for the project.

IMG_20141007_114259959

Weights like this are widespread in the archaeological record for many cultures around the world, with the earliest dating from around 5000 BC. Also spotted – a post-it note including a handy cuneiform translation:

IMG_20141007_165613

Alex talked about livecoding as a backwards step, removing the interface – thinking about it as an unravelling of technology. His introduction to Algorave led to many connections later when Giovanni Fanfani described the abstract rhythmic patterns of Homeric rhapsodic poetry. These were performed by citizens, in a collaborative and somewhat improvised manner – the structures they form musically and in language are potentially of interest as they seem to echo the logic of weaving pattern.

IMG_20141006_143209

Ellen described her research into the tacit knowledge of ancient Greek society – how weaving provided thinking styles and ordering concepts for the earliest forms of mathematics and science – which is the basis for much of the weavingcodes project. One additional theme that has come up fairly consistently is cryptography – Flavia Carraro’s description of ‘The Grid in the decipherment of the Linear B writing system: a “paper-loom”?’ was another addition to this area.

Emma Cocker talked about Penelopean time – constant weaving and unravelling as a subversive act – and the concept of the kairos as a timely action: the name given to the point at which the weft is passed when the warp ‘shed’ is opened, as well as to a part of the warp weighted loom. Her input provided a broader view for our explorations (as coders, weavers and archaeologists all tend to get caught up in technical minutiae from time to time). From our discussions it was apparent that one of the strongest connections between livecoding and ‘weaving as thought’ is the subversion of a form of work that the dominant culture considers entirely utilitarian.

Thinking outside of the screen (#1)

I’m starting a new exploratory project to build a screen-less programming language based on two needs:

  • A difficulty with teaching kids programming in my CodeClub, where they become lost ‘in the screen’. It’s a challenge (for any of us really, but for children particularly) to disengage and think differently – e.g. to draw a diagram to work something out, or to work as part of a team.
  • A problem with performing livecoding, where a screen represents a spectacle – or even worse, a ‘school blackboard’ that we as an audience expect ourselves to have to understand.

I’ve mentioned this recently to a few people and it seems to resonate, particularly in regard to a certain mismatch between children’s ability to manipulate physical objects and their fluid touchscreen usage. So, with my mind on the ‘pictures under glass’ rant and taking betablocker as a starting point (and weaving code as one additional project this might link with), I’m building some prototype hardware to provide the Raspberry Pi with a kind of external physical memory, which could comprise symbols made from carved wood or 3D printed shapes – while still describing the behaviour of real software. I also want to avoid computer vision, for a more understandable ‘pluggable’ approach with less slightly faulty ‘magic’ going on.

Before getting too theoretical I wanted to build some stuff – a flexible prototype for figuring out what this sort of programming could be. The Raspberry Pi has 17 configurable I/O pins on its GPIO interface, so I can use 5 of them as an address lookup (for 32 memory locations to start with, expandable later) and 8 bits as input for code or data values at these locations.
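
To make that concrete, here’s roughly how reading one memory location looks from the Pi side using the standard RPi.GPIO library – set the 5 address pins, then read the 8 data pins back. The pin numbers are placeholders, not my actual wiring:

```python
import RPi.GPIO as GPIO

ADDRESS_PINS = [2, 3, 4, 17, 27]            # 5 bits: 32 locations
DATA_PINS = [5, 6, 13, 19, 26, 16, 20, 21]  # 8 bit code/data values

GPIO.setmode(GPIO.BCM)
for pin in ADDRESS_PINS:
    GPIO.setup(pin, GPIO.OUT)
for pin in DATA_PINS:
    GPIO.setup(pin, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)

def peek(address):
    """Read the 8 bit value plugged in at one of the 32 locations."""
    for bit, pin in enumerate(ADDRESS_PINS):
        GPIO.output(pin, (address >> bit) & 1)
    value = 0
    for bit, pin in enumerate(DATA_PINS):
        value |= GPIO.input(pin) << bit
    return value

# Scan the whole physical memory into a list.
program = [peek(location) for location in range(32)]
```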

The smart thing would be to use objects that identify themselves with a signal, using serial communication down a single wire with a standard protocol. The problem with this is that it would make potential ‘symbol objects’ themselves fairly complicated and costly – and I’d like to make it easy and cheap to make loads of them. For this reason I’m starting with a parallel approach where I can just solder across pins on a plug to form a simple 8 bit ID, and restrict the complexity to the reading hardware.

Testing the 74HC4067 16-channel analog multiplexer/demultiplexer

I’ve got hold of a bunch of 74HC4067 multiplexers, which allow you to select one signal from 16 inputs (or the other way around) using 4 bits – and stacking them up, one for each of the 8 bits across the 16 memory locations. This was the furthest I could go without surface mount ICs (well beyond my wonky soldering abilities).

Building the board (with narrow 24 pin IC holders hacked by slicing them down the middle). The input comes in via a common bus down the centre of the board.
Solder practice
Testing the first 4 bits on the breadboard

Now 4 bits are working, it’s harder to test with an LED – so next up is getting the Raspberry Pi attached.

Neural Network livecoding and retrofitting ZX Spectrum hardware

An experimental, and quite angry, neural network livecoding synth (with an audio ‘weave’ visualisation) for the ZX Spectrum: source code and TZX file (for emulators). It’s a bit hard to make out in the video, but you can move around the 48 neurons and modify their synapses and trigger levels. There are two clock inputs, and the audio output is the purple neuron at the bottom right. It allows recurrent loops as a form of memory, and some quite strange things are possible. The keys are listed below, followed by a rough sketch of the update rule:

  • w,d: move diagonally north west <-> south east
  • s,e: move diagonally south west <-> north east
  • t,y,g,h: toggle incoming synapse connections for the current neuron
  • space: change the ‘threshold’ of the current neuron (bit shifts left)
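
The Z80 source is the authoritative version, but the idea is simple enough to sketch in Python – each neuron sums its firing inputs and fires when it crosses its threshold, and recurrent connections let activity loop as a form of memory. The 48 neurons, two clocks and bit-shifting thresholds come from the description above; everything else is my simplification:

```python
import random

NUM = 48
# Each neuron has a couple of incoming synapses and a threshold.
synapses = [[random.randrange(NUM) for _ in range(2)] for _ in range(NUM)]
threshold = [1] * NUM  # pressing space would bit shift these left (double)
fired = [0] * NUM

def tick(clock_a, clock_b):
    """One update step: two neurons are driven by the clock inputs,
    every other neuron fires if its summed inputs reach its threshold."""
    global fired
    activation = [sum(fired[src] for src in synapses[n]) for n in range(NUM)]
    next_fired = [int(activation[n] >= threshold[n]) for n in range(NUM)]
    next_fired[0], next_fired[1] = clock_a, clock_b  # the clock inputs
    fired = next_fired
    return fired[NUM - 1]  # the audio output neuron

# Run for a while to get a 1 bit audio stream.
audio = [tick(t % 2, int(t % 3 == 0)) for t in range(256)]
```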

This audio should load up on a real ZX Spectrum:

One of the nice things about tech like this is that it’s easily hackable – this is a modification to the video output, better explained here: you can get a standard analogue video signal by connecting the internal feed directly to the plug and bypassing the TV RF modulator with a tiny bit of soldering. Look at all those discrete components!

IMG_20140809_125815

Thinking Digital 2014

Last week I had the honour of both performing with Alex and presenting at Thinking Digital 2014. Suzy O’Hara invited me to represent the intersection of art, science and education of FoAM Kernow and present the work I’ve been doing with the Sensory Ecology Group at Exeter University. I did a quick Egglab game demo and related some thoughts on working with scientists and how it connects with my experience teaching programming in the classroom.

It was an interesting and unusual venue for me; organiser Herb Kim is very much developing on the TED theme – so lots of extremely well considered, motivating and inspiring talks. Much of the context was one of venture capital and startup business, so it was interesting to see an explosive talk by Aral Balkan on the implications of Facebook and Google’s business models for the future of our society (he included some of the other presenters’ companies too). This reminded me very much of the themes we explored in Naked on Pluto, but coming from a new angle.

Personally I found his talk challenging, as he roundly attacked the free software movement – for essentially providing a great sandbox for enthusiasts and well funded companies, but being incapable of doing much more in terms of data security for real people. As a designer, he sees this as essentially a design problem – one that these companies have solved for themselves but that is utterly lacking in devices such as the Firefox OS phone. For Aral, this is fundamentally a design problem that needs its own movement, and new business models to be developed. These business models need to take into consideration long term usability (for which user privacy is an essential feature) rather than the ultra short term ‘pump and dump’ profit of selling people’s information for vast amounts – i.e. Silicon Valley ‘business as usual’.

Two things became apparent to me following this talk. One is that I have been labouring under the impression that a particular focus on design is somehow implicitly tied to specific business practices – simplification as wallpapering over data harvesting, and other tactics. This is very much a short-sighted developer’s view, and it’s wrong – design can of course service different types of businesses.

The other point came during his 3 slide explanation of how to start your own social network (1. fork a github repo, 2. set up a server and 3. install it). Clearly even this satirical simplification is beyond all but existing software developers (many of whom are working for companies reliant on user surveillance in some indirect or direct way). The challenge for me is that I can’t ultimately see a way to make ‘interface as user experience’ ever converge on anything other than exploitation. Can ‘user experience’ ever regard people philosophically as anything but consumers – regardless of the underlying business model?

The problem in solving that is that we now have two problems – the terrible state of software engineering preventing accessibility (i.e. programming at large is still stuck in the 70s), and society’s lack of understanding of what a computer is and how it works. The second of these problems is being addressed in some part by the activities of CodeClub (Aral is [correction: was] a director of this organisation) and similar education initiatives. Regarding pushing software engineering forward, I think the recent livecoding takeup by musicians over programmers is a fascinating development here, in terms of showing how programming – when it’s taken and twisted into very strange and new forms – can start to make sense and work for ‘real people’.

Thoughts on teaching programming with Minecraft and Python

Saturday saw the first dBsCode taster workshop, for budding programmers between 11 and 16. We set up 20 Raspberry Pis, which we networked together, and used our new procedural Minecraft 3D shape primitives to build a number of projects in Python involving castles, spiders and an infinite house generator.

dbsmcpi-castle

dbscode

Networking was very important – it enabled them to jump into each other’s games, which started as a major distraction but ended up being useful, as people could see what the others were doing and work together.

As part of this, some level of ‘griefing’ was present (where other players interfere in your world), but asking them about this during the breaks, the consensus was that it’s part of the culture of Minecraft – I even instructed victims to pull their network cables out, but they saw that as completely unnecessary. Most of them took turns at both the constructive/collaborative programming activity and the destructive/graffiti-like activities associated with griefing, and this was acceptable to them. The ability to use code to easily rebuild structures after damage (or to remove people with liberal procedural application of lava) was therefore quite an attractive feature of the code approach.

It was common for them to adapt the programming to fit their understanding of Minecraft, so it seemed natural to take a hybrid approach to building – using Python to do the heavy lifting (building or extracting large areas) and the repetitive tasks, after which they did the fiddly or more creative bits manually as normal. For example, one student used the sphere primitive to extract huge caves underground, and then filled them with little hand-made shrines and sculptures.
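
Our shape primitives aren’t included in this post, but the cave extraction trick is easy to sketch against the standard mcpi Python API that ships with Minecraft: Pi Edition – a sphere is just every block within a radius of a centre point, and setting those blocks to AIR digs instead of builds. The sphere() function here is my own illustration, not necessarily the dBsCode primitive:

```python
from mcpi.minecraft import Minecraft
from mcpi import block

mc = Minecraft.create()  # connect to the game running on this Pi

def sphere(cx, cy, cz, radius, block_id):
    """Set every block within radius of (cx, cy, cz) - using AIR
    as the block id extracts a spherical cave instead of building."""
    for x in range(-radius, radius + 1):
        for y in range(-radius, radius + 1):
            for z in range(-radius, radius + 1):
                if x * x + y * y + z * z <= radius * radius:
                    mc.setBlock(cx + x, cy + y, cz + z, block_id)

# Hollow out a cave just below the player:
pos = mc.player.getTilePos()
sphere(pos.x, pos.y - 10, pos.z, 6, block.AIR.id)
```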

What was interesting was that, to some extent, the ‘experts’ at Minecraft (many of whom already document their creations on YouTube channels) found this more challenging in many ways than those less attached to the Minecraft culture, who were more accepting of new ways of doing things.

The manner of programming we used here – running Geany (a lightweight Python IDE) at the same time as Minecraft – was completely in line with livecoding practice, as programs interacted with the Minecraft world in realtime via network messages. The great thing about the rather slow network bottleneck is that you can walk around a structure while the program is creating it, which allows a much better understanding of how the process works than if it instantly popped into existence.