Category Archives: Livecoding

Red King: Host/Parasite co-evolution citizen science

A new project begins, on the subject of the ecology and evolution of infectious disease. This one is a little different from many of Foam Kernow’s citizen science projects, in that the subject is theoretical research – it involves mathematical simulations of populations of co-evolving organisms, rather than the direct study of real organisms at field sites.

The simulation, or model, we are working with is concerned with the co-evolution of parasites and their hosts. Just as in the more commonly known simulations of predators and prey, there are complex relationships between hosts and parasites – for example, if the parasites become too successful and aggressive, the hosts start to die out, in turn reducing the parasite population. Hosts can evolve to resist infection, but this carries an overhead that becomes a disadvantage once most of a population is free of parasites again.
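
To make the kind of feedback loop described above concrete, here is a toy discrete-time sketch in Scheme (the language used across these projects). The rates are invented for illustration, and are not taken from the project’s model:

(define (step hosts parasites)
  (let ((infections (* 0.0005 hosts parasites)))
    ;; hosts breed at a fixed rate but lose members to infection;
    ;; parasites reproduce via infections and die off without hosts
    (cons (+ hosts (* 0.05 hosts) (- infections))
          (+ parasites (* 0.9 infections) (* -0.1 parasites)))))

(define (run steps hosts parasites)
  (when (> steps 0)
    (display (list hosts parasites)) (newline)
    (let ((next (step hosts parasites)))
      (run (- steps 1) (car next) (cdr next)))))

;; (run 100 1000.0 50.0) steps the two populations forward together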

Example evolution processes with different host/parasite trade-offs.

Over time these relationships shift and change, and this happens in different patterns depending on the starting conditions. Little is known about the categorisation of these patterns, or even the range of relationships possible. The models used to simulate them are still a research topic in their own right, so in this project we are hoping to explore different ways people can both control a simulation (perhaps with an element of visual live programming) and experience the results in a number of ways – via sonification, or a game world. The eventual, ambitious aim is to provide a way for people to feed back their discoveries into the research.


The UAV toolkit & appropriate technology

The UAV toolkit’s second project phase is now complete. The first development sprint at the start of the year was research into what we could use an average phone’s sensors for, resulting in a proof-of-concept remote sensing Android app that let you visually program different scripts, which we then tested on some drones, a radio controlled plane and a kite.


This time we had a specific focus on environmental agencies. Working with Katie Threadgill at the Westcountry Rivers Trust has meant we’ve had to think about how this could be used by real people in an actual setting (farm advisors working with local farmers). Making something cheap, open source and easy to use, yet open ended, has been the focus – and we are now looking at providing WRT with a complete toolkit comprising a drone (for good weather), a kite (for bad weather/no flight licences required) and an Android phone, so they don’t need to worry about destroying their own if something goes wrong. Katie has produced an excellent guide on how the app works.

The idea of appropriate technology has become an important philosophy for the projects we are developing at Foam Kernow, in conjunction with unlikely connections in livecoding and our wider arts practice. For example, in the Sonic Bike project we restricted the technology from the start so that no ‘cloud’ network connections are required and all the data and hardware has to fit on the bike, with no data “leaking” out.


With the UAV toolkit, the open-endedness of providing a visual programming system that works on a touchscreen results in an application flexible enough to be used in ways and places we can’t predict: for example in crisis situations, where power, networking or hardware is not available to set up remote sensing devices when they are needed most. We are working towards a self-contained system, and what I’ve found interesting is how many interface and programming ‘guidelines’ I have to bend to make this possible – open-endedness is very much against the grain of contemporary software design philosophy.

The “app ecosystem” is ultimately concerned with elevator pitches: do one thing, and boil it down to the fewest actions possible to achieve it. This is not a problem in itself, but the assumption that this is the only philosophy worth considering is wrong. One recent experience that comes to mind is having to make and upload banner images of an exact size to the Play Store before it would allow me to release an important fix for Mongoose 2000, an app only ever intended to have 5 or 6 users.


For the UAV toolkit, our future plans include stitching together photos captured on the phone and producing a single large map without the need for any other software on a laptop. There are also interesting possibilities for distributed networking with bluetooth and similar radio systems – for example sending code between phones, as currently there is no way to distribute scripts amongst users. This could also be a way of creating distributed processing: controlling one phone in a remote location from another, via code sent over ad-hoc wifi or SMS for example.

Weavecoding performance experiments in Cornwall

Last week the weavecoding group met at Foam Kernow for our Cornish research gathering. As we approach the final stages of the project, our discussions turn to publications and to which ideas from the start need revisiting. While they were here, I wanted to give local artists and researchers working with code and textiles a chance to meet Ellen, Emma and Alex. As we are a non-academic research organisation, I wanted to avoid the normal powerpoint talks/coffee events and try something more informal and inclusive.


One of the original ideas we had was to combine weaving and coding in a performance setting, both to make livecoding more inclusive through weaving and to highlight the digital thought processes involved in weaving. Amber made vegetarian sushi for our audience and we set up the Jubilee Warehouse with a collection of experiments from the project:

  • The newly warped table loom, with a live camera/projection from underneath the fabric as it was woven, and codes for different weaves on post-it notes for people to try.
  • The tablet/inkle loom to represent ancient weaving techniques.
  • The pattern matrix tangible weavecoding machine and Raspberry Pi.
  • A brand new experiment by Francesca with a dancemat connected to the pattern matrix software for dance code weaving!
  • The slub livecoding setup.


This provided an opportunity for people to try things out and ask questions/provide discussion starting points. Our audience consisted of craft researchers, anthropological biologists, architects, game designers and technologists – so it all went on quite a lot longer than we anticipated! Alex and I provided some slub livecoded music to weave by, and my favourite part was the live weaving projection – with more projectors we could develop this combination of code and weaving performance further. Thanks to Emma for all the videos and photos!


Airborne drag-drop programming, the next steps

This autumn we are continuing work on the UAV toolkit with Karen Anderson and her research group at the Environment and Sustainability Institute. This time we have a mission to help the Westcountry Rivers Trust by coming up with fast and cheap ways they can build maps of farms to determine water run-off problems, which gives farmers the proof they need to get funding to fix pollution issues.

Flight planning operations and photo checking

In order to make the software usable in this case, we decided on two directions. On the one hand, there needs to be a simple way to start and stop programs (or “flight modes”) that read sensor data, as well as to define certain global settings, e.g. flight altitude, desired image coverage etc. At the same time, the code defining what these modes do needs to remain programmable in the app, and more complex behaviours need to be possible to support both kites and UAVs. Our philosophy is that it has to be open ended, as we don’t know where the toolkit might be useful (e.g. crisis mapping situations) or what new sensors will be available on a device in the future.

The new main screen

One specific set of new behaviours we need is for kite mapping. We already have the ability to choose when to take pictures based on GPS and altitude, but with a kite there can be lots of turbulence and the camera is in a much less controlled state, flipping around taking shots of the sky etc. So we need to calculate things like jerk from the change in acceleration, and use the orientation sensors to only take photos when the lens is pointing directly down, within some acceptable margin.
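
Here is a simplified Scheme sketch of these checks. The toolkit combines magnetometer and accelerometer data; this version uses the accelerometer alone, and the names and thresholds are my illustrative assumptions, not the toolkit’s built-in functions:

(define max-jerk 3.0)   ;; assumed stability threshold
(define min-down 0.9)   ;; cosine of the acceptable angle from straight down

;; jerk is the rate of change of acceleration
(define (jerk accel last-accel dt)
  (/ (abs (- accel last-accel)) dt))

;; pointing straight down puts nearly all of gravity on the z axis,
;; so compare |z| against the magnitude of the whole vector
(define (pointing-down? x y z)
  (> (/ (abs z) (sqrt (+ (* x x) (* y y) (* z z)))) min-down))

(define (take-photo? x y z accel last-accel dt)
  (and (pointing-down? x y z)
       (< (jerk accel last-accel dt) max-jerk)))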

Below is a section of the code that calculates whether we are pointing down using the magnetometer and accelerometer – the drag-drop visual code can now be used to build normal Scheme functions using a touchscreen (a bit like scheme bricks). In fact I managed to do all of this work on the phone. There are now two types of code: the main programs or “flight modes” that you can run from the front screen, and a library of editable functions which they use. This means there are now three levels at which the software can be used: using it without needing to see any of the code at all, editing basic behaviour such as which sensors’ data are captured, and finally modifying the more detailed code to make it do completely new things.
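
As a hypothetical illustration of those levels, a “flight mode” program might call the editable take-photo? function from the sketch above like this (the sensor accessor is a stub, not the toolkit’s real API):

(define (read-accel) (list 0.1 -0.2 9.7))   ;; stub accelerometer reading
(define last-mag 9.8)                       ;; previous acceleration magnitude

(define (magnitude v) (sqrt (apply + (map * v v))))

(define (kite-camera-mode dt)
  (let* ((a (read-accel))
         (mag (magnitude a)))
    (when (take-photo? (car a) (cadr a) (caddr a) mag last-mag dt)
      (display "take photo") (newline))
    (set! last-mag mag)))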


Picademy Exeter and Future Thinking for Social Living

Last week I had the chance to help out the Raspberry Pi Foundation at their Picademy in Exeter. It was good to meet up with Sam Aaron again to talk livecoding on Pis, and also to see how they run these events. They are designed to help local teachers get more confident with computers, programming and electronics, to the point where they can start designing their own teaching materials on the second day of the two-day course. This is a model I’m intending to use for the second inset teacher training day I’m doing next week at Truro school – it’s pretty exciting to see the ideas the teachers have for activities for their pupils, and a good challenge to help find ways to bring them into existence in a day.


We also had the ending of Future Thinking for Social Living at the Miners Court summer party last week. We exhibited the map made during the workshops, made lots of tea, and had some fun with the pattern matrix in musical mode out in the garden. I adapted Alex’s music system that we used with Ellen in Munich to run on a Raspberry Pi, so it didn’t require a laptop or a screen at all, simply a speaker. It was interesting how quickly people got the idea; in many ways music is easier to explain than weaving, as listening while coding is multi-sensory.


Weavecoding Munich

Ellen’s exhibition in Munich was always going to be a pivotal event in the weavecoding project – one of the first opportunities to expose our work to a large audience. The Museum of Casts of Classical Sculptures was the perfect context for the mythical aspects of weaving: overlooked by Penelope and friends, with her subversive woven/unwoven work, we could explore the connections between livecoding and weaving.


Practically, we focused on developing the tangible weavecoding exhibit for events later in the week, as well as discussing the many languages we have developed so far for different looms and weaving techniques. One of our discoveries is that none of the models or languages we have created seems sufficient in itself – weaving may simply be too big to be described or solved from a single perspective. We’ve tried approaches describing weave structures from the actions of the weaver, the setup of the loom and the structure of the fabric; perhaps the most promising is to explore the story of weaving from the perspective of the thread itself.


One of the distinctive things about weaving in antiquity is how multiple technologies were combined to form a single piece of fabric: weaving in different directions, weft becoming warp, the use of tablets vs warp-weighted weaving. Explaining this via the path of a single conceptual thread crossing through itself may make it possible to describe weaving in a more flexible, declarative and abstracted manner than explaining each method separately as if in its own world.


The pattern matrix is now in good shape for explaining the relationship between colour and structure in pattern formation. For the first time we also used all 4 sensors per block on the bottom row, which meant we could use a special “colour” block that the system recognises as distinct from the normal warp/weft ones, using its rotation to choose between 8 preset colour settings. This was quite a breakthrough, as it had all been theoretical before.
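
A hypothetical sketch of how recognition like this can work: four sensors give a bit pattern, the pattern identifies the block type, and the rotation at which it matches gives the orientation used to select a preset. The signature value here is invented, and this assumes the sensors are listed in clockwise order so a quarter turn is a cyclic shift:

(define colour-block-signature '(#t #t #f #t))   ;; invented example pattern

(define (rotate p) (append (cdr p) (list (car p))))

;; returns 0-3 for the rotation at which the reading matches, or #f
(define (match-rotation reading pattern rot)
  (cond ((equal? reading pattern) rot)
        ((= rot 3) #f)
        (else (match-rotation reading (rotate pattern) (+ rot 1)))))

;; e.g. (match-rotation '(#t #t #t #f) colour-block-signature 0) => 3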


Adding this more complex use of the magnetic patterns meant that Alex could set up the matrix as a tangible interface for his tidal livecoding software, so Ellen could join us for a collaborative slub weavecoding performance on the Saturday evening. The prospect of performing together was something we had talked about since the very beginning of the project, so it was great to finally reach this point. The reverb in the museum was vast, meaning that we had to play the space a lot and provide ‘music for looking at sculptures by’.

Loose threads from weavecoding

Midway through the weavecoding project, our researches have thrown up a whole load of topics that either don’t quite fit into our framework or that we simply won’t have time to pursue properly. Here are some of the tangents I’ve collected so far.

Coding with knots: Khipu

One of the cultures I’m increasingly interested in is the Incas. Their empire grew to up to 37 million people without the need for money or a written language. We know that some numeric information was stored using khipu, a knot-based recording system which was used in combination with black and white stones to read and calculate. Two thirds of the khipus we have are untranslated and do not fit into the known numeric coding system – what information do they hold?
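
For reference, the known numeric coding reads each cord as a decimal number: clusters of knots are digits, and position along the cord gives the power of ten. A minimal sketch of that reading (an illustration of the accepted system, not the khipu database format):

(define (decode-cord clusters)
  ;; clusters: digit values, most significant first
  (let loop ((c clusters) (total 0))
    (if (null? c)
        total
        (loop (cdr c) (+ (* total 10) (car c))))))

;; e.g. clusters of 4, 0 and 2 knots: (decode-cord '(4 0 2)) => 402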


Harvard University provides a Khipu Database Project with many surviving examples documented – I’m hoping to run a workshop soon to look through some of this data in a variety of ways.

Tablet weaving NAND gates

Diagram thanks to Phiala’s String Page – the only place I’ve seen tablet weaving explained properly.

There are logic gates in tablet weaving. I haven’t fully figured this out yet, but while modelling tablet weaving I noticed that you end up essentially mapping combinations of the weaving actions (such as turn direction) and colour as truth tables.

Top face colour based on top-left/top-right hole yarn in a single card and turn direction (clockwise/counter-clockwise)

TL Yarn : TR Yarn : Turn : Top face colour
--------------------------------------------
Black   : Black   : CCW  : Black
Black   : Black   : CW   : Black
Black   : White   : CCW  : Black
Black   : White   : CW   : White
White   : Black   : CCW  : White
White   : Black   : CW   : Black
White   : White   : CCW  : White
White   : White   : CW   : White

Things get stranger when you include twist and combinations of actions with multiple cards. Would it be possible to compile high-level programming languages into weaving instructions for carrying out computation? Perhaps this is what the untranslatable khipus are about?
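
One way to read the table above is that the turn direction selects which yarn shows: counter-clockwise gives the top-left yarn, clockwise the top-right. In other words a single card behaves like a 2-to-1 multiplexer, and multiplexers with constant inputs can be composed into any gate. A sketch of that reading (#t = black, #f = white; a hypothetical composition, not a practical loom recipe):

(define (top-face-colour tl-yarn tr-yarn turn)
  (if (eq? turn 'ccw) tl-yarn tr-yarn))   ;; the truth table, directly

(define (weave-not a)                     ;; black in, white out
  (top-face-colour #t #f (if a 'cw 'ccw)))

(define (weave-nand a b)                  ;; NAND from two "cards"
  (top-face-colour #t (weave-not b) (if a 'cw 'ccw)))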

Nintendo made a knitting machine

We could really do with some of these; unfortunately they never went beyond the prototype stage.


Asemic writing

Asemic writing is a post-literate written form with no semantic content. Miles Visman programs procedural asemic languages and hand weaves them. I think this may be an important connection to livecoding at some point.


Screenless music livecoding

Programming music with flotsam: for the first time, truly screen-less livecoding. All the synthesis is done on the Raspberry Pi too (a Raspbian release is in the works). One of the surprising things I find with tangible programming is the enforced scarcity of tokens; having to move them around provides a situation that is good to play around with, in contrast to being able to simply type more stuff into a text document.

The programming language is pretty simple and similar to the yarn sequence language from the weavecoding project. The board layout consists of 4 rows of 8 possible tokens, and each row represents a single L-system rule:

Rule A: o o o o o o o o
Rule B: o o o o o o o o
Rule C: o o o o o o o o
Rule D: o o o o o o o o

The tokens themselves consist of 13 possible values:

a,b,c,d : The 'note on' triggers for 4 synth patches
. : Rest one beat
+,- : Change current pitch
<,> : Change current synth patch set
A,B,C,D : 'Jump to' rule (can be self-referential)
No-token: Ends the current rule

The setup currently runs to a maximum depth of 8 generations, so a rule referring to itself expands 8 times. A single rule ‘A’ such as ‘+ b A - c A’ expands to this sequence (the first 100 symbols of it, anyway):

+b+b+b+b+b+b+b+b+b+b-c-c+b-c-c+b+b-c-c+b-c-c+b+b+b-c-c+b-c-c+b+b-c-c+b-c-c+b+b+b+b-c-c+b-c-c+b+b-c-c
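
A minimal Scheme sketch of this expansion, assuming rules are lists of symbols and an unexpanded rule token is simply dropped when the depth limit is reached (the names here are mine, not flotsam’s):

(define rules
  '((A . (+ b A - c A))))   ;; the example rule from above

(define (expand sym depth)
  (cond ((not (assq sym rules)) (list sym))   ;; terminal token: emit it
        ((zero? depth) '())                   ;; depth limit: drop the rule
        (else (apply append
                     (map (lambda (s) (expand s (- depth 1)))
                          (cdr (assq sym rules)))))))

;; (expand 'A 8) => (+ b + b + b ...) - the symbol stream the synths play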

I’m still working on how best to encode musical structures this way, as it needs a bit more expression; something to try is running rules in parallel so you can have different sequences at the same time. With a bit more tweaking (and with upcoming hardware improvements) the eventual plan is to use this in some kids’ programming teaching projects.