All posts by dave

Sonic Kayaks: musical instruments for marine exploration

Here is a bit of a writeup of the gubbins going into the Sonic Kayaks project. We only have a few weeks to go until the kayaks’ maiden voyages at the British Science Festival, so we are ramping things up: last week Kirsty Kemp, Kaffe Matthews and Chris Yesson joined us at FoAM Kernow for an intense week of testing and production. You can read Amber’s report on the week here.



The heart of the system is the Raspberry Pi 2. This is connected to a USB GPS dongle and runs the sonic bike software we have used in many cities over the last couple of years, with some crucial additions such as two water temperature sensors and a hydrophone. We have also switched all the audio processing over to Pure Data, so we can do a lot more sound-wise – such as sonifying sensor data directly.

How to do this well has been tricky to get right. There is a trade-off between constant irritating sound (in a wild environment this is more of a problem than in a city, as we found out in the first workshop) and ‘overcooking’ the sound so it’s too complex to tell what the sensors are actually reporting.


This is the current Pd patch – I settled on cutting out the sound when there is no change in temperature, so you only hear anything when you are paddling through a temperature gradient. The pitch represents the current temperature, but it’s normalised to the running minimum and maximum the kayak has observed. This makes it much more sensitive, but means it takes a few minutes to self-calibrate at the start. Currently the pitch ranges from 70 to 970 Hz, with a little frequency modulation at 90 Hz to make the lower end more audible.
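
The patch logic boils down to something like this sketch (in Python rather than Pure Data, with a made-up function signature – the numbers are the ones described above):

def sonify_temperature(temp, prev_temp, seen_min, seen_max):
    # self-calibration: track the running min/max observed so far
    seen_min = min(seen_min, temp)
    seen_max = max(seen_max, temp)
    # normalise to the observed range, which is what makes it so sensitive
    span = seen_max - seen_min
    norm = (temp - seen_min) / span if span > 0 else 0.0
    pitch = 70.0 + norm * (970.0 - 70.0)  # map to the 70..970 Hz range
    # cut the sound entirely when the temperature isn't changing
    amplitude = 1.0 if temp != prev_temp else 0.0
    return pitch, amplitude, seen_min, seen_max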

Here it is on the water with our brand new multi-kayak compatible mounting system and 3D printed horn built in Blender. The horrible sound right at the start is my rubbish phone.

In addition to this we have the hydrophone, which is really the star of the show. Even with a preamp we’re having to boost it twelvefold in Pure Data to hear things, but what we do hear is both mysterious and revealing. It seems that boat sounds are really loud – you can hear engines from quite a long way off, useful for expanding your kayak senses when they are behind you. We also heard snapping sounds from underwater creatures, and further up the Penryn river you can hear chains clinking; there seems to be a general background sound that changes as you move around.
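
The boost itself is just a multiply in the Pd patch; the equivalent sketched with numpy (the clipping is my addition, to avoid hard overflow):

import numpy as np

def boost(samples, gain=12.0):
    # scale the quiet hydrophone signal up, clamped to the valid -1..1 range
    return np.clip(samples * gain, -1.0, 1.0)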

We still want to add a layer of additional sounds to this experience for the Swansea festival, for people to search for out on the water. We are planning different areas, so you can choose to paddle into or away from “sonic areas” comprising multiple GPS zones. We spent the last day with Kaffe testing some quick ideas out:
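
One way the zone triggering could work is sketched below – the zone format and trigger logic here are assumptions for illustration, as the sonic bike software has its own zone handling:

import math

def haversine_m(lat1, lon1, lat2, lon2):
    # great-circle distance between two GPS fixes, in metres
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def zones_hit(lat, lon, zones):
    # zones: list of dicts with "lat", "lon", "radius" (m) and "sound" keys;
    # returns the sounds to play for every zone the kayak is currently inside
    return [z["sound"] for z in zones
            if haversine_m(lat, lon, z["lat"], z["lon"]) < z["radius"]]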

Looking at sea temperature and sensing the hidden underwater world, climate change is the big subject we keep coming back to, so we are looking for ways to approach this topic with our strange new instrument.

FoAM Kernow crypto ‘tea party’

Last night we ran an experimental cryptoparty at FoAM Kernow. We hadn’t tried something like this before, nor do we have any particular expertise with cryptography – so this was run as a research gathering for interested people to find out more about it.

One of the misconceptions about cryptography I wanted to start with is that it’s just about hiding things. We looked at Applied Cryptography by Bruce Schneier, where he starts by explaining the three things cryptography provides beyond confidentiality:

  • Authentication. It should be possible for the receiver of a message to ascertain its origin; an intruder should not be able to masquerade as someone else.
  • Integrity. It should be possible for the receiver of a message to verify that it has not been modified in transit; an intruder should not be able to substitute a false message for a legitimate one.
  • Nonrepudiation. A sender should not be able to falsely deny later that they sent the message.

It’s interesting how confidentiality is tied up with these concepts – you can’t compromise one without damaging the others. It’s also interesting that these are requirements for human communication, yet we’ve become so used to living without them.

One of the most interesting things was hearing people’s motivations for coming along to find out more about this subject. There was a general feeling of loss of control over online identity and data. Some of the more specific aspects:

  • Ambient data collection and our identity increasingly becoming a commodity – being modelled for purposes we have no control over.
  • Centralisation of communication being a problem – e.g. gmail.
  • Never knowing when your privacy might become important, e.g. you find yourself in an abusive relationship and suddenly it matters.
  • Knowing that privacy is something we should be thinking about but not wanting to – similar to knowing we shouldn’t be using the car but doing it anyway.
  • Awareness that our actions don’t just affect us, but our families’, friends’ and colleagues’ privacy too – we need to think about them as well.
  • Worrying when googling about health, financial or legal subjects.
  • Being aware that email is monitored in the workplace.

We talked about the encryption we already use – GPG for email with Thunderbird, and Tor for browsing anonymously. One of the tricky areas we discussed was setting this kind of thing up for mobile – do you need specific apps, an entire OS or specific hardware? This is something we need to spend a bit more time looking into.

Personally speaking, I use a free firewall on my phone so I can at least control which apps can be online – I only became aware of the need for this from developing for Android and seeing the amount of ‘calling home’ that completely arbitrary applications do regularly.

We also discussed asymmetric key pair cryptography – how the mathematics meshes so neatly with social conventions, so you can sign a message to prove you wrote it, or sign someone else’s public key to build up a ‘web of trust’.
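
As a minimal illustration of the signing side (using Python’s cryptography library with Ed25519 keys rather than gpg itself, purely to show the principle):

from cryptography.hazmat.primitives.asymmetric import ed25519

private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()

message = b"I wrote this"
signature = private_key.sign(message)  # only the private key holder can make this

# anyone with the public key can check the message really came from us;
# verify() raises InvalidSignature if message or signature were tampered with
public_key.verify(signature, message)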

We didn’t get very practical; this was more about discussing the issues and feelings on the topic. That might be something to think about for future cryptoparties.

More PPU coding on the NES/Famicom

After getting sprites working in Lisp on the NES for our “What Remains” project, the next thing to figure out properly is the background tiles. With the sprites you simply have a block of memory you can edit at any time, then copy the whole lot to the PPU each frame in one go – the tiles involve a bit more head-scratching.

The PPU graphics chip on the NES was designed in a time when all TVs were cathode ray tubes, using an electron gun to build a picture up on a phosphor screen. As this scans back and forth across the screen, the PPU is busy altering its signal to draw pixel colours. If you try to alter its memory while it’s doing this you get glitches. However, it’s not drawing all the time – the electron gun needs to reset to the top of the screen each frame, so you get a window of time (2273 cycles) to make changes to PPU memory before it starts drawing the next frame.

(Trying out thematic images and some overlapping text via the display list)

The problem is that 2273 cycles is not very much – not nearly enough to run your game in, and only enough to update approximately 192 background tiles per frame, as DMA is a slow operation. It took me a while to figure this out, as I was trying to transfer an entire screenful in one go – which sort of works, but leaves the PPU in an odd state.

The solution is one familiar from modern graphics hardware – a display list. This is a buffer you can add instructions to at any time in your game, which are then acted on only in the PPU access window. It separates the game code from the graphics DMA, and is very flexible. We might want to do different things here, so we have a set of ‘primitives’ that run different operations. Given the per-frame restriction, the buffer also limits the bandwidth, so the game can add a whole bunch of primitives in one go which are then gradually dispatched – you can see this in a lot of NES games, as it takes a few frames to do things like clear the screen.

There are two kinds of primitives in the What Remains prototype game engine so far; the first sets the tile data directly:

(display-list-add-byte 1) ;; pattern id for the first tile
(display-list-add-byte 2) ;; ...the second
(display-list-add-byte 3) ;; ...and the third
(display-list-end-packet prim-tile-data 0 0 3) ;; primitive, address hi/lo, size

This overwrites the first 3 tiles at the top left of the screen with patterns 1, 2 and 3. First you add bytes to a ‘packet’, which can have different meanings depending on the primitive used; then you end the packet with the primitive type constant, the high and low bytes of a 16 bit address offset for the PPU destination, and a size. The reason this is done in reverse is that it’s a stack, read from the ‘top’, which is a lot faster – we can use a position index that is incremented when writing and decremented when reading.
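
The mechanics look something like this sketch (in Python for clarity – the real thing is 6502 assembly generated by the compiler, and the constant name is a stand-in):

PRIM_TILE_DATA = 0  # stand-in for the prim-tile-data constant

class DisplayList:
    def __init__(self):
        self.buf = []  # the position index is implicit in the list length

    def add_byte(self, b):
        self.buf.append(b)  # writing increments the position index

    def end_packet(self, prim, addr_hi, addr_lo, size):
        # the header goes on top of the stack, so it is read back first
        self.buf += [size, addr_lo, addr_hi, prim]

    def dispatch_one(self, ppu_write):
        # called inside the vblank window: pop one packet and act on it
        if not self.buf:
            return
        prim = self.buf.pop()
        addr_hi = self.buf.pop()
        addr_lo = self.buf.pop()
        size = self.buf.pop()
        addr = (addr_hi << 8) | addr_lo
        data = [self.buf.pop() for _ in range(size)]
        if prim == PRIM_TILE_DATA:
            # data bytes pop off in reverse order of addition
            for i, b in enumerate(reversed(data)):
                ppu_write(addr + i, b)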

We could clear a portion of the screen this way with a loop (a built-in language feature in co2 Lisp) to add a load of zeros to the stack:

(loop n 0 255 (display-list-add-byte 0))
(display-list-end-packet prim-tile-data 0 0 256)

But this is very wasteful, as it fills up a lot of space in the display list (all of it, as it happens). To get around this, I added another primitive called ‘value’ which does a kind of run length encoding (RLE):

(display-list-add-byte 128) ;; length
(display-list-add-byte 0) ;; value
(display-list-end-packet prim-tile-value 0 0 2)

With just 2 bytes we can clear 128 tiles – about the maximum we can do in one frame.
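
Continuing the Python sketch above, the ‘value’ primitive would decode along these lines – two packet bytes expanding into a whole run of identical tiles:

def dispatch_tile_value(buf, ppu_write, addr):
    value = buf.pop()   # added last, so read back first
    length = buf.pop()
    for i in range(length):
        ppu_write(addr + i, value)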

Red King progress, and a sonification voting system

We have now launched the Red King simulation website. The fundamental idea of the project is to use music and online participation to help understand a complex natural process. Dealing with a mathematical model is more challenging than much of our citizen science work, where the connection to organisms and their environments is clearer. The basic problem here is how to explore a vast parameter space in order to find patterns in co-evolution.

After some initial experiments we developed a simple prototype native application (OSX/Ubuntu builds) in order to check we understood the model properly by running and tweaking it.


The next step was to convert this into a library we could bind to Python. With this done, we can run the model on a server and have it autonomously update its own website via Django. This way we can continuously run the simulation, storing randomly chosen parameters to build up a database and display the results. I also set up a simple filter that runs the simulation for 100 timesteps and discards parameters that didn’t look so interesting (the ones that went extinct or didn’t result in multiple host or virus strains).
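
The filter amounts to something like this sketch – the redking module and its interface are hypothetical stand-ins for the real Python binding:

import random
import redking  # hypothetical name for the python binding to the model

def looks_interesting(result):
    # discard runs that went extinct or never developed multiple strains
    return (not result.extinct and
            result.num_host_strains > 1 and
            result.num_virus_strains > 1)

def sample_parameters(attempts=100):
    kept = []
    for _ in range(attempts):
        params = {name: random.random() for name in redking.PARAM_NAMES}
        result = redking.run(params, timesteps=100)
        if looks_interesting(result):
            kept.append(params)  # stored in the database for sonification
    return kept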

There is also now a Twitter bot that posts new simulations/sonifications as they appear. One nice thing I’ve found is that I can use the bot’s timeline to make notes on changes by tweeting them. It also gives interested people an easy way to approach the project, and people are already starting discussions with the researchers on Twitter.


Up to now this has simply been a presentation of a simulation – how can we involve people so they can help? This is a small project, so we have to be realistic about what is possible, but eventually we need a simple way to test how the perception of a sonification compares with a visual display. Amber’s been doing research into the sonification side of the project here. More on that soon.

For now I’ve added a voting system, where anyone can up- or down-vote the simulation music. This is used as a way to tag patterns for further exploration. Parameter sets are ranked using the votes – the higher the votes, the higher the likelihood of being picked as the basis for new simulations. When we pick one, we randomise one of its parameters to generate new audio. Simulations store their parents, so you can explore the hierarchy and see which changes cause different patterns. An obvious addition would be to hook up Twitter retweets and favourites for the same purpose.
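
The selection works along these lines (a Python sketch – the data layout and the way a parameter gets re-rolled are assumptions):

import random

def pick_parent(simulations):
    # simulations: list of dicts with "params" and "votes" keys; more votes
    # means a higher chance of being chosen as the basis for a new run
    weights = [max(1, sim["votes"]) for sim in simulations]
    return random.choices(simulations, weights=weights, k=1)[0]

def spawn_child(parent):
    params = dict(parent["params"])
    name = random.choice(list(params))  # randomise just one parameter
    params[name] = random.random()
    # children remember their parent, so the hierarchy can be explored
    return {"params": params, "parent": parent, "votes": 0}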


Cricket Tales released

Cricket Tales is an ambitious citizen science project built around 438 days of CCTV footage from the Wild Crickets Research group – the only record of its kind of the wild behaviour of insects. It turns out that insects have more complex lives and individuality than we thought, and the game is a way of helping to uncover this more precisely. For FoAM Kernow this was also a significant project, as the biggest production that all three of us have worked on together.


My favourite aspect of this project is that the movies are a strangely different way of viewing an ecosystem: tiny close-up areas of a perfectly normal field in northern Spain. The footage runs 24 hours a day, with infrared at night, recording a couple of frames a second only when movement is detected. Some of the videos get triggered simply by the movement of shadows, but there are plenty of moments that we wouldn’t normally notice: worms and bugs of all kinds going about their lives, sudden appearances of larger animals or swarms of ants, condensation of dew at dawn. The crickets themselves mostly have tags stuck to them so we can tell which is which, but other than that this is their normal habitat and way of life. Compared to the study of insects in lab conditions, it’s not surprising they act in more complex ways.

Screenshots from the Spanish version, which I’m particularly proud of (my first experience using GNU gettext with Django).
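
The gettext workflow in Django boils down to marking strings for translation and running the extraction/compilation commands. This is generic Django i18n rather than Cricket Tales’ actual code (and note that gettext was spelled ugettext in older Django versions):

from django.utils.translation import gettext as _

def house_built_message(name):
    # the Spanish .po file maps this msgid to its translated msgstr
    return _("%(name)s built a house for this burrow") % {"name": name}

# then, from the shell:
#   django-admin makemessages -l es   # extract strings into locale/es/.../django.po
#   django-admin compilemessages      # compile .po files to .mo for runtime use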

We combined the task of watching the 1 minute long movies with the ability to build houses for the crickets – we needed to provide a way for people to leave something behind, something that marks progress on this gigantic collective task. You get to design a little house for each burrow, and your name is recorded on the meadow until the next person takes over by watching more videos.


We’ve had plenty of conversations about what kind of people take part in this sort of citizen science activity, and what their motivations may be. We ask a couple of questions when people sign up, and this is something we are interested in researching further across our projects in general. In this case we are interested in depth of involvement more than attracting thousands of brief encounters – it only takes a few motivated people to make the researchers’ jobs much easier and provide the data they need.

For me, a bigger objective of Cricket Tales is to present a more diverse and personal view of the world that surrounds us and tends to go unnoticed. Being asked to contemplate a tiny organism’s view of the world for a minute can be quite an eye opener.

A 6502 lisp compiler, sprite animation and the NES/Famicom

For our new project “What Remains”, we’re regrouping the Naked on Pluto team to build a game about climate change. In the spirit of the medium being the message, we’re interested in long term thinking as well as recycling e-waste – so in keeping with a lot of our work, we are unravelling the threads of technology. The game will run on the NES/Famicom console, originally released by Nintendo in 1983. This hardware is extremely resilient; the solid state game cartridges still work surprisingly well today, compared to fragile CD-ROMs or the world of online updates. Partly because of this, a flourishing scene of new players is now discovering them. I’m also interested that the older the machine you write software for, the more people have access to it via emulators (there are NES emulators for every mobile device, browser and operating system).

Our NES with Everdrive flashcart and comparatively tiny SD card for storing ROMs.

These ideas combine a couple of previous projects for me – Betablocker DS also uses Nintendo hardware, and although much more recent, the Nintendo DS has a similar philosophy and architecture to the NES. As with most machines of this era, NES games were mostly written in pure assembly – I had a go at this for the Speccy a while back, and while fun in a mildly perverse way, it requires so much forward planning that it doesn’t really encourage creative tweaking – or working collaboratively. In the meantime, for the weavingcodes project I’ve been dabbling with making odd Lisp compilers and found it very productive – so it makes sense to try one for a real processor this time: the 6502.

The NES console was one of the first to bring specialised processors from arcade machines into people’s homes. On older/cheaper 8 bit machines like the Speccy, you had to do everything on the single CPU, which meant most of the time was spent drawing pixels or dealing with sound. On the NES there is a “Picture Processing Unit” or PPU (a forerunner to the modern GPU), and an “Audio Processing Unit” or APU. As in modern consoles and PCs, these free the CPU up to orchestrate a game as a whole, only needing to sporadically update these co-processors when required.

You can’t write code that runs on the PPU or APU, but you can access their memory indirectly via registers and DMA. One of the nice things about writing our own compiled language is that we can build optimised calls that do specific jobs. One area I’ve been thinking about a lot is sprites – the 64 8×8 pixel tiles that the PPU draws over the background tiles to provide you with animated characters.

Our sprite testing playpen using graphics plundered from Ys II: Ancient Ys Vanished.

The sprites are controlled by 256 bytes of memory that you copy (via DMA) from the CPU to the PPU each frame. There are 4 bytes per sprite – 2 for the x/y position, 1 for the pattern id and another for colour and flipping control attributes. Most games made use of multiple sprites stuck together to get bigger characters; in the example above there are 4 sprites for each 16×16 pixel character – so it’s handy to be able to group them together.
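
In Python terms the sprite memory layout looks like this sketch (the byte order is as on the real hardware; the helper function is made up):

oam = bytearray(256)  # shadow of sprite memory at $200, DMA'd to the PPU each frame

def set_sprite(i, x, y, pattern, attributes):
    oam[i * 4 + 0] = y           # byte 0: y position
    oam[i * 4 + 1] = pattern     # byte 1: pattern (tile) id
    oam[i * 4 + 2] = attributes  # byte 2: palette and flip bits
    oam[i * 4 + 3] = x           # byte 3: x position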

Here’s an example of the compiler’s code generation, producing the 6502 assembly needed to animate 4 sprites with one command by setting all their pattern IDs in one go – this manipulates memory which is later sent to the PPU.

(define (emit-animate-sprites-2x2! x)
   (emit-expr (list-ref x 2)) ;; compiles the pattern offset expression (leaves value in register a)
   (emit "pha")               ;; push the resulting pattern offset onto the stack
   (emit-expr (list-ref x 1)) ;; compile the sprite id expression (leaves value in a again)
   (emit "asl")               ;; *=2 (shift left)      
   (emit "asl")               ;; *=4 (shift left) - sprites are 4 bytes long, so = address
   (emit "tay")               ;; store offset calculation in y
   (emit "iny")               ;; +1 to get us to the pattern id byte position of the first sprite
   (emit "pla")               ;; pop the pattern memory offset back from the stack
   (emit "sta" "$200,y")      ;; sprite data is stored in $200, so add y to it for the first sprite
   (emit "adc" "#$01")        ;; add 1 to a to point to the next pattern location
   (emit "sta" "$204,y")      ;; write this to the next sprite (+ 4 bytes)
   (emit "adc" "#$0f")        ;; add 16 to a to point to the next pattern location
   (emit "sta" "$208,y")      ;; write to sprite 2 (+ 8 bytes)
   (emit "adc" "#$01")        ;; add 1 to a to point to the final pattern location
   (emit "sta" "$20c,y")))    ;; write to sprite 4 (+ 12 bytes)

The job of this function is to return a list of assembler instructions which are later converted into machine code for the NES. It compiles sub-expressions recursively where needed and (most importantly) maintains register state, so the interleaved bits of code don't interfere with each other and crash. (I learned about this stuff from Abdulaziz Ghuloum's amazing paper on compilers.) The stack is important here: pha and pla push and pop information so we can do something completely different, then come back to where we left off and continue.

The actual command is of the form:

(animate-sprites-2x2 sprite-id pattern-offset)

Where either argument can be a sub-expression of its own, e.g.:

(animate-sprites-2x2 sprite-id (+ anim-frame base-pattern))

This code makes a couple of assumptions for optimisation. Firstly, that sprite information is stored starting at address $200 (quite common on the NES, as this is the start of user memory and maps to a specific DMA address for sending to the PPU). Secondly, that the pattern information is laid out in memory in a particular way: the 16 byte offset for the 3rd sprite is simply to make the data easy to see when using a paint package, as it means the sprites sit next to each other (along with their frames for animation) when editing the graphics:


You can find the code and documentation for this programming language on GitLab.

A tanglebots workshop report

I’ve tried a lot of different ways of teaching children programming, starting a few years ago with primary school children in a classroom, then doing INSET training days for teachers, and finally private tutoring in homes. For the finale of the weavingcodes project we are trying a new approach: teaching families about code, robotics and thread by building “tanglebots”.


The concept is to combine programming with physical objects, concentrating on sensor input and movement as output. It’s important that we incorporate our weavingcodes research process, so we deliberately set goals we don’t yet know the answers to.

The weaving focus allows us to ground the workshop in loom technology and demonstrate the challenges of manipulating thread, with its enormous history of technological development. For the first Cornwall workshop, Ellen started us off with an introduction using FoAM Kernow’s Harris loom and the fundamentals of weaving. We were also joined by Janet and Jon from lovebytes, who are helping us to run these events. When first talking about possible workshops with children, we’d discussed the impossibility of making a functional loom in a couple of hours with only broken toys and Lego – so the focus on tangling was suggested by Alex as a way to turn these difficulties to an advantage. Similarly, we created a series of prizes for different categories, such as “Most technical effort with least impressive result” – inspired by Hebocon events.



The workshop format we used is also influenced by Paul Granjon’s wrekshops – wherever possible we recycle by pulling apart e-waste, making use of the electronics, motors, gears and ideas found in the surprising complexity of what’s inside the things people throw away. This turned out to have a powerful implicit message about recycling: parents I talked to had tried taking things apart to learn about them, but the next step – making use of the parts discovered, as we were doing here – needs a bit more help.

As is normal for FoAM projects, the food was important – in this case tangled by Amber and Francesca to provide both sustenance and inspiration, with cardamom knots, spiralised courgetti and tangle fritters.


The groups ended up a bit lopsided, so in future we plan to pre-arrange them, as we did on the machine wilderness workshop. In order to do that, we need to ask participants for more information beforehand, such as family ages and backgrounds.

We tried using the small Pi touchscreens – these were a bit too fiddly to use without a mouse, but are somehow much less oppressive than larger PC monitors – and as they are so small, they became incorporated into the tanglebots themselves.

Crocodile clips were the best way to connect to random/plundered electronics as well as the Lego motors. These removed the need for soldering (which we had set up anyway, but in a separate space).

A selection of other notes we made:

  • Start with a manual tangling exercise (weaving with rope, tablets etc.)
  • Lego has a strange all-or-nothing effect: once you start using it, everything has to work that way. Avoiding it may lead to more creative options than including it.
  • A first aid kit is needed for these sorts of things.
  • The Pimoroni Explorer Hats are good but needed periodic resets in some cases – the motors seemed to get jammed; not sure if this is short circuits interrupting the I2C comms? (See the motor sketch after this list.)
  • The Raspberry Pi docs are riddled with minor errors, e.g. the Scratch GPIO section on the Explorer Hats has a lot of sometimes confusing typos.
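
For reference, driving the motors from Python looks roughly like this – assuming Pimoroni’s explorerhat library, with method names as I remember them from its docs, so worth double checking:

import time
import explorerhat  # Pimoroni's Explorer HAT python library

explorerhat.motor.one.forwards(50)  # run motor one forwards at half speed
time.sleep(2)
explorerhat.motor.one.stop()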

All our resources are being uploaded to the kairotic GitHub repository so other people can make use of the materials.

As well as being supported by AHRC Digital Transformations, this project was part of British Science Week, supported by the British Science Association.