Some tangible programming language research (#2)

Following on from yesterday’s post – here’s a random selection of some (publicly accessible) tangible programming research.

A braille tangible programming instruction

The Algobrick language in use

Thinking outside of the screen (#1)

I’m starting a new exploratory project to build a screen-less programming language based on two needs:

  • A difficulty with teaching kids programming in my CodeClub where they become lost ‘in the screen’. It’s a challenge (for any of us really but for children particularly) to disengage and think differently – e.g. to draw a diagram to work something out or work as part of a team.
  • A problem with performing livecoding where a screen represents a spectacle, or even worse – a ‘school blackboard’ which we as an audience expect to have to understand.

I’ve mentioned this recently to a few people and it seems to resonate, particularly in regard to a certain mismatch between children’s ability to manipulate physical objects and their fluid touchscreen usage. So, with my mind on the ‘pictures under glass’ rant and taking betablocker as a starting point (and weaving code as an additional project this might link with), I’m building some prototype hardware to provide the Raspberry Pi with a kind of external physical memory, which could comprise symbols made from carved wood or 3D printed shapes – while still describing the behaviour of real software. I also want to avoid computer vision in favour of a more understandable ‘pluggable’ approach, with less slightly faulty ‘magic’ going on.

Before getting too theoretical I wanted to build some stuff – a flexible prototype for figuring out what this sort of programming could be. The Raspberry Pi has 17 configurable I/O pins on its GPIO interface, so I can use 5 of them as an address lookup (for 32 memory locations to start with, expandable later) and 8 bits as input for code or data values at these locations.
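As a rough sketch of what the read path might look like from the Pi side – gpio-set! and gpio-get are hypothetical stand-ins for whatever GPIO binding ends up being used, and the pin assignments are made up:

;; hypothetical sketch: read one 8-bit value from the external
;; physical memory. the pin numbers are made up, and gpio-set! and
;; gpio-get stand in for a real GPIO binding
(define address-pins '(2 3 4 17 27)) ;; 5 bits -> 32 locations
(define data-pins '(5 6 12 13 16 19 20 21)) ;; the 8 bit value

(define (read-location addr)
  ;; drive the address lines with the bits of addr
  (for-each
   (lambda (pin bit)
     (gpio-set! pin (if (odd? (quotient addr (expt 2 bit))) 1 0)))
   address-pins '(0 1 2 3 4))
  ;; then read the data lines back into a byte
  (let loop ((pins data-pins) (bit 0) (value 0))
    (if (null? pins) value
        (loop (cdr pins) (+ bit 1)
              (+ value (* (gpio-get (car pins)) (expt 2 bit)))))))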

The smart thing would be to use objects that identify themselves with a signal, using serial communication down a single wire with a standard protocol. The problem with this is that it would make the potential ‘symbol objects’ themselves fairly complicated and costly – and I’d like to make it easy and cheap to make loads of them. For this reason I’m starting with a parallel approach, where I can just solder across pins on a plug to form a simple 8-bit ID and restrict the complexity to the reading hardware.

Testing the 74HC4067 16-channel analog multiplexer/demultiplexer

I’ve got hold of a bunch of 74HC4067 multiplexers, which allow you to select one signal from 16 inputs (or the other way around) using 4 bits – stacking them up gives one for each of the 8 bits across 16 memory locations. This was the furthest I could go without surface mount ICs (well out of reach of my wonky soldering abilities).
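In this arrangement the low four address bits double as the shared select lines, picking the same channel on every multiplexer at once – something like this, reusing the hypothetical gpio-set! from the sketch above:

;; the four shared s0-s3 select lines are wired to the low four
;; address pins from the earlier sketch (pin numbers still made up);
;; the fifth address bit is kept back for selecting banks later
(define select-pins '(2 3 4 17))

(define (select-channel location) ;; location is 0-15 for now
  (for-each
   (lambda (pin bit)
     (gpio-set! pin (if (odd? (quotient location (expt 2 bit))) 1 0)))
   select-pins '(0 1 2 3)))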

Building the board (with narrow 24 pin IC holders hacked by slicing them down the middle). The input comes in via a common bus down the centre of the board.

Solder practice

Testing the first 4 bits on the breadboard

Now that 4 bits are working it’s getting harder to test with an LED – so next up is attaching the Raspberry Pi.

Bumper Crop released

A release of Bumper Crop is now up on the play store, with the source code here. As I reported earlier, this has been about converting a board game designed by farmers in rural India into a software version – partly to make it more easily accessible and partly to explore the possibilities and restrictions of the two mediums. It’s still pretty much a beta, as some of the cards behave differently to the board game version and a few are not yet implemented – we need to work on that, but it is playable now, with 4 players at the same time.

Screenshot_2014-08-21-23-45-51

The 3D and animation is done using the fluxus engine on android, and the game is written in tinyscheme. Here’s a snippet of the code for one of the board locations – I’ve been experimenting with a very declarative style lately:

;; description of a location that allows you to fertilise your crops -
;; the player has a choice of wheat/onion or potatoes
(place 26 'fertilise '(wheat onion potato)
  ;; this function takes a player and a
  ;; selected choice and returns a new player
  (lambda (player choice)
    (if (player-has-item? player 'ox) ;; do we have an ox?
      ;; if so, complete a free fertilise task if needed
      (if (player-check-crop-task player choice 'fertilise 0)
        (player-update-crop-task player choice 'fertilise)
        player)
      ;; otherwise it costs 100 Rs
      (if (player-check-crop-task player choice 'fertilise 100)
        (player-update-crop-task
          (player-add-money player -100) ;; remove money
          choice 'fertilise)
        player)))
  (place-interface-crop)) ;; helper to make the interface

Testing the board game, which you can download on this page:

boardgame

The game on tablet:

Screenshot_2014-08-21-23-30-41

Screenshot_2014-08-21-23-40-04

This is the game running on a phone:

Screenshot_2014-08-22-10-25-54

Neural Network livecoding and retrofitting ZX Spectrum hardware

An experimental (and quite angry) neural network livecoding synth, with an audio ‘weave’ visualisation, for the ZX Spectrum: source code and TZX file (for emulators). It’s a bit hard to make out in the video, but you can move around the 48 neurons and modify their synapses and trigger levels. There are two clock inputs, and the audio output is the purple neuron at the bottom left. It allows recurrent loops as a form of memory, and some quite strange things are possible (there’s a sketch of the idea after the key list below). The keys are:

  • w,d: move diagonally north west <-> south east
  • s,e: move diagonally south west <-> north east
  • t,y,g,h: toggle incoming synapse connections for the current neuron
  • space: change the ‘threshold’ of the current neuron (bit shifts left)
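That sketch – a rough illustration in Scheme (not the actual Z80 implementation) of the kind of neuron involved: each one sums its enabled incoming connections and fires once the total reaches its threshold. The function and argument names are made up:

;; rough sketch of a threshold neuron: inputs is a list of 0/1
;; outputs from the connected neurons, synapses is a list of 0/1
;; flags for which connections are toggled on - the neuron fires
;; (returns 1) when enough active, enabled inputs arrive at once
(define (update-neuron inputs synapses threshold)
  (if (>= (apply + (map * inputs synapses)) threshold) 1 0))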

This audio should load up on a real ZX Spectrum.

One of the nice things about tech like this is that it’s easily hackable – this is a modification to the video output (better explained here): you can get a standard analogue video signal by connecting the internal feed directly to the plug and detaching the TV signal demodulator with a tiny bit of soldering. Look at all those discrete components!

IMG_20140809_125815

dBsCode summer school

At the end of July I helped out with the dBsCode summer school. The idea of this two-week course was to encourage algorithmic literacy with a focus on employment – agile methods and test driven development (TDD) – aimed at people about to enter, or re-enter, employment rather than the teenagers we focused on at Easter. We were interested in teaching the culture the participants will encounter in modern software development, and this was driven by Cornish embedded technology company Bluefruit and Jon Jagger – consultant and creator of Cyber-Dojo. We had 9 participants from a mix of backgrounds, some recently graduated students and some experienced programmers wanting to catch up with software engineering practice.

IMG_20140725_134656

We set up teams and provided a tricky example project using a Raspberry Pi and an accelerometer sensor, with the aim of developing a prototype to capture the movement of fishing casts, along the lines of systems already used for sports such as tennis or golf. The great thing about this problem is that it spans the entire range of software, from bit shifting and binary operations to extract sensor data from a device using the i2c protocol, all the way up to graphing in php/javascript, with all the storage, processing and networking in between. We tried hard to set the scene and atmosphere like a software company, and the feedback from the recently graduated students was (rather worryingly) that this was a totally different approach to that currently taught in colleges and universities.
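As a taste of the low end of that range: reading an acceleration value over i2c typically means fetching two 8-bit registers and recombining them into one signed 16-bit number. A minimal sketch – i2c-read-byte is a hypothetical stand-in for whatever i2c library is used:

;; sketch: recombine the high and low bytes of an accelerometer
;; axis reading into a signed 16-bit value. i2c-read-byte is a
;; hypothetical stand-in for the real i2c call
(define (read-axis device high-reg low-reg)
  (let ((v (+ (* (i2c-read-byte device high-reg) 256)
              (i2c-read-byte device low-reg))))
    ;; two's complement: values over 32767 are negative
    (if (> v 32767) (- v 65536) v)))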

IMG_20140725_135224

We mixed this group challenge with Cyber-Dojo, which meant we could do little 45-minute programming exercises each day. My observations, based on sporadic visits throughout the two weeks, were that one of the biggest surprises – particularly at the start – was that improvisation and experimentation (rather than already ‘having all the answers’) are a key part of professional practice, not something to avoid or feel embarrassed about. The focus on TDD helped very much with this, as did doing a project that we as teachers hadn’t tried before – this I feel is key to providing learning about how to learn, rather than an overly didactic (and not terribly realistic) experience.

IMG_20140725_135247

IMG_20140725_135913

IMG_20140725_135958

Computation is weaving

With my mind on our upcoming AHRC weave/code project (and seeing as Alex has already started writing code) I thought I’d have a go at visualising how computers work in relation to pattern manipulation. These screenshots are from a ZX Spectrum, where I’ve modified some library assembler code for higher level arithmetic to display the contents of seven Z80 CPU registers graphically between each instruction – time runs from top to bottom.

Most processors don’t actually have circuits for mathematics – they simply implement ‘add’ along with bitwise instructions for ‘and’, ‘or’, ‘not’ and ‘xor’, and a handful of instructions for shifting bits left and right. This is true even of modern CPUs, where the arithmetic instructions for multiply, divide etc. are built with hidden ‘microcode’ routines. For this reason the underlying operation of a computer has more to do with patterns than it does with concepts such as language, or even numbers as we normally think of them.
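To make that concrete, here’s a sketch of how multiplication can be built from nothing but add, shift and a low-bit test – roughly the technique the Z80 routines below use (assuming a Scheme with arithmetic-shift, e.g. SRFI 60 or Racket; the Z80 version does the same thing in registers):

;; shift-and-add multiplication built only from add, shift and a
;; low-bit test: for each set bit in b, add a correspondingly
;; shifted copy of a to the result
(define (shift-add-multiply a b)
  (let loop ((a a) (b b) (result 0))
    (if (zero? b)
        result
        (loop (arithmetic-shift a 1)  ;; a <<= 1
              (arithmetic-shift b -1) ;; b >>= 1
              (if (odd? b) (+ result a) result)))))

;; (shift-add-multiply 170 3) => 510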

The simplest (and shortest) is multiply in 8 bits. In this function the ‘a’ register contains one number and the ‘h’ register the other – at the end the ‘a’ register contains the result. In the first screenshot the numbers are fairly simple, so it’s possible to see what’s going on (i.e. in 1*1 the ‘a’ and ‘h’ registers both contain 00000001):
mul8

170 in 8 bits looks like ‘10101010’, so it’s easy to see – here are some different ways of reaching the same answer:
mul8-170

The 16-bit multiply operates over two registers – the first value is stored in ‘h’ and ‘l’, and the other is on the stack, but is loaded into ‘d’ and ‘e’ after a few instructions:
mul16

43690 is ‘1010101010101010’, so in the first run here we multiply it by one as a test:
mul16-2

Some 16-bit divides – these take longer to calculate, so it’s a whole page for all the instructions involved, and I have no idea how this works:
div16
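For the curious, the classic way to divide with only shifts, compares and subtracts is restoring long division – a sketch of the idea (not the actual Z80 routine in the screenshots): shift the remainder left one bit at a time, and subtract the divisor whenever it fits.

;; restoring binary long division: walk the dividend's bits from the
;; top, shifting them into a running remainder and subtracting the
;; divisor (recording a 1 in the quotient) whenever it fits
(define (div16 n d)
  (let loop ((bit 15) (q 0) (r 0))
    (if (< bit 0)
        q
        (let ((r2 (+ (* r 2) (if (odd? (quotient n (expt 2 bit))) 1 0))))
          (if (>= r2 d)
              (loop (- bit 1) (+ (* q 2) 1) (- r2 d))
              (loop (- bit 1) (* q 2) r2))))))

;; (div16 510 3) => 170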

65535 is the largest value we can store; divide it by itself to end up with 1:
div16-2

div16-3

The code for all this is here.

Mongoose 2000: Group composition

I’ve recently been building the Mongoose 2000 “group composition” tool that the researchers will use for recording information about a whole pack of mongooses (and synchronising data via a Raspberry Pi providing a local wifi node) at their field site in Uganda. As I wrote a bit about before, one of the interesting things about this project is that the interface design has to focus on long-term speed and flexibility over immediate ease of use – in this way it seems appropriate that it’s moving in the direction of a musical interface rather than a normal touchscreen interface. The different colours in the mongoose selectors show which individuals are present and which already have data recorded; the screenshot below is the section where they record relationships between the adult females (at the top) and the adult males that may be guarding – or pestering – them (below). At the same time, they need to be able to record events that may be occurring with the pack as a whole – in this case an interaction with another pack of mongooses.

Screenshot_2014-06-26-11-34-40

Evolving butterflies game released!

toxic

The Heliconius Butterfly Wing Pattern Evolver game is finished and ready for its debut as part of the Butterfly Evolution Exhibit at the Royal Society Summer Exhibition 2014. Read more about the scientific context on the researcher’s website, and click the image above to play the game.

The source code is here – it’s the first time I’ve used WebGL for a game, and it’s using the browser version of fluxus. It worked out pretty well, even to the extent that the researchers could edit the code themselves to add new explanation screens for the genetics. Like any production code it has niggles – here’s the function to render a butterfly:

(define (render-butterfly s)
  (with-state
   ;; set tex based on index
   (texture (list-ref test-tex (butterfly-texture s)))  
   ;; move to location
   (translate (butterfly-pos s))                        
   ;; point towards direction
   (maim (vnormalise (butterfly-dir s)) (vector 0 0 1)) 
   (rotate (vector 0 90 90))      ;; angle correctly
   (scale (vector 0.5 0.5 0.5))   ;; make smaller
   (draw-obj 4)                   ;; draw the body
   (with-state          ;; draw the wings in a new state
    (rotate (vector 180 0 0))                         
    (translate (vector 0 0 -0.5))  ;; position and angle right
    ;; calculate the wing angle based on speed
    (let ((a (- 90 (* (butterfly-flap-amount s)         
                      (+ 1 (sin (* (butterfly-speed s)  
                                   (+ (butterfly-fuzz s) 
                                      (time)))))))))
      (with-state
       (rotate (vector 0 0 a))
       (draw-obj 3))              ;; draw left wing
      (with-state
       (scale (vector 1 -1 1))    ;; flip
       (rotate (vector 0 0 a))
       (draw-obj 3))))))          ;; draw right wing

There is only immediate-mode rendering at the moment, so the transforms are not optimised, and little things need fixing – for example draw-obj takes the id of a preloaded chunk of geometry rather than specifying it by name. However it works well, and the thing that was most successful was welding together the Nightjar Game Engine (HTML5 canvas) with fluxus (WebGL) and using them together. This works by having two canvas elements drawn over each other – all the 2D (text, effects and graphs) is drawn using canvas, and the butterflies are drawn in 3D with WebGL. The render loops run simultaneously, with some extra commands to get the canvas pixel coordinates of objects drawn in 3D space.
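Those ‘extra commands’ amount to the standard projection maths: run a world-space point through the modelview-projection transform, do the perspective divide, and map the result into pixel space. A hypothetical sketch – m*, vec-x, vec-y and vec-w are made-up names, not the actual fluxus API:

;; hypothetical sketch: map a 3D world position to 2D canvas pixels.
;; m* transforms a vector by the modelview-projection matrix;
;; vec-x/vec-y/vec-w are made-up component accessors
(define (world->canvas pos mvp canvas-width canvas-height)
  (let* ((clip (m* mvp pos)) ;; into clip space
         ;; perspective divide -> normalised device coords (-1..1)
         (nx (/ (vec-x clip) (vec-w clip)))
         (ny (/ (vec-y clip) (vec-w clip))))
    ;; NDC to pixels: y flips because canvas y runs downwards
    (list (* (/ (+ nx 1) 2) canvas-width)
          (* (/ (- 1 ny) 2) canvas-height))))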