Building a screenless programming language for the Raspberry Pi #6

Following on from the last post in this mini series, the giant 16 byte multiplexer is finished and tested – there was lots of head scratching due to shorts and wonky solder joints (it’s amazing how sensitive CMOS logic is; interference on accidentally floating pins causes strange effects), and I’ve replaced the breadboard with a new interface board between the Pi and the multiplexer.

IMG_20140914_163350

Now it’s time to think more about the ‘front end’ – here are some prototype programming blocks made from the ‘holes’ from a hole saw (I’m chopping up bits of driftwood). These include a central drill hole, which we can use for an indicator LED.

IMG_20140914_155709

This is the minimal circuitry needed for each programming block – it’s just a 4 bit identification code (reconfigurable via jumpers) and the LED, mounted around the 7 pin VGA style plug.

IMG_20140914_155742

Here it’s mounted in the block after a bit of hollowing out. To be fully kid-proof I’ll probably eventually set the internals in epoxy glue.

IMG_20140914_162353

Testing the plug/socket wiring on the breadboard. The plan is to paint the top surface of the programming block in blackboard paint so different instructions and languages can be tested and built with the same blocks. It also intriguingly makes abstraction possible, which is usually a problem with non-text based approaches, but this needs more thinking about.

IMG_20140916_080606

Here it is with another unenclosed block attached to the main board, and via the new interface to the Raspberry Pi. The interface will need a bit more circuitry to drive the LEDs, which can be lit in different ways depending on the needs of the language – perhaps sequentially in some cases, more dependent on logic in others. Also, different shaped blocks or different coloured LEDs may have different meanings in a language too.

Hindi translations in Symbai

A couple of screenshots of the Hindi version of Symbai – our solar powered Raspberry Pi/Android anthropological research tool. As usual we’re still having a few issues with the Unicode but it’s nearly there. We’ve been working on this software for the last few months, making sure the data (including photos and audio recordings of verbal agreements) synchronise properly across multiple devices.

Screenshot_2014-09-15-10-45-14

Screenshot_2014-09-15-10-45-45

Screenless programming language #5

An update after the last post: after many hours of soldering, 75% of the board is complete! The Pi can now address a whopping 12 bytes of data.

IMG_20140912_085034

After thinking about it for a while, and checking with the languages I’m planning to use – I’ve decided to double the number of instruction ‘slots’ by halving the bits to 4. This doesn’t actually change the core hardware at all – I’ll just scan two at a time and split them in software (part of this project has been to see how much easier it is to change software than hardware). Some of the instructions can span two addresses if needed, but this makes more efficient use of the resources. The ports on the left of the board are where the plugs for the programming ‘interface’ will come in – each one will now split to two locations.
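
To make that split concrete, here’s a minimal Python sketch of the idea (not the project’s actual code – read_byte is an assumed function that returns the 8 data bits scanned from one hardware address):

def read_slots(read_byte, num_addresses=16):
    """Return a flat list of 4 bit instruction slots, two per hardware address."""
    slots = []
    for addr in range(num_addresses):
        byte = read_byte(addr)            # one 8 bit scan of the hardware
        slots.append((byte >> 4) & 0x0f)  # high nibble -> first slot
        slots.append(byte & 0x0f)         # low nibble -> second slot
    return slots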

Here’s my test GPIO software running on the Pi – constant testing has been the only way to stay sane on this project. Eventually I could replace the Pi with a microcontroller for some applications, but the Pi has been a great way to ease the prototyping process.

IMG_20140910_092642

Screen-less programming language (#3)

IMG_20140903_131424

Since the last update I’ve connected the Raspberry Pi to the board, and after a bit of debugging I have a python script that polls bytes (more accurately 4 bit ‘nibbles’) via the GPIO ports from 16 addresses. I didn’t blow up the Pi, although I did cause hard resets a couple of times with wandering connections. Now comes the mega wiring phase.
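
The polling itself is simple – here’s a rough sketch of that kind of loop using the RPi.GPIO library, with placeholder pin numbers rather than the ones actually wired to the board:

import time
import RPi.GPIO as GPIO

ADDR_PINS = [2, 3, 4, 17]    # 4 address select lines -> 16 locations (example pins)
DATA_PINS = [27, 22, 10, 9]  # 4 data lines, one nibble per location (example pins)

GPIO.setmode(GPIO.BCM)
for pin in ADDR_PINS:
    GPIO.setup(pin, GPIO.OUT)
for pin in DATA_PINS:
    GPIO.setup(pin, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)

def read_nibble(address):
    """Set the address lines, let the multiplexers settle, then read 4 data bits."""
    for bit, pin in enumerate(ADDR_PINS):
        GPIO.output(pin, (address >> bit) & 1)
    time.sleep(0.001)
    value = 0
    for bit, pin in enumerate(DATA_PINS):
        value |= GPIO.input(pin) << bit
    return value

while True:
    print([read_nibble(addr) for addr in range(16)])
    time.sleep(0.5)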

Robot nightjar eggshibition at the Poly, Falmouth

As part of this year’s Fascinate festival we took over the bar at Falmouth’s Poly with visualisations of the camouflage pattern evolution process from the egglab game.

IMG_20140828_103921

tree-125-fs-part

This was a chance to do some detective work on the massive amount of genetic programming data we’ve amassed over the last few months, figure out ways to visualise it and create large prints of the egg pattern generation process. I selected family trees of eggs where mutations caused new features that made them difficult for people to spot, and thus resulted in large numbers of descendants. Then I printed examples of the eggs at different stages to see how they progressed through the generations.

IMG_20140828_104022

We also ran the egglab game in the gallery on a touch screen, which happened to coincide with some great coverage in the Guardian and Popular Science – but the game kept running (most of the time) despite the extra attention.

IMG_20140829_151430

IMG_20140829_152451

IMG_20140830_112644

The Poly (or Royal Cornwall Polytechnic Society) was really the perfect place for this exhibition, with its 175 year history of promoting scientists, engineers and artists and encouraging innovation by getting them together in different ways. Today this seems very modern (and would be given one of our grand titles like ‘cross-disciplinary’), but it’s quite something to see that in a lot of ways the separation between these areas is bigger now than it has ever been, which makes bringing them together all the more urgent. The Poly has some good claims to fame, being the first place Alfred Nobel demonstrated nitro-glycerine, in 1865! Here are some pages from the 1914 report, which give a feel for what was going on a century ago amongst other radical world changes:

IMG_20140830_105017

IMG_20140830_104857

Some tangible programming language research (#2)

Following on from yesterday’s post – here’s a random selection of some (publicly accessible) tangible programming research.

A braille tangible programming instruction

The Algobrick language in use

Thinking outside of the screen (#1)

I’m starting a new exploratory project to build a screen-less programming language based on two needs:

  • A difficulty with teaching kids programming in my CodeClub where they become lost ‘in the screen’. It’s a challenge (for any of us really but for children particularly) to disengage and think differently – e.g. to draw a diagram to work something out or work as part of a team.
  • A problem with performing livecoding where a screen represents a spectacle, or even worse – a ‘school blackboard’ that as an audience we expect ourselves to have to understand.

I’ve mentioned this recently to a few people and it seems to resonate, particularly in regard to a certain mismatch of children’s ability to manipulate physical objects against their fluid touchscreen usage. So, with my mind on the ‘pictures under glass’ rant and taking betablocker as a starting point (and weaving code as one additional project this might link with), I’m building some prototype hardware to provide the Raspberry Pi with a kind of external physical memory that could comprise symbols made from carved wood or 3D printed shapes – while still describing the behaviour of real software. I also want to avoid computer vision for a more understandable ‘pluggable’ approach with less slightly faulty ‘magic’ going on.

Before getting too theoretical I wanted to build some stuff – a flexible prototype for figuring out what this sort of programming could be. The Raspberry Pi has 17 configurable I/O pins on its GPIO interface, so I can use 5 of them as an address lookup (for 32 memory locations to start with, expandable later) and 8 bits as input for code or data values at these locations.

The smart thing would be to use objects that identify themselves with a signal, using serial communication down a single wire with a standard protocol. The problem with this is that it would make potential ‘symbol objects’ themselves fairly complicated and costly – and I’d like to make it easy and cheap to make loads of them. For this reason I’m starting with a parallel approach where I can just solder across pins on a plug to form a simple 8 bit ID, and restrict the complexity to the reading hardware.

Testing the 74HC4067 16-channel analog multiplexer/demultiplexer

I’ve got hold of a bunch of 74HC4067 multiplexers which allow you to select one signal from 16 inputs (or the other way around) using 4 bits – stacking them up, one per data bit, gives 8 bits across 16 memory locations. This was the furthest I could go without surface mount ICs (which are well beyond my wonky soldering abilities).

Building the board (with narrow 24 pin IC holders hacked by slicing them down the middle). The input comes in via a common bus down the centre of the board.

Solder practice

Testing the first 4 bits on the breadboard

Now that 4 bits are working it’s getting harder to test with an LED – so next up is getting the Raspberry Pi attached.

Bumper Crop released

A release of Bumper Crop is now up on the Play Store with the source code here. As I reported earlier, this has been about converting a board game designed by farmers in rural India into a software version – partly to make it more easily accessible and partly to explore the possibilities and restrictions of the two mediums. It’s still pretty much a beta really, as some of the cards behave differently to the board game version and a few are not yet implemented – we need to work on that, but it is playable now, with 4 players at the same time.

Screenshot_2014-08-21-23-45-51

The 3D and animation are done using the fluxus engine on Android, and the game is written in tinyscheme. Here’s a snippet of the code for one of the board locations; I’ve been experimenting with a very declarative style lately:

;; description of location that allows you to fertilise your crops
;; the player has a choice of wheat/onion or potatoes
(place 26 'fertilise '(wheat onion potato)
  ;; this function takes a player and a
  ;; selected choice and returns a new player
  (lambda (player choice)
    (if (player-has-item? player 'ox) ;; do we have an ox?
        ;; if so, complete a free fertilise task if needed
        (if (player-check-crop-task player choice 'fertilise 0)
            (player-update-crop-task player choice 'fertilise)
            player)
        ;; otherwise it costs 100 Rs
        (if (player-check-crop-task player choice 'fertilise 100)
            (player-update-crop-task
              (player-add-money player -100) ;; remove the money first
              choice 'fertilise)
            player)))
  (place-interface-crop)) ;; helper to make the interface

Testing the board game, which you can download on this page:

boardgame

The game on tablet:

Screenshot_2014-08-21-23-30-41

Screenshot_2014-08-21-23-40-04

This is the game running on a phone:

Screenshot_2014-08-22-10-25-54

Neural Network livecoding and retrofitting ZX Spectrum hardware

An experimental, and quite angry neural network livecoding synth (with an audio ‘weave’ visualisation) for the ZX Spectrum: source code and TZX file (for emulators). It’s a bit hard to make out in the video, but you can move around the 48 neurons and modify their synapses and trigger levels. There are two clock inputs and the audio output is the purple neuron at the bottom left. It allows recurrent loops as a form of memory, and some quite strange things are possible. The keys are:

  • w,d: move diagonally north west <-> south east
  • s,e: move diagonally south west <-> north east
  • t,y,g,h: toggle incoming synapse connections for the current neuron
  • space: change the ‘threshold’ of the current neuron (bit shifts left)
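
Purely to illustrate the structure (the real synth is Z80 code on the Spectrum), here’s a rough Python sketch of this kind of threshold-and-fire network – the accumulation, reset and wiring details are all assumptions:

import random

NUM_NEURONS = 48

class Neuron:
    def __init__(self):
        self.threshold = 8   # the real synth bit shifts this with 'space'
        self.charge = 0
        self.inputs = []     # indices of incoming synapse connections
        self.fired = False

def step(neurons, clock_a, clock_b):
    """One tick: sum incoming spikes, fire when over threshold, reset on firing."""
    fired_last = [n.fired for n in neurons]
    for i, n in enumerate(neurons):
        n.charge += sum(1 for src in n.inputs if fired_last[src])
        if i == 0: n.charge += clock_a   # feed the two clocks into the
        if i == 1: n.charge += clock_b   # first two neurons (assumed wiring)
        n.fired = n.charge >= n.threshold
        if n.fired:
            n.charge = 0

# random recurrent wiring, with the 'audio' read from the last neuron (assumed)
neurons = [Neuron() for _ in range(NUM_NEURONS)]
for n in neurons:
    n.inputs = random.sample(range(NUM_NEURONS), 4)

for tick in range(32):
    step(neurons, clock_a=tick % 2, clock_b=1 if tick % 3 == 0 else 0)
    print(1 if neurons[-1].fired else 0, end='')
print()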

This audio should load up on a real ZX Spectrum:

One of the nice things about tech like this is that it’s easily hackable – this is a modification to the video output, better explained here: you can get a standard analogue video signal by connecting the internal feed directly to the plug and detaching the TV signal modulator with a tiny bit of soldering. Look at all those discrete components!

IMG_20140809_125815