All posts by dave

Dyadic device: a 4 shaft loom simulation.

On the train back from the Sheffield codingweaves workshops in October I wrote a quick browser program to try to further understand the relationship between structure and pattern in weaving – which I’ve put online here. It works in the inverse of how we’ve been writing weaving simulation programs so far: instead of defining the pattern you want directly, you describe the set-up of a 4 shaft loom – the warp threads each of the 4 shafts picks up (in the top row of toggle boxes), and then which shafts are raised for each weft thread as the fabric is woven (on the right).

This involved writing a program based closely on how a loom functions – for example calculating a shed (the gap between the ordered warp threads) by folding over each shaft in turn and or-ing each warp thread to work out which ones are picked up. This really turns out to be the core of the algorithm – here’s a snippet:

;; 'or's two lists together:
;; (list-or (list 0 1 1 0) (list 0 0 1 1)) => (list 0 1 1 1)
(define (list-or a b)
  (map2
   (lambda (a b)
     (if (or (not (zero? a)) (not (zero? b))) 1 0))
   a b))

;; calculate the shed, given a lift plan position counter
;; shed is 0/1 for each warp thread: up/down
(define (loom-shed l lift-counter)
  (foldl
   (lambda (a b)
     (list-or a b))
   (build-list (length (car (loom-heddles l))) (lambda (a) 0))
   (loom-heddles-raised l lift-counter)))
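
As an aside, here’s a rough Python translation of the same idea (my own sketch, not part of the original program) – the heddles are represented implicitly by a threading list, and repeated shed calculations over a lift plan build up the whole drawdown:

# threading[i] is the shaft that warp thread i is threaded on; each
# lift plan entry is the set of shafts raised for one weft pick
def shed(threading, lifted_shafts):
    # 1 where the warp thread is picked up by a raised shaft, 0 otherwise
    return [1 if shaft in lifted_shafts else 0 for shaft in threading]

def drawdown(threading, liftplan):
    # one shed per weft pick gives the whole weave structure
    return [shed(threading, lifted) for lifted in liftplan]

# example: 2x2 twill on 4 shafts with 8 warp threads
threading = [0, 1, 2, 3] * 2
liftplan = [{0, 1}, {1, 2}, {2, 3}, {3, 0}]
for row in drawdown(threading, liftplan):
    print("".join("#" if up else "." for up in row))

The foldl/list-or in the Scheme version is doing the same job as the membership test here – or-ing together the warp threads picked up by each raised shaft.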

I’ve become quite obsessed with this program, spending a lot of time with it trying to understand how the loom setup corresponds to the patterns. Here are some example weaves that you can try. Colour-wise, the order is fixed in all these examples – both the warp and the weft alternate light/dark yarns.

tabby

This is tabby or plain weave – the simplest and strongest weave (used for sails and hard wearing fabric). The striped pattern is a result of this alternating colour order.

basket

Basket weave – doubling the plain weave results in a zigzag pattern.

twill

This is called 2×2 twill; the structure provides a stretchy fabric, often used for clothes. Notice that the pattern is the same as the basket weave even though the structure is different – a hint at how structure and pattern are linked in strange ways (it gets much more complex than this, of course).

boxy

I’ve become very interested in this threading pattern for the shafts, as it gives rise to lots of interesting patterns. This is an example of connected boxes.

meander

A slight shift and we can obtain meanders, an important motif of the kairotic project. It turns out this is a highly unstable structure due to the length of the ‘floats’ – the threads spending too long without being incorporated back into the weave. More on that later on.

freestyle

Here’s a more freeform weave where I switch patterns between different types by changing the lift order. Much like music, it’s possible to switch patterns in a nice way that doesn’t interrupt the flow.

Next up is building a real loom to try these patterns out in thread form.

Some algorithms for Wild Cricket Tales

On the Wild Cricket Tales citizen science game, one of the tricky problems is grading player-created data in terms of quality. The idea is to get people to help the research by tagging videos to measure the behaviour of the insect beasts – but we need to accept that there will be a lot of ‘noise’ in the data. How can we detect this and filter it away? It would also be great if we could detect and acknowledge players who are successful at hunting out and spotting interesting things, or people who are searching through lots of videos. As we found making the camouflage citizen science games, you don’t need much to grab people’s attention if the subject matter is interesting (which is very much the case with this project), but a high score table seems to help. We can also have one per cricket or burrow so that players can more easily see their progress – the single egglab high score table got very difficult to feature on after a few thousand players or so.

We have two separate but related problems – acknowledging players and filtering the data – so it probably makes sense if they can be linked. A commonly used method, which we used with egglab too (and which Google’s reCAPTCHA uses to crowdsource text digitisation as a side effect), is to compare multiple people’s results on the same video. But then we still need to bootstrap the scoring from something, and make sure we acknowledge people who are watching videos no one has seen yet, as this is also important.

Below is a simple naive scoring system that calculates a score purely from the quantity of events found on a video – we want to give points for finding some events, but over some limit we don’t want to reward endless clicking. It’s probably better if the score stops at zero rather than going negative as shown here, as games should never really ‘punish’ people like this!

quantity-score
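
As a minimal sketch of this (the limit is an arbitrary value for illustration), the clamped version might look like:

def quantity_score(num_events, limit=10):
    # one point per event found up to the limit, then lose a point
    # for each extra click - clamped so the score never goes negative
    return max(0, min(num_events, limit) - max(0, num_events - limit))

print(quantity_score(5), quantity_score(10), quantity_score(25))  # 5 10 0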

Once we have a bit more data we can start to cluster events to detect if people are agreeing. This can give us some indication of the confidence of the data for a whole video, or a section of it – and it can also be used to figure out a likelihood of an individual event being valid using the sum of neighbouring events weighted by distance via a simple drop-off function.

quality-score
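
A hedged sketch of that idea – scoring a single event by summing other players’ neighbouring events, weighted by their distance in time with a simple exponential drop-off (the falloff constant is a guess):

import math

def event_confidence(event_time, other_events, falloff=2.0):
    # nearby agreement counts for a lot, distant events for very little
    return sum(math.exp(-abs(event_time - t) / falloff)
               for t in other_events)

# events from other players at 9.8s and 10.3s support one at 10.0s
print(event_confidence(10.0, [9.8, 10.3, 55.0]))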

If we do this for all of a player’s events over a single video, we get an indication of how consistent they are with other players. We could also recursively weight this by a player’s historical scores – so ‘trusted’ players could validate new ones. This is probably a step too far at this point, but it might be an option if we pre-stock some videos with data from the researchers, who are trained in what is important to record.
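
If we did take it that far, one entirely speculative approach would be a running trust weight per player, starting from a neutral prior and converging on their mean agreement with others:

def player_trust(agreement_scores, prior=0.5):
    # agreement_scores are per-event consistency values in [0, 1];
    # new players sit at the prior until they build up a history
    return (prior + sum(agreement_scores)) / (1 + len(agreement_scores))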

Connecting Flask to Apache

When exploring the cosmos of python powered web frameworks it’s easy to get lost due to fast moving versions. Recently we had a lot of difficulty getting Flask connected via a WSGI script on Apache. This seems to be due to needing the object factory to be called explicitly rather than relying on the module namespace to do the work for us – there doesn’t seem to be much help online about this yet. We’re also running python in a virtualenv, so the working wsgi script looks like this:

#!/usr/bin/python                                                               
import sys
import logging
import os

logging.basicConfig(stream=sys.stderr)
PROJECT_DIR = "/var/www/yourflasksite/"
activate_this = os.path.join(PROJECT_DIR, 'bin', 'activate_this.py')
execfile(activate_this, dict(__file__=activate_this))

sys.path.append(PROJECT_DIR)
from app import create_app
application=create_app(os.getenv('FLASK_CONFIG') or 'default')
application.secret_key = "your site's key"
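
For reference, create_app here is the usual Flask application factory, imported from the app package – a minimal sketch (not the actual site code) of what lives in app/__init__.py:

from flask import Flask

def create_app(config_name):
    # build and configure the application for the named environment
    app = Flask(__name__)
    # loading of the 'default'/'production' etc. config would go here
    return app

The point is that the WSGI script has to call this factory itself and bind the result to the name ‘application’, which is what mod_wsgi looks for.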

For completeness, here is the apache config that starts it up:

<VirtualHost *:80>
    ServerName your.url
    WSGIScriptAlias / /var/www/yourflasksite/yoursite.wsgi
    <Directory /var/www/yourflasksite/app>
	Order allow,deny
        Allow from all
    </Directory>
    Alias /static /var/www/yourflasksite/app/static
    <Directory /var/www/yourflasksite/app/static/>
        Order allow,deny
        Allow from all
    </Directory>
    ErrorLog ${APACHE_LOG_DIR}/yourflasksite-error.log
    LogLevel warn
    CustomLog ${APACHE_LOG_DIR}/yourflasksite-access.log combined
</VirtualHost>

Visual programming for environmental research with UAVs

I’ve recently begun a new project with Karen Anderson who runs the UAV research group at the Exeter University Environment and Sustainability Institute. We’re looking at using commodity technology like android phones for environmental research with drones. Ecology research groups and environmental agencies have started using drones as a replacement for expensive and risky light aircraft for gathering data on changes to landscapes due to climate change and erosion. How can we make tools that are simpler and cheaper for them to set up and use? Can our software also be relevant for children using kites in cities for making their own maps, or farmers wishing to record changes to their own fields themselves?

Enumerating and displaying all the sensors on a phone

This is a more open ended project than our previous environmental and behavioural projects, so we’re able to approach this with an R&D perspective in relation to the technology. One of the patterns I’ve noticed with this kind of work is that after providing scientists with something that meets their immediate needs, it inspires a ton of new ideas and directions – and I become a bottleneck. Ideally I need to provide something that allows them to build things themselves once they have an understanding of all the possibilities. Adapting to needs ‘in the field’ is also an important aspect of the kind of work they do – which can take them to remote locations anywhere in the world.

Some time ago I had a go at porting my musical livecoding language scheme bricks to android for the open sauces project. I’m now applying it as a way of configuring sensor data acquisition and recording by drag/dropping a visual programming language. It’s early days yet – I’m still debugging the (actually rather amazing) android drag/drop API. Here are some initial screenshots.

Nested drag/droppable code which gets converted to Scheme for the tinyscheme interpreter

A block menu works much like Scratch, allowing you to pick new code blocks (this code is nonsense – just testing!)

The "Hello World" program, displays every 3 seconds  (even when the app is running in the background)

The "Hello World" program, displays every 3 seconds (even when the app is running in the background)

Adventures with i2c

In order to design the next version of the flotsam hardware I need to make it cheaper and easier to build. The existing hardware was very cheap in terms of components, but expensive in terms of the time it took to construct! With this lesson learned, and with a commission on the horizon, I need to find a simpler and more flexible approach to communication between custom hardware and the Raspberry Pi – mainly one that doesn’t require so many wires.

i2c is a serial communication standard used by all kinds of components from sensors to LCD displays. The Raspberry Pi comes with it built in, as the Linux kernel supports it along with various tools for debugging. I also have some Atmel processors from a previous project and there is quite a bit of example code for them. I thought I would post a little account of my troubleshooting on this for others that follow the same path, as I feel there is a lot of undocumented knowledge surrounding these slightly esoteric electronics.

The basic idea of i2c is that you can pass data between a large number of independent components using only two wires to connect them all. One is for data, the other is for a clock signal used for synchronisation.

Debugging LEDs

I started off with the attiny85 processor, mainly as it was the first one I found, along with this very nice and clear library. I immediately ran into a couple of problems: one was that while it has support for serial communication built in (USI), you have to implement i2c on top of this, so your code needs to do a lot more. The second was that with only 8 pins the attiny85 is not great for debugging. I enabled i2c on the Raspberry Pi, hooked it up and ran i2cdetect – no joy. After a lot of fiddling around with pull-up resistors and changing voltages between 3 and 5v, either no devices were detected or all of them were, with all reads returning 0 (presumably logic pulled high for everything) or noise – nothing seemed to make any difference.

After a while (and trying other i2c slave libraries to no avail) I switched to an atmega328 processor using this library, which includes a Makefile! One thing I’ve noticed that would make learning this stuff much easier is more complete toolchains in example code, including the right #defines and fuse settings for the processor. This code didn’t work at first either, and my attempts at debugging with LEDs on PORTB failed until I realised they were conflicting with the UART i/o used in the example code – once I figured out that this wasn’t part of the i2c code I removed it, and the Raspberry Pi could at last see the device with i2cdetect. With the addition of some LEDs I could check that bytes being sent were correctly written to the internal buffer at the correct addresses.

It finally works!

Reading was another matter however. Most of the time i2cget on the Pi failed, and when it did work it only returned 0x65 no matter what the parameters. I’d already read extensively about the Raspberry Pi’s i2c clock stretching bug and applied various fixes, which didn’t seem to make any difference. What did the trick was removing the clock divide on the atmega’s fuses – by default it runs at 8MHz but divides the instruction clock down to 1MHz, and without the divide it could keep up with the Pi’s implementation and all reads were successful.

I still had to solve the ‘0x65 problem’, so I went into the i2c code to try and figure out what was going on (using 8 LEDs to display the i2c status register). It seems that reading single bytes one at a time is done by issuing a TW_ST_DATA_NACK as opposed to TW_ST_DATA_ACK, as a not-acknowledge is sent for the last byte read. This state is not supported by the library, so after fiddling around with it a bit I switched the Pi’s side over to the python smbus library and tried read_i2c_block_data – which reads 32 bytes at a time. The first byte is still 0x65 (101 in decimal, as in the photo above) – but the rest are correct. I’ll need to read a bit more of the i2c protocol to figure that one out (and get the attiny working eventually), but at least it’s enough for what I need now.
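
For anyone following the same path, the Pi side of this final working setup is only a few lines using python smbus – note the device address and buffer offset here are examples, they depend on the slave code:

import smbus

bus = smbus.SMBus(1)    # i2c bus 1 (the earliest Pi models use bus 0)
ADDR = 0x04             # example slave address, set in the atmega code

bus.write_byte_data(ADDR, 0x00, 42)         # write 42 to buffer address 0
data = bus.read_i2c_block_data(ADDR, 0x00)  # always reads 32 bytes
print(data)   # first byte is the mysterious 0x65, the rest are correct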

Handy collection of pinouts

Tangible livecoding tests in the wild, and material as type in programming

Last week I took the flotsam tangible livecoding system to my programming tutoring lesson for some first tests with the real experts. To provide some background, we started a while back with the Raspberry Pi, messing around with the Minecraft API and python, and we’ve recently moved on to laptops and pygame. I arrived with the system set up for the Minecraft building language, and we gave it 10 minutes or so before resuming normal activities – although it did get a bit more use during natural breaks in the lesson. Here’s a recent pic of the weaving l-system setup:

tangibleweaves

First impressions were that it was immediately playful – most of the blocks were eagerly removed and examined before we’d turned the thing on. One problem with this was that the chalk symbols rubbed off very quickly! A similarity with the Scratch programming language was also picked up on fairly soon.

A big issue was getting the connectors the right way round – this is not easy as the blocks are circular with little indication of which way is ‘up’. This could be fixed by altering the shape or cutting a little notch – learning how to manipulate them is part of the point, but right now it’s too difficult. This will also be a big problem in livecoding performance situations.

Using Minecraft, the screen was still a distraction. The connection between what’s happening in the 3D world and with the physical blocks is not obvious enough – even with the LED indicators. The mouse and keyboard also need to be plugged in so we can move the camera around and see things, leading to a few too many things going on at once. I’m not sure how this can be solved with Minecraft, but it indicates that a musical approach (with no screen or any peripherals needed other than speakers) is the way to go.

Overall there is huge potential to think much more about the touch/feel of the blocks. Lots of these problems can be solved by using different shapes for different symbols (removing the need for chalk) with a clear orientation. Using different materials to provide textures and ‘feel’ in order to represent different sounds, or in terms of weaving, using the actual yarn material to represent itself is a huge area to explore.

materials

In the photo above, I’m using thick and thin blue yarn wrapped around the blocks that represent them, and tinfoil for the l-system rule symbols (X, Y and Z) – a kind of material-based type indication. Interestingly the two yarns feel very different even though they look the same in the photo. In use this results in much less checking of the blocks when you pick them up, as your fingers tell you what they are.

Weaving notations

I’ve collected images from our workshops in Leeds and Sheffield as we attempted to understand the intricacies of weaving, particularly on the looms of antiquity.

A simple visualisation of a loom.

Attempting to understand the relationship between lift plans and heddles.

The connection between tablet woven bands, whose weft forms the warp of the warp-weighted loom.

A pattern language, turning into binary pattern, then thinking about the types of patterns the weavers of antiquity favoured.

More tablet/warp weighted calculations, grouping threads by colour.

A professional lift plan, by Leslie Downes.

Learning about thinking and weaving in Leeds and Sheffield

The second week of intense work on weavingcodes/codingweaves took place in the north of England, and began with a talk at the New Mechanics Institute show and tell meeting in Leeds. This was a good place to pick up from the previous week in Denmark, as it included a talk on pixel art – from its early days to contemporary forms – which had many similarities with our slides.

The first half of the week was spent at Leeds University, at the Interdisciplinary Centre for Scientific Research in Music, where we met Tim Ingold, whose books (e.g. Making: Anthropology, Archaeology, Art and Architecture) have been a big influence on the project. One of the things we discussed with Tim was the cyberspace myth – the increasingly problematic idea that computing is intangible and exists in some “other world” (or in a “cloud” somewhere). The need to highlight the physical basis of computation is one that this project is well placed to address – and if we consider weaving to be a form of computation, then it may be able to inspire approaches to programming that have at least three thousand years of history behind them.

Tim also discussed knowledge as movement, and his understanding of tacit knowledge – that which is otherwise unwritten or unspoken. Tacit knowledge is fluid and exists underneath a counterpart, ‘articulated knowledge’, which is fixed by definitions. Culturally we have a bias towards articulated knowledge, and the idea of an unspoken knowledge is something I realise I am trying to deal with in my teaching activities, but have never been able to suitably describe.

On day 2 we met with Leslie Downes – a retired professional weaver whose woven structures have gone into space, and been used in jet engines and bulletproof vests.

Alex and Ellen talking with Leslie Downes

There are many places where woven structures are superior to other kinds of manufacturing, but at the same time there is a general lack of knowledge about weaving technology in engineering. Industrial fabric weaving has focused on increasing speed and production output, while the kinds of looms needed for materials like carbon fibre need to focus on precision instead. Leslie talked of occasions where he had to return to hand looms to try out prototypes, and of how simulations of fibre interactions in weaving exist but are unnecessary and deficient compared with prototyping in the actual materials. In weaving, as in computing, older tends to mean less restricted, as well as slower – new developments add to, rather than eclipse, older ones.

The main feeling I had from Leslie was that his approach to technology came from an ability to reason about material, based on a deep understanding of the capabilities of the looms themselves and how they can be adapted for each job (e.g. a project where he had to cut down the middle of a huge loom to remove a metre or so), mixed with experience of working with people to understand their needs. This generalist approach to knowledge is so lacking in society that it becomes highly regarded.

For the second part of the week we moved to Sheffield, and had breakfast with Luigina Ciolfi, a researcher in Human-Centred Computing. Lui’s input was both valuable and timely, as she pointed out that one of the key elements of interest is the nature of our collaboration, rather than any products we may or may not come up with. As such we need to be careful to take the time to document our process and decision making.

After breakfast we headed into the woods – to a temporary base in The J G Graves Woodland Discovery Centre to shift into plotting and planning mode.

In the woods looking at the kinds of weaves used in antiquity

Emma Cocker joined us again, helping to restrengthen the kairotic and poetic roots of our thinking, and we were visited by the lovebytes crew to discuss the use of weaving and livecoding for teaching children. I set up the flotsam tangible programming prototype for its first ever use by anyone other than me. This involved a bit of code review/pair programming with Ellen in order to fix bugs in the weave generating algorithm.

Ellen programming tangibly

We face challenges with our collaboration in both directions between our practices. On the one hand we have normal software decisions: is what we make going to be online or offline? Do we prioritise flexibility or accessibility in our choices of platforms and languages? How does what we do connect best with Ellen’s existing programming experience?

What we have decided is to work on a series of prototypes rather than focus on a monolithic development. The task then is to make sure we use common language for concepts across the project (perhaps using the concepts in the Ancient Greek as a grounding influence), and similarly to use protocols and formats to share common things between experiments – fitting with existing programs and archives already in use where practical.

In the other direction, much of our discussion revolves around clarifying weaving practicalities and explanations – both of weaving in general and of the concepts and styles in use in antiquity. Both Alex and I need to start weaving if we are going to have a subtle enough understanding of the material we are dealing with. To this effect Alex already has a commission for a warp-weighted loom in the works, and I need to warp up the Harris loom.

We also have some fairly concrete projects to think about over the coming months, for example providing a good live-codable explanation of weaving to replace Ellen’s previous windows program as well as looking into the possibilities of using tangible programming in an art installation.

Future plans

Flotsam: A prototype screenless livecoding language

Two languages are now running on Flotsam, the new name for the prototype screenless tangible programming language I’ve been building (it’s largely made from driftwood, hence the name). It’s somehow already been featured on the Adafruit blog!

The circuit seems to be fully debugged now, with short circuits fixed – which took a little while and more than a little frustration :) The Raspberry Pi python code is currently on the weavingcodes repository (more on this project on the kairotic site), and the first language is a declarative-style L-system for describing weave structure and pattern with yarn width and colour. The LEDs indicate that evaluation happens simultaneously, as this is a functional language. The blocks represent blue and pink yarn in two widths, with rules to produce the warp/weft sequence based on the rows the blocks are positioned on.
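
To make the L-system idea concrete, here’s a rough Python illustration of this kind of rule expansion – the symbols and rules below are invented for the example, not the actual block language:

# expand an axiom by rewriting each symbol with its rule per
# generation; terminal symbols (the yarns) pass through unchanged
def expand(axiom, rules, generations):
    for _ in range(generations):
        axiom = "".join(rules.get(symbol, symbol) for symbol in axiom)
    return axiom

# hypothetical symbols: 'b'/'B' thin/thick blue, 'p'/'P' thin/thick pink
rules = {"X": "bXP", "Y": "BpY"}
print(expand("XY", rules, 3))   # => bbbXPPPBpBpBpY

Reading the result off as a warp/weft sequence is then just a matter of mapping each terminal symbol to a yarn.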

The weaving simulation is written in pygame (which I’ve been using lately for teaching), and is deliberately designed to allow alternative weave structures to those possible with Jacquard looms, by including yarn properties. The version in the video is plain weave, but more complex structures can be defined as below – in the same way as Alex’s gibber software:

star

The second language is completely different – an imperative, stack-based language for building shapes in Minecraft by driving a turtle in 3D space. Eventually (when I’ve manufactured a few more programming blocks) it will be possible to change Minecraft block materials and react to player actions. Here the LEDs indicate the more sequential evaluation of this Forth-like language:

All that’s needed to switch languages is to redraw the symbols with chalk and run a different script. It won’t be truly screenless until I write a musical language for it, which is obviously coming soon…
