Category Archives: Visual Programming

More kite based UAV toolkit action

Some further kite testing with the UAV toolkit last week at Gwithian beach, with a strong offshore wind (and some rather good-looking surf too). We managed to max out the kite's altitude and got some great photographs, including surfers and flocks of birds. See the previous kite post for more details on the kite.

photo-1428496974-445621

photo-1428500394-816255

photo-1428497002-687142

I’ve also started experimenting with combining the sensor data with the images to produce geo-referenced images in GeoTIFF format. This uses the magnetometer data to orient each image and the GPS for position. It's still work in progress, with quite a lot of coordinate conversion involved, all using the GDAL suite of tools with Python to glue everything together:

kitewalk
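For the curious, here is a rough sketch of that georeferencing step, assuming one magnetometer heading and one GPS fix per photo. It uses the GDAL Python bindings; the function and parameter names are mine for illustration, not from the actual toolkit code:

    import math
    from osgeo import gdal, osr

    def write_geotiff(src_jpeg, dst_tiff, lon, lat, heading_deg, ground_width_m):
        src = gdal.Open(src_jpeg)
        width = src.RasterXSize
        # rough metres-per-degree at this latitude (fine for small areas)
        m_per_deg_lat = 111320.0
        m_per_deg_lon = m_per_deg_lat * math.cos(math.radians(lat))
        pixel_m = ground_width_m / width      # metres covered by one pixel
        theta = math.radians(heading_deg)     # rotate the image to the compass heading
        # affine geotransform: origin, then rotated pixel sizes in degrees
        gt = [lon,  pixel_m * math.cos(theta) / m_per_deg_lon,
                    pixel_m * math.sin(theta) / m_per_deg_lon,
              lat,  pixel_m * math.sin(theta) / m_per_deg_lat,
                   -pixel_m * math.cos(theta) / m_per_deg_lat]
        dst = gdal.GetDriverByName("GTiff").CreateCopy(dst_tiff, src)
        dst.SetGeoTransform(gt)
        srs = osr.SpatialReference()
        srs.ImportFromEPSG(4326)              # WGS84 lat/lon, as the phone reports it
        dst.SetProjection(srs.ExportToWkt())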

Kite mapping with UAV toolkit

Some photos taken by the UAV toolkit on a recent flight at our Gyllyngvase beach test site, using a KAP Foil 1.6 kite instead of a drone. Kites have many advantages: no flight licence required, no vibration from motors, and a fully renewable power source!

photo-1427901150-33648

We’re using a 3D printed mounting plate for the phone, strung from the top of the single line just below the kite. It needs more wind than we had to reach higher altitudes, but first impressions are good. I’ve also added a new trigger mode to the UAV toolkit programming language that remembers the GPS coordinates of all the photos taken, so it can build up overlapping images even when the movement is harder to control.

photo-1427899695-642548

photo-1427900271-271541

The tail of the kite – which turned out to be important for stabilising the flight.

photo-1427900189-120468

photo-1427900242-710290

Here is the code using the when-in-new-location trigger to calculate overlap based on the camera angle, GPS position and altitude – which ideally should also be driven by the length of the line. As an aside, this screenshot was taken in the Chrome browser, which can now run Android apps.

newlocation
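The gist of the trigger, as a hedged Python sketch (the real version is a block program; all names here are illustrative): remember where every photo was taken, and only fire when the phone has moved far enough from all of them.

    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        # distance in metres between two GPS fixes
        r = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = p2 - p1
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    photo_positions = []

    def in_new_location(lat, lon, min_spacing_m):
        # fire the camera only if we are min_spacing_m from every previous photo
        if all(haversine_m(lat, lon, p[0], p[1]) > min_spacing_m
               for p in photo_positions):
            photo_positions.append((lat, lon))
            return True
        return False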

Test flight day!

View of ground control from a OnePlus phone mounted on a Y6 UAV:

photo-1421927742-36242

A report from the first flight test of the new UAV Android software with the Exeter University UAV science group. We had two aircraft: a nice battle-hardened fixed-wing RC plane and a very futuristic 3D Robotics RTF Y6. We also had two phones for testing, an old, cheap Acer Liquid Glow E330 (the old lobster phone) and a new, expensive OnePlus One A0001. Both were running the same version (0.2) of the visual programming toolkit, which I quietly released yesterday.

IMG_20150122_104141

Here is the high-tech mounting solution for the RC aircraft. There were a lot of problems with the Acer; I'm not sure if the GPS triggering was happening too fast or if there is a problem with this particular model of phone, but the images appear to be corrupted and to be overwriting each other (none of which happened in prior testing, of course :) Despite this there were some OK shots, though with a lot of vibration from the petrol motor during acceleration.

photo-1421923592-621148

photo-1421923613-98013

We tried two types of program running on the phones: one triggered photos with a simple timer, the other used GPS distance, altitude and camera angle to calculate an overlapping coverage. Both seemed to work well, although I need to go through the sensor data for each image to check the coverage by positioning the images using the GPS. One thing I was worried about was the pitch and yaw of the aircraft, but the Y6 was extremely stable in this respect, as was the altitude, which can be held automatically at a set height.

The vibration seemed less of a problem on the Y6, but on one of the flights the power button got pressed, bringing up the keylock screen, which annoyingly prevents the camera from working. We did, however, capture lots of sensor data – accelerometer, magnetometer, orientation and gravity – with no problems on the Acer.

The OnePlus phone worked pretty flawlessly overall, and we left it till last as it’s a bit less expendable! It’s possible to mount phones easily underneath the batteries on the Y6 without the need for tape, which looks a bit more professional:

IMG_20150122_114937

We still have problems with vibration, which seems to cause the bands of fuzziness (see the top and bottom photos), so things to look at next include:

  1. Cushioning for the phone (probably just a small bit of foam).
  2. Reproducing and fixing the Acer camera problem.
  3. Some kind of audio indication from the phone that the camera is working etc.
  4. Try again to lock the keys on the phone or override the key lock screen.
  5. More camera controls, e.g. overriding and locking the exposure.
  6. Output raw files instead of running the JPEG compression in the air! Compression seems to take longer than actually taking the photo, and we don't care about space on the SD card.

IMG_20150122_113543

photo-1421927801-544702

Livecoding UAVs for environmental research

Some screenshots of the UAV livecoding visual programming language. Weather being on our side, we're planning some test flights later this week! The first program uses GPS to take photos with an overlap of 50% at 300 metres altitude, based on the vertical camera angle as reported by the device. It assumes that the flight orientation is level:

Screenshot_2015-01-20-23-17-49
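The geometry behind this is simple enough to sketch in a few lines of Python, assuming a level pinhole camera (the 48 degree angle below is an example value, not the real device's):

    import math

    def photo_spacing_m(altitude_m, vertical_camera_angle_deg, overlap):
        # ground footprint of one photo along the flight direction
        footprint = 2 * altitude_m * math.tan(math.radians(vertical_camera_angle_deg) / 2)
        # distance to travel between photos for the requested overlap
        return footprint * (1 - overlap)

    # e.g. 50% overlap at 300m with a 48 degree vertical camera angle
    print(photo_spacing_m(300, 48, 0.5))  # roughly 133m between photos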

The blocks are all drag and drop and get converted into Scheme code, which is run by a modified tinyscheme interpreter. The code can be saved and loaded, and I'm planning to make it possible for people to share code via email.

This is a simpler program, which takes a photo every 3 seconds and records a handful of sensor data to the database:

Screenshot_2015-01-20-23-18-44
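In pseudo-Python the whole of that second program amounts to something like this (take_photo and read_sensors are stand-ins for the real device calls, and the table layout is invented):

    import sqlite3, time

    def take_photo():
        pass  # stand-in for the device camera call

    def read_sensors():
        # stand-in for a snapshot of the accelerometer etc.
        return {"accel": "0,0,9.8", "magnet": "0,0,0", "orientation": "0,0,0"}

    db = sqlite3.connect("flight-log.db")
    db.execute("CREATE TABLE IF NOT EXISTS readings "
               "(t REAL, accel TEXT, magnet TEXT, orientation TEXT)")

    while True:
        take_photo()
        s = read_sensors()
        db.execute("INSERT INTO readings VALUES (?,?,?,?)",
                   (time.time(), s["accel"], s["magnet"], s["orientation"]))
        db.commit()
        time.sleep(3)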

At the bottom you can see a squashed camera preview. I've tried various approaches (hiding it, scaling it to 0 pixels etc.) but Android requires that a preview exists somewhere in order to take a photo properly. You can also view the recorded data on the device, for checking. There is a 'flight mode' which locks and turns off the screen and ignores all button events; on some phones you need to take out the battery to stop the program running, but unfortunately on others you can still use the power button to close it.

Screenshot_2015-01-21-01-09-56

Visual programming for environmental research with UAVs

ua3

I’ve recently begun a new project with Karen Anderson, who runs the UAV research group at the Exeter University Environment and Sustainability Institute. We're looking at using commodity technology like Android phones for environmental research with drones. Ecology research groups and environmental agencies have started using drones as a replacement for expensive and risky light aircraft when gathering data on changes to landscapes due to climate change and erosion. How can we make tools that are simpler and cheaper for them to set up and use? Can our software also be relevant for children using kites to make their own maps of cities, or for farmers wishing to record changes to their own fields?

Enumerating and displaying all the sensors on a phone

This is a more open-ended project than our previous environmental and behavioural projects, so we're able to approach it from an R&D perspective in relation to the technology. One of the patterns I've noticed with this kind of work is that after providing scientists with something that meets their immediate needs, it inspires a ton of new ideas and directions – and I become a bottleneck. Ideally I need to provide something that allows them to build things themselves once they understand all the possibilities. Adapting to needs 'in the field' is also an important aspect of the kind of work they do, which can take place in remote locations anywhere in the world.

Some time ago I had a go at porting my musical livecoding language Scheme Bricks to Android for the Open Sauces project. I'm now applying it as a way of configuring sensor data acquisition and recording via a drag/drop visual programming language. It's early days yet – I'm still debugging the (actually rather amazing) Android drag/drop API – but here are some initial screenshots.

Nested drag/droppable code which gets converted to Scheme for the tinyscheme interpreter
A block menu works much like Scratch, allowing you to pick new code blocks (this code is nonsense – just testing!)
The "Hello World" program, which displays every 3 seconds (even when the app is running in the background)

Weaving notations

I’ve collected images from our workshops in Leeds and Sheffield as we attempted to understand the intricacies of weaving, particularly the looms of antiquity.

A simple visualisation of a loom.

Attempting to understand the relationship between lift plans and heddles.

The connection between tablet-woven bands, whose weft forms the warp of the warp-weighted loom.

A pattern language, turning into binary patterns, then thinking about the types of patterns the weavers of antiquity favoured.

More tablet/warp-weighted calculations, grouping threads by colour.

A professional lift plan, by Leslie Downes.

Flotsam: A prototype screenless livecoding language

Two languages are now working on Flotsam, the new name for the prototype screenless tangible programming language I've been building (the name comes from the fact that it's largely made from driftwood). It's somehow already been featured on the Adafruit blog!

The circuit seems to be fully debugged now, with the short circuits fixed – which took a little while and more than a little frustration :) The Raspberry Pi Python code is currently on the weavingcodes repository (more on this project on the kairotic site), and the first language is a declarative-style L-system for describing weave structure and pattern with yarn width and colour. The LEDs indicate that the evaluation happens simultaneously, as this is a functional language. The blocks represent blue and pink yarn in two widths, with rules to produce the warp/weft sequence based on the rows the blocks are positioned on:
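Not the actual Flotsam code, but the idea of the language can be caricatured in a few lines of Python: rules (one per row of blocks) rewrite a seed symbol into a sequence of yarns, each carrying a colour and width. The rules and symbols below are invented:

    rules = {
        "A": ["blue-thick", "pink-thin", "A", "B"],  # each row of blocks is a rule
        "B": ["pink-thick", "blue-thin"],
    }

    def expand(symbols, depth):
        # recursively rewrite rule symbols, keeping only yarns at the end
        if depth == 0:
            return [s for s in symbols if s not in rules]
        out = []
        for s in symbols:
            out.extend(expand(rules[s], depth - 1) if s in rules else [s])
        return out

    print(expand(["A"], 3))  # the warp/weft yarn sequence for the simulation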

The weaving simulation is written in pygame (which I've been using lately for teaching), and is deliberately designed to produce alternative weave structures to those possible with Jacquard looms, by taking yarn properties into account. The version in the video is plain weave, but more complex structures can be defined as below – in the same way as Alex's gibber software:

star
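For reference, plain weave is the simplest possible interlacement. Sketched as a binary matrix in Python (1 where the warp thread lifts over the weft), more complex structures such as twills are just different matrices:

    def plain_weave(warp_threads, weft_rows):
        return [[(x + y) % 2 for x in range(warp_threads)]
                for y in range(weft_rows)]

    for row in plain_weave(8, 4):
        print("".join("#" if up else "." for up in row))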

The second language is completely different: an imperative, stack-based language for driving a turtle in 3D space to build shapes in Minecraft. Eventually (when I've manufactured a few more programming blocks) it will be possible to change Minecraft block materials and react to player actions. Here the LEDs indicate the more sequential evaluation of this Forth-like language:
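The flavour of a stack-based turtle language is easy to sketch in Python – words pop their arguments from a stack and move the turtle, leaving a trail of block positions behind. The words below are invented; the real block symbols differ:

    turtle_pos = [0, 0, 0]
    turtle_dir = [1, 0, 0]   # current heading, one block per step
    stack = []
    placed = []              # block positions we would set in Minecraft

    def forward():
        for _ in range(stack.pop()):
            for i in range(3):
                turtle_pos[i] += turtle_dir[i]
            placed.append(tuple(turtle_pos))

    def turn():              # rotate 90 degrees in the ground plane
        turtle_dir[0], turtle_dir[2] = -turtle_dir[2], turtle_dir[0]

    words = {"forward": forward, "turn": turn}

    def run(program):
        for word in program.split():
            if word.isdigit():
                stack.append(int(word))  # literals push themselves
            else:
                words[word]()            # words execute immediately

    run("4 forward turn 4 forward turn 4 forward turn 4 forward")
    print(placed)  # traces a square of block positions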

All that’s needed to switch languages is to redraw the symbols with chalk and run a different script. It won’t be truly screenless until I write a musical language for it, which is obviously coming soon…

IMG_20141016_142021

Neural Network livecoding and retrofitting ZX Spectrum hardware

An experimental, and quite angry, neural network livecoding synth (with an audio 'weave' visualisation) for the ZX Spectrum: source code and TZX file (for emulators). It's a bit hard to make out in the video, but you can move around the 48 neurons and modify their synapses and trigger levels. There are two clock inputs, and the audio output is the purple neuron at the bottom right. It allows recurrent loops as a form of memory, and some quite strange things are possible. The keys are:

  • w,d: move diagonally north west <-> south east
  • s,e: move diagonally south west <-> north east
  • t,y,g,h: toggle incoming synapse connections for the current neuron
  • space: change the ‘threshold’ of the current neuron (bit-shifts it left)
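I'd guess at an update rule along these lines (a speculative Python sketch, not the Spectrum code – the neuron count and output position match the description above, everything else is assumed):

    import random

    N = 48
    state = [0] * N
    threshold = [2] * N   # 'space' bit-shifts the current neuron's threshold left
    # synapses[i] lists the neurons feeding neuron i (toggleable with t,y,g,h)
    synapses = [[random.randrange(N) for _ in range(3)] for _ in range(N)]

    def step(clock_a, clock_b):
        state[0], state[1] = clock_a, clock_b   # the two clock inputs
        nxt = [1 if sum(state[j] for j in synapses[i]) >= threshold[i] else 0
               for i in range(N)]
        state[:] = nxt
        return state[N - 1]                     # the audio output neuron

    for t in range(16):
        print(step(t % 2, 1 if t % 3 == 0 else 0))  # crude clock patterns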

This audio should load up on a real ZX Spectrum:

One of the nice things about tech like this is that it's easily hackable. This is a modification to the video output (better explained here): with a tiny bit of soldering you can get a standard analogue video signal by connecting the internal feed directly to the plug and detaching the TV signal demodulator. Look at all those discrete components!

IMG_20140809_125815

Scratch -> Lego Mindstorms

A bit of hardware hacking for Troon Primary CodeClub, who have tons of old-style Lego Mindstorms they don't use any more, and after a year of Scratch programming on their PCs are just getting started with Raspberry Pi. We're using this Scratch modification together with hardware I'm making, based on this circuit. The main component is an L293D motor controller IC, which can drive two DC motors in both directions. You can write the hardware code in Scratch like this to control the Lego motors:

blink11
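Driving one motor from the Pi side looks roughly like this (a sketch using the RPi.GPIO library; the pin numbers are examples and depend on how the L293D is wired to the GPIO header):

    import RPi.GPIO as GPIO

    IN1, IN2, EN = 17, 27, 22   # example BCM pins: direction pair plus enable

    GPIO.setmode(GPIO.BCM)
    for pin in (IN1, IN2, EN):
        GPIO.setup(pin, GPIO.OUT)

    def motor(direction):
        # direction: 1 forwards, -1 backwards, 0 stop
        GPIO.output(IN1, direction == 1)
        GPIO.output(IN2, direction == -1)
        GPIO.output(EN, direction != 0)  # the enable pin gates the H-bridge

    motor(1)   # forwards
    motor(0)   # stop
    GPIO.cleanup()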

The trickiest part of this whole endeavour has been physically connecting to the Mindstorms motors. At the moment I'm using crocodile clips, which won't last long in normal classroom conditions – but I'm wary of destroying/modifying the connectors as they're not made any more…

IMG_20140317_111808

Open Sauces: A structured recipe notebook

It’s been busy in the Android department this week. Here's a prototype recipe notebook for Open Sauces. At this point it's primarily a test of a drag/drop interface, to make it possible for anyone to build the recipe graphs we explored visualising in October. I'm testing it with the recipes designed for FoAM's recent Smoke and Vapour event.

Currently it's structured a bit like Scheme Bricks, where you 'read it' inside -> outside and right to left. We need to look at ways to reverse this – it might be as simple as making everything float to the right, or something more like a bottom-up tree, where the finished dish is at the bottom rather than the top.
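As a toy sketch of the bottom-up idea in Python (the recipe here is invented): print each ingredient subtree before its parent step, so the finished dish lands at the bottom:

    recipe = ("smoked butter sauce",
              [("melt", [("butter", [])]),
               ("reduce", [("stock", []), ("cider", [])])])

    def print_bottom_up(node, depth=0):
        name, children = node
        for child in children:
            print_bottom_up(child, depth + 1)
        print("  " * depth + name)  # each step prints after its ingredients

    print_bottom_up(recipe)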

Screenshot_2013-12-14-07-39-46

10747546824_664a0d3318_b