Category Archives: android

The UAV toolkit & appropriate technology

The UAV toolkit’s second project phase is now complete. The first development sprint at the start of the year was research into what we could use an average phone’s sensors for, resulting in a proof-of-concept remote sensing android app that let you visually program different scripts, which we then tested on some drones, a radio controlled plane and a kite.

photo-1444743618-504214

This time we had a specific focus on environmental agencies; working with Katie Threadgill at the Westcountry Rivers Trust has meant we’ve had to think about how this could be used by real people in an actual setting (farm advisors working with local farmers). Making something cheap, open source and easy to use, yet open ended, has been the focus – and we are now looking at providing WRT with a complete toolkit comprising a drone (for good weather), a kite (for bad weather/no flight licences required) and an android phone, so they don’t need to worry about destroying their own if something goes wrong. Katie has produced this excellent guide on how the app works.

The idea of appropriate technology has become an important philosophy for projects we are developing at Foam Kernow, in conjunction with unlikely connections in livecoding and our wider arts practice. For example, in the Sonic Bike project we restricted the technology from the start so that no ‘cloud’ network connections are required and all the data and hardware has to fit on the bike – with no data “leaking” out.

photo-1444742698-352249

With the UAV toolkit, the open endedness of providing a visual programming system that works on a touchscreen results in an application flexible enough to be used in ways and places we can’t predict – for example in crisis situations, where power, networking or hardware are not available to set up remote sensing devices when you need them most. We are working towards a self contained system, and what I’ve found interesting is how many interface and programming ‘guidelines’ I have to bend to make this possible – open endedness is very much against the grain of contemporary software design philosophy.

The “app ecosystem” is ultimately concerned with elevator pitches – doing one thing, and boiling it down to the fewest actions possible to achieve it. This is not a problem in itself, but the assumption that this is the only philosophy worth considering is wrong. One experience that comes to mind recently is having to make and upload banner images of an exact size to the Play Store before it would allow me to release an important fix needed for Mongoose 2000, which is only ever intended to have 5 or 6 users.

photo-1444743271-137365

For the UAV toolkit, our future plans include stitching together photos captured on the phone to produce a single large map, without the need to use any other software on a laptop. There are also interesting possibilities regarding distributed networking with bluetooth and similar radio systems – for example, a way of sending code to different phones is needed, as there is currently no way to distribute scripts amongst users. This could also be a route to distributed processing – controlling one phone in a remote location from another via code sent over ad hoc wifi or SMS, for example.

Airborne drag-drop programming, the next steps

This autumn we are continuing work on the UAV toolkit with Karen Anderson and her research group at the Environment and Sustainability Institute. This time we have a mission to help the Westcountry Rivers Trust by coming up with fast and cheap ways they can build maps of farms to determine water run-off problems, which gives farmers the proof they need to get funding to fix pollution issues.

IMG_20151014_154601
Flight planning operations and photo checking

In order to make the software usable in this case, we decided on two directions. On the one hand there needs to be a simple way to start and stop programs (or “flight modes”) that read sensor data, as well as to define certain global settings, e.g. flight altitude, desired image coverage etc. At the same time, the code defining what these do needs to remain programmable in the app – and more complex behaviours need to be possible to support both kites and UAVs. Our philosophy is that it has to be open ended, as we don’t know where the toolkit might be useful (e.g. crisis mapping situations) or what new sensors will be available on a device in the future.

Screenshot_2015-10-19-21-48-01
The new main screen
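As a rough illustration of this split (the names below are invented for the example, not the app’s real primitives), global settings can simply be top level definitions, and a “flight mode” an ordinary Scheme function that reads them:

    (define flight-altitude 100)   ;; metres
    (define image-overlap 0.5)     ;; desired fraction of image overlap

    ;; A flight mode is just a function; this one only reports what it
    ;; would do, standing in for real sensor and camera calls.
    (define (kite-mapping-mode)
      (display "kite mapping at ")
      (display flight-altitude)
      (display "m, overlap ")
      (display image-overlap)
      (newline))

    ;; The front screen would only ever start or stop something like this:
    (kite-mapping-mode)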

One specific set of new behaviours we need is for kite mapping. We already have the ability to choose when to take pictures based on GPS and altitude, but with a kite there can be a lot of turbulence and the camera is in a much less controlled state, flipping around and taking shots of the sky etc. So we need to calculate things like jerk from the change in acceleration, and use the orientation sensors to only take photos when the lens is pointing directly down, within some acceptable margin.
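To make the turbulence check concrete, here is a minimal sketch in plain Scheme (not the toolkit’s generated code) of estimating jerk from two successive accelerometer samples and treating the camera as steady only while it stays below a threshold:

    ;; Sensor readings are passed in as (x y z) lists of m/s^2 values;
    ;; dt is the time between the two samples in seconds.
    (define (magnitude v)
      (sqrt (apply + (map * v v))))

    ;; jerk ~= |a2 - a1| / dt
    (define (jerk accel-prev accel-now dt)
      (/ (magnitude (map - accel-now accel-prev)) dt))

    (define (steady? accel-prev accel-now dt threshold)
      (< (jerk accel-prev accel-now dt) threshold))

    ;; Example: two samples 0.1 seconds apart
    (display (steady? '(0.1 0.2 9.8) '(0.3 -0.1 9.6) 0.1 10)) (newline)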

Below is a section of the code that calculates whether we are pointing down using the magnetometer and accelerometer – the drag drop visual code can now be used to build normal Scheme functions using a touchscreen (a bit like scheme bricks). In fact I managed to do all of this work on the phone. There are now two types of code: the main programs or “flight modes” that you can run from the front screen, and a library of editable functions which they use. This means there are now three levels at which the software can be used – using it without needing to see any of the code at all, editing the basic behaviour such as which sensors’ data are captured, and finally modifying the more detailed code to make it do completely new things.

Screenshot_2015-10-19-22-08-12
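For comparison with the screenshot, here is a simplified pointing-down test written directly in Scheme, using only the accelerometer’s gravity vector (the actual function also brings in the magnetometer); the idea is just to measure how far the measured gravity direction is from the device’s z axis:

    (define pi 3.141592653589793)

    (define (magnitude v)
      (sqrt (apply + (map * v v))))

    ;; Angle in degrees between the measured acceleration vector and the
    ;; device z axis (0 means the phone is flat, camera facing the ground).
    (define (tilt-from-vertical accel)
      (* (/ 180 pi)
         (acos (/ (abs (list-ref accel 2))
                  (magnitude accel)))))

    ;; Accept a shot only when within an angular margin of straight down.
    (define (pointing-down? accel margin)
      (< (tilt-from-vertical accel) margin))

    ;; Example: nearly flat, with a 10 degree margin
    (display (pointing-down? '(0.4 0.7 9.7) 10)) (newline)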

Mongoose 2000 version 2

Mongoose 2000 version 2 is now being used in the Banded Mongoose Research Project Fieldsite on the Mweya Peninsula, in the Queen Elizabeth National Park, western Uganda.

We’ve added two new focal observations – where a single mongoose in a specific life stage is followed and has its activity recorded for 20 minutes. These observations include different events that can happen (fighting or cooperating with other individuals etc). Nearly all the interfaces are shown below – the system includes adding new packs or individuals, data review and synchronisation with other tablets via the Raspberry Pi.

interfaces-s

Mongoose 2000 in the wilds of Uganda

One of our most ambitious projects, Mongoose 2000, is now up and running after 6 months of testing. This is a Raspberry Pi and Android tablet system to synchronise and store masses of data for a long running behavioural experiment recording the activities of packs of mongooses at the field site in Uganda. The broad aims of the project are to study mongoose behaviour in order to understand the evolution of society.

use

The timespan that we are working with is long, and the location – while not as remote as it could be (there is some internet access and power) – required some consideration, so this was a project where we really needed to employ an appropriate technology approach, which is manifested in various ways:

Open source software: means that Foam Kernow are not a bottleneck to continued development, as we do not have exclusive control over the source code (which is released into the commons). New developers who do not need to start from nothing can be found if required (for whatever reason) – this gives the research team more control and future proofing.

Use of commodity hardware: it’s likely that the hardware in use will become obsolete over this timeframe. The lifespan of android software should mean it can be installed on compatible devices for a long time, and the team can make use of advances in sensor or battery technology. The Raspberry Pi we are using is already an older model now, but as it’s a standard linux setup we can easily move it to other machines in the future. Currently it acts as an ‘appliance’ which just needs to be turned on, but we can add a web interface to control it from the tablet – or eventually replace it with a peer to peer synchronisation system.

The Ugandans working on the project have a healthy DIY relationship with technology: they expect to be able to repair or modify things themselves, and I’d like to figure out ways we can work with this more. The UAV toolkit project provides some indication of what can be done with programming these kinds of devices in the field. Part of the reasoning behind the hardware decisions (and the design of the software, e.g. using a Scheme interpreter) was to use devices that are open to a more end-user programming approach in the future.

use2

use3

Synchronising a tablet with observations previously recorded on the Raspberry Pi:

use4

Kite mapping with UAV toolkit

Some photos taken by the UAV toolkit on a recent flight at our Gyllyngvase beach test site, using a KAP foil 1.6 kite instead of a drone. Kites have many advantages: no flight licences required, no vibration from engines, and a fully renewable power source!

photo-1427901150-33648

We’re using a 3D printed mounting plate for the phone strung from the top of the single line just below the kite. It needs more wind than we had to get higher altitudes but the first impressions are good. I’ve also added a new trigger mode to the UAV toolkit programming language that remembers the GPS coordinates where all the photos are taken, so it can build up overlapping images even if the movement is harder to control.
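The idea behind the new trigger mode looks roughly like this (an illustrative sketch rather than the toolkit’s actual code): keep a list of the positions where photos were taken, and only fire again once the kite has drifted far enough from all of them:

    (define pi 3.141592653589793)
    (define earth-radius 6371000) ;; metres

    ;; Approximate distance in metres between two (lat lon) points -
    ;; an equirectangular approximation, fine over a field site.
    (define (gps-distance a b)
      (let* ((lat1 (* (/ pi 180) (car a)))
             (lat2 (* (/ pi 180) (car b)))
             (dlat (- lat2 lat1))
             (dlon (* (/ pi 180) (- (cadr b) (cadr a))))
             (x (* dlon (cos (/ (+ lat1 lat2) 2)))))
        (* earth-radius (sqrt (+ (* dlat dlat) (* x x))))))

    (define photo-positions '())

    ;; True when we are at least 'spacing' metres from every recorded shot.
    (define (new-location? pos spacing)
      (or (null? photo-positions)
          (> (apply min (map (lambda (p) (gps-distance pos p)) photo-positions))
             spacing)))

    (define (record-photo-position! pos)
      (set! photo-positions (cons pos photo-positions)))

    ;; Example: take a photo here if we are 20m or more from previous ones
    (if (new-location? '(50.143 -5.069) 20)
        (record-photo-position! '(50.143 -5.069)))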

photo-1427899695-642548

photo-1427900271-271541

The tail of the kite – which turned out to be important for stabilising the flight.

photo-1427900189-120468

photo-1427900242-710290

Here is the code using the when-in-new-location trigger to calculate overlap based on the camera angle, GPS and altitude – which ideally should be driven somehow by the length of the line. As an aside, this screenshot was taken in the Chrome browser, which now runs android apps.

newlocation

Test flight day!

View of ground control from a OnePlus phone mounted on a Y6 UAV:

photo-1421927742-36242

A report from the first flight test of the new UAV android software with the Exeter University UAV science group. We had two aircraft: a nice battle hardened fixed wing RC plane and a very futuristic 3D Robotics RTF Y6. We also had two phones for testing, an old cheap Acer Liquid Glow E330 (the old lobster phone) and a new, expensive OnePlus One A0001. Both were running the same version (0.2) of the visual programming toolkit, which I quietly released yesterday.

IMG_20150122_104141

Here is the high-tech mounting solution for the RC aircraft. There were a lot of problems with the Acer – I’m not sure if the GPS triggering was happening too fast or if there is a problem with this particular model of phone, but the images appear to be corrupted and to have overwritten each other (none of this happened in prior testing, of course :) Despite this there were some OK shots, but a lot of vibration from the petrol motor during acceleration.

photo-1421923592-621148

photo-1421923613-98013

We tried two types of programs running on the phones: one triggered photos with a simple timer, the other used GPS distance, altitude and camera angle in order to calculate an overlapping coverage. Both seemed to work well, although I need to go through the sensor data for each image to check the coverage by positioning the images using the GPS. One thing I was worried about was the pitch and yaw of the aircraft – but with the Y6 this was extremely stable, along with the altitude, which can be controlled automatically at a set height.

The vibration seemed less of a problem on the Y6, but on one of the flights the power button got pressed, bringing up the keylock screen, which annoyingly prevents the camera from working. We did however capture lots of sensor data – accelerometer, magnetometer, orientation and gravity – with no problems on the Acer.

The OnePlus phone worked pretty flawlessly overall, and we left it till last as it’s a bit less expendable! It’s possible to mount phones easily underneath the batteries on the Y6 without the need for tape, which looks a bit more professional:

IMG_20150122_114937

We still have problems with vibration, which seems to cause the bands of fuzziness (see the bottom and top photos), so things to look at next include:

  1. Cushioning for the phone (probably just a small bit of foam).
  2. Reproducing and fixing the Acer camera problem.
  3. Some kind of audio indication from the phone that the camera is working etc.
  4. Try again to lock the keys on the phone or override the key lock screen.
  5. More camera controls, override and lock the exposure.
  6. Output raw files instead of running the jpeg compression in the air! This seems to take longer than actually taking the photo, and we don’t care about space on the sdcard.

IMG_20150122_113543

photo-1421927801-544702

Livecoding UAVs for environmental research

Some screenshots of the UAV livecoding visual programming language. Weather being on our side, we’re planning some test flights later this week! The first program uses GPS to take photos with an overlap of 50% at 300 metres altitude, based on the vertical camera angle as reported by the device. It assumes that the flight orientation is level:

Screenshot_2015-01-20-23-17-49
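The arithmetic behind that first program can be sketched like this (the 60 degree view angle is just an assumed value for illustration; the app reads the real vertical camera angle from the device): the ground covered by one level, downward-pointing photo follows from the altitude and the camera angle, and the 50% overlap fixes how far apart successive photos should be.

    (define pi 3.141592653589793)

    (define (degrees->radians d) (* d (/ pi 180)))

    ;; Ground distance covered by a single nadir photo at a given altitude.
    (define (ground-coverage altitude view-angle)
      (* 2 altitude (tan (/ (degrees->radians view-angle) 2))))

    ;; Distance to travel between shots for a given overlap fraction.
    (define (photo-spacing altitude view-angle overlap)
      (* (ground-coverage altitude view-angle) (- 1 overlap)))

    ;; Example: 300m altitude, 60 degree vertical view angle, 50% overlap
    ;; -> roughly 173 metres between photos.
    (display (photo-spacing 300 60 0.5)) (newline)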

The blocks are all drag and drop and get converted into Scheme code which is run by a modified tinyscheme interpreter. The code can be saved and loaded, and I’m planning to make it possible for people to share code via email.
This is a simpler program which takes a photo every 3 seconds and records a handful of sensor data to the database:

Screenshot_2015-01-20-23-18-44
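Roughly, the Scheme generated for a program like this has the shape below – take-photo! and record-sensors! are stand-ins here, stubbed so the sketch runs on its own, since the toolkit’s real primitives aren’t shown in full; in the app the body is re-run by a 3 second timer trigger rather than a plain loop.

    (define (take-photo!)
      (display "click") (newline))

    (define (record-sensors!)
      (display "record accelerometer, gps and orientation to the database")
      (newline))

    ;; In the app this body would be fired by a 3 second timer; here we
    ;; just call it a fixed number of times to show the structure.
    (define (run-timer-program times body)
      (if (> times 0)
          (begin
            (body)
            (run-timer-program (- times 1) body))))

    (run-timer-program 5
      (lambda ()
        (take-photo!)
        (record-sensors!)))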

At the bottom you can see a squashed camera preview – I’ve tried various approaches (hiding, scaling to 0 pixels etc) but android requires that there is a preview somewhere in order to take a photo properly. You can view the recorded data on the device too, for checking. There is also a ‘flight mode’ which locks and turns off the screen, and ignores all button events. On some phones you need to take out the battery to stop the program running but unfortunately on others you can still use the power button to close the program.

Screenshot_2015-01-21-01-09-56

Visual programming for environmental research with UAVs

ua3

I’ve recently begun a new project with Karen Anderson who runs the UAV research group at the Exeter University Environment and Sustainability Institute. We’re looking at using commodity technology like android phones for environmental research with drones. Ecology research groups and environmental agencies have started using drones as a replacement for expensive and risky light aircraft for gathering data on changes to landscapes due to climate change and erosion. How can we make tools that are simpler and cheaper for them to set up and use? Can our software also be relevant for children using kites in cities for making their own maps, or farmers wishing to record changes to their own fields themselves?

Enumerating and displaying all the sensors on a phone

This is a more open ended project than our previous environmental and behavioural projects, so we’re able to approach it with an R&D perspective in relation to the technology. One of the patterns I’ve noticed with this kind of work is that after providing scientists with something that meets their immediate needs, it inspires a ton of new ideas and directions – and I become a bottleneck. Ideally I need to provide something that allows them to build things themselves once they have an understanding of all the possibilities. Adapting to needs ‘in the field’ is also an important aspect of the kind of work they do – which can be in remote locations anywhere in the world.

Some time ago I had a go at porting my musical livecoding language scheme bricks to android for the open sauces project. I’m now applying it as a way of configuring sensor data acquisition and recording by drag/dropping a visual programming language. It’s early days yet – I’m still debugging the (actually rather amazing) android drag/drop API – here are some initial screenshots.

Nested drag/droppable code which gets converted to Scheme for the tinyscheme interpreter
A block menu works much like Scratch, allowing you to pick new code blocks (this code is nonsense – just testing!)
The “Hello World” program, displays every 3 seconds (even when the app is running in the background)

Symbai field site testing

Some photos from Shakti Lamba, who is currently testing Symbai in Chhattisgarh state in central India.

ByC5r0BIMAAgr6N.jpg:large

The whole system is solar powered, and provides its own networking via the Raspberry Pi synchronisation node shown here. The android tablets also recharge from the same power source. The Raspberry Pi networking is a direct descendant of the experiments we carried out in London during the Sonic Bike workshop.

ByDBImzIUAAw_Tw.jpg:large

Here are three of the tablets syncing their data – photographs for people to identify each other (names are used differently than in western culture), audio recordings of verbal agreements (a requirement in preliterate societies), and information on who knows whom.

ByC7REBIMAAToql.jpg:large

More on this free software project, with links to the source etc., over at Foam Kernow.

Screenshot_2014-09-15-10-47-48