Working on the upcoming Raspberry Pi programming workshop for dBsCode, I’m wrapping the Minecraft Python API with a functional-style layer to reduce the amount of syntax we’ll have to teach. The idea is to build complex 3D shapes out of simple primitives, via abstraction.
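As a rough illustration of the idea (not the actual workshop code – the function names here are my own), primitives can be pure functions returning sets of block coordinates, with bigger shapes built by composing them:

```python
# Sketch of the functional wrapper idea: primitives are pure functions
# returning sets of block coordinates, and complex shapes are built by
# composing them. These names are hypothetical, not the workshop API.

def cube(origin, size):
    """All block positions in a solid cube of the given size."""
    x0, y0, z0 = origin
    return {(x0 + x, y0 + y, z0 + z)
            for x in range(size)
            for y in range(size)
            for z in range(size)}

def translate(blocks, offset):
    """Move a whole shape by an (x, y, z) offset."""
    dx, dy, dz = offset
    return {(x + dx, y + dy, z + dz) for x, y, z in blocks}

def union(*shapes):
    """Combine any number of shapes into one."""
    result = set()
    for shape in shapes:
        result |= shape
    return result

# A tower: three cubes stacked via composition
tower = union(cube((0, 0, 0), 4),
              translate(cube((0, 0, 0), 3), (0, 4, 0)),
              translate(cube((0, 0, 0), 2), (0, 7, 0)))

# To place the result in the game with the mcpi library:
# from mcpi.minecraft import Minecraft
# mc = Minecraft.create()
# for x, y, z in tower:
#     mc.setBlock(x, y, z, 1)  # 1 = stone
```

The nice thing for teaching is that everything before the final `setBlock` loop is plain set manipulation, so shapes can be reasoned about without touching the game at all.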
The IDE we’re using is Geany, which so far seems to run well alongside Minecraft on the Raspberry Pi. It’s great how Minecraft stays on top of the display at all times – more an unintentional feature of the GPU driver than a deliberate one, but very useful for teaching.
West Cornwall’s CodeClub team (Glen Pike and I) bring you the next stage at dBs Music:
A bit of hardware hacking for Troon Primary CodeClub, who have tons of old-style Lego Mindstorms they don’t use any more, and who, after a year of Scratch programming on their PCs, are just getting started with Raspberry Pi. We’re using this Scratch modification together with the hardware I’m making, which is based on this circuit. The main thing here is an L293D motor controller IC, which can drive two DC motors in both directions. You can write the hardware code in Scratch like this to control the Lego motors:
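For readers without the Scratch modification to hand, the direction logic the L293D expects can be sketched in Python. Each motor channel uses two input pins: one high and one low sets the direction, both low stops the motor. The pin numbers below are hypothetical, not the ones in my circuit:

```python
# Sketch of the L293D direction logic for one motor channel.
# Driving IN1 high and IN2 low spins the motor one way, the reverse
# pattern spins it the other way, and both low stops it.

FORWARD, BACKWARD, STOP = "forward", "backward", "stop"

def motor_pins(command):
    """Return (in1, in2) logic levels for one L293D motor channel."""
    return {FORWARD:  (True, False),
            BACKWARD: (False, True),
            STOP:     (False, False)}[command]

# On the Raspberry Pi this would drive GPIO pins, e.g. with RPi.GPIO
# (pin choices here are made up):
# import RPi.GPIO as GPIO
# GPIO.setmode(GPIO.BCM)
# IN1, IN2 = 17, 27
# GPIO.setup(IN1, GPIO.OUT)
# GPIO.setup(IN2, GPIO.OUT)
# in1, in2 = motor_pins(FORWARD)
# GPIO.output(IN1, in1)
# GPIO.output(IN2, in2)
```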
The trickiest part of this whole endeavour has been physically connecting to Mindstorms. At the moment I’m having to use crocodile clips, which won’t last long in normal classroom conditions – but I’m wary of destroying or modifying the connectors, as they’re not made any more…
My second day of teaching was followed by a presentation by Ellen Harlizius-Klück and Alex McLean on weaving, ancient mathematics, programming, mythology and music – which provided a great introduction for a meeting we had the next day on an upcoming project bringing these concepts together.
Last week Alex and I took to the road on another slub mini-tour starting in Denmark at the Kunsthal Aarhus where we ran a livecoding workshop and performed at the opening of the Aarhus Filmfestival.
The Kunsthal gallery was exhibiting “Systemics #2: As we may think (or, the next world library)” with work by Florian Hecker, Linda Hilfling, Jakob Jakobsen, Suzanne Treister, Ubermorgen, YoHa + Matthew Fuller.
Linda Hilfling and UBERMORGEN’s work comprised an Amazon print-on-demand hack, perhaps an even more elaborate version of their previous Google Will Eat Itself. The gallery floor was printed with a schematic describing the process from raw material input to the finished printed books.
Suzanne Treister’s work called HEXEN 2.0 included alternative/hidden histories of technology presented as densely descriptive tarot cards and prints showing many connections between individuals, events and inventions.
[Continued from part 1] On day one, after we introduced the project and the themes we wanted to explore, Ryan Jordan had a great idea of how to prototype the bike-bike communication using FM radio transmissions. He quickly freeform built a short range FM transmitter powered by a 9v battery.
The next thing we needed was something to transmit – and another experiment was seeing how accelerometers responded during bike riding on different terrains. I’d been playing with running the fluxa synth code in Android native audio for a while, so I plugged the accelerometer input into parameters of a simple ring modulation synth to see what would happen. We set off with the following formation:
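Ring modulation itself is just multiplying the input signal by a sine carrier, so the accelerometer-to-synth mapping can be sketched in a few lines. This is my own simplified illustration, not the fluxa code – the frequency ranges and scaling are guesses:

```python
import math

# Sketch of mapping accelerometer magnitude to the carrier frequency
# of a ring modulator (not the actual fluxa synth code).

SAMPLE_RATE = 44100

def accel_magnitude(x, y, z):
    """Overall acceleration magnitude from the three axes."""
    return math.sqrt(x * x + y * y + z * z)

def ring_mod(samples, carrier_hz, rate=SAMPLE_RATE):
    """Multiply the input by a sine carrier – classic ring modulation."""
    return [s * math.sin(2 * math.pi * carrier_hz * i / rate)
            for i, s in enumerate(samples)]

def carrier_from_accel(x, y, z, lo=50.0, hi=800.0):
    """Map movement to carrier frequency (scaling here is a guess)."""
    m = min(accel_magnitude(x, y, z) / 20.0, 1.0)  # ~20 m/s^2 full scale
    return lo + m * (hi - lo)

# A 220Hz test tone, ring modulated by the current accelerometer reading
tone = [math.sin(2 * math.pi * 220 * i / SAMPLE_RATE) for i in range(512)]
out = ring_mod(tone, carrier_from_accel(3.0, 1.0, 9.8))
```

Rougher terrain gives bigger accelerometer swings, which pushes the carrier frequency around and produces the characteristic clangy, inharmonic ring-mod sound.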
The result was that the vibrations and movements of a rider were being transmitted to the other bikes for playback, including lots of great distortion and radio interference. As the range was fairly short, it was possible to control how much of the signal you received – as you cycled away from the “source cyclist”, static (and some BBC radio 2) started to take over.
We needed to tune the sensitivity of the accelerometer inputs – this first attempt was a little too glitchy and overactive, and the only changes really discernible were the differences between the bike moving and standing still (it sounded like a sci-fi laser battle in space). One of the great things about prototyping with Android was that we could share the package around and run it on loads of phones. So we went out again with three bikes, each playing back its own movements with different synth settings.
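One simple way to tame that kind of glitchy sensor input is a one-pole low-pass filter (an exponential moving average) between the accelerometer and the synth parameters. This is a generic sketch of the technique rather than what we actually shipped, and the smoothing factor is something you’d tune by ear:

```python
# Sketch of smoothing a jittery accelerometer stream with an
# exponential moving average (one-pole low-pass filter).

def smooth(samples, alpha=0.1):
    """alpha near 0 = heavy smoothing, alpha near 1 = raw signal."""
    out = []
    prev = samples[0]
    for s in samples:
        prev = prev + alpha * (s - prev)
        out.append(prev)
    return out

raw = [0.0, 10.0, 0.0, 10.0, 0.0, 10.0]  # jittery input
smoothed = smooth(raw)                   # slowly rising, much calmer
```

The trade-off is responsiveness: heavier smoothing hides the stillness-versus-motion jump but also blurs the terrain differences we were after, so the value needs tuning on the bike.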
Time to report on the sonic bike hacklab Kaffe Matthews and I put on in AudRey HQ in Hackney. We had a sunny and stormy week of investigation into sonic bike technology. After producing three installations with sonic bikes, the purpose of the lab was to open the project up to more people with fresh ideas, as well as a chance to engage with the bikes in a more playful research oriented manner without the pressure of an upcoming production.
For each of the three previous installations – in Ghent, on Hailuoto island in Finland, and in Porto – we’ve used the same technology: a Beagleboard using a GPS module to trigger samples to play back over speakers mounted on the handlebars. The musical score is a map, created using Ushahidi, consisting of zones tagged with sample names and playback parameters that the bikes carry around with them.
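The zone-triggering idea can be sketched like this. I’ve simplified the zones to circles with made-up coordinates (the real score uses regions drawn on an Ushahidi map), but the principle is the same: the bike plays whatever samples are tagged on the zone its GPS fix falls inside:

```python
import math

# Sketch of GPS zone triggering with simplified circular zones.
# Coordinates, radii and sample names below are invented examples.

EARTH_RADIUS_M = 6371000.0

def distance_m(lat1, lon1, lat2, lon2):
    """Equirectangular approximation – fine at city scale."""
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return EARTH_RADIUS_M * math.hypot(x, y)

zones = [  # (lat, lon, radius in metres, sample name)
    (51.545, -0.055, 60, "bells.wav"),
    (51.547, -0.052, 40, "voices.wav"),
]

def samples_at(lat, lon):
    """All samples whose zone contains the current GPS position."""
    return [name for zlat, zlon, radius, name in zones
            if distance_m(lat, lon, zlat, zlon) <= radius]
```

On the bike this check runs every time a new GPS fix arrives, starting playback when the bike enters a zone.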
We decided to concentrate on two areas of investigation: using the bike as a musical instrument, and finding ways to get the bikes to talk to each other (rather than being identical, independent clones). We had a bunch of different components to play with, donated by the participants, Kaffe and me – while the bikes already provided power via 12v batteries, amplification and speakers. We focused on tech we could rapidly prototype with minimal fuss.
The next few posts will describe the different experiments we carried out using these components.
Deershed is a music festival designed to accommodate families, with lots of activities for children. Part of this year’s festival was a Machines Tent, including Lego robot building, Meccano constructions, 3D printing and computer games.
Slub’s daily routine in the Machines Tent started with setting up the Al Jazari gamepad livecoding installation, followed by a couple of hours with Martyn Eggleton teaching Scratch programming on an amazing quad Raspberry Pi machine (screens/processors and keyboards all built into a welded cube).
At some point we would switch to Minecraft, trying some experiments livecoding the LAN game world using Martyn’s system to access the Minecraft API with Waterbear, a visual programming language that uses a similar blocks approach to Scratch and Scheme Bricks.
During the afternoons Alex and I could try some music livecoding experiments. This was a great environment for playful, audience-participatory performances, with families continually passing through the tent. I could use a dancemat to trigger synths in fluxus while Alex livecoded music designed to encourage people to jump up and down.
One of the most interesting things for me was seeing how lots of children (who mostly didn’t know each other) collaborate and self-organise in a LAN game; there was quite a pattern to it across all the groups:
- Mess around with Minecraft as usual (make some blocks, start building a house).
- Find something built by someone else, destroy a few bricks.
- Snap out of the game to notice that the other kids are complaining.
- Realise that there are other people in the world – and they are sat around them!
- Attempt to fix the damage.
At this point other people would join in to help fix things, after which there would be some kind of understanding reached between them to respect each other’s creations. This has all really inspired me to work on Al Jazari 2 which combines a lot of these ideas.
Last week I was kindly invited by Julian Rohrhuber to do a couple of talks and teach a livecoding workshop alongside Jan-Kees van Kampen at the Düsseldorf Institute for Music and Media. Jan-Kees was demoing /mode +v noise a Supercollider chat bot installation using IRC, so it was the perfect opportunity to play test the work-in-progress slubworld project, including the plutonian botzlang language. It also proved a good chance to try using a Raspberry Pi as a LAN game server.
There wasn’t enough time to get deeply into botzlang, but we were able to test the text-to-sound code that Alex has been working on through a good sound system, along with the projection of the game world that visualises what is happening, based on the Naked on Pluto library installation:
The Raspberry Pi was useful as a dedicated server I could set up beforehand and easily plug into the institute’s wireless router. We didn’t need to worry about internet connectivity, and everyone could take part by pointing a browser at the right IP address. With access to the “superuser” commands from the Naked on Pluto game, the participants had quite a bit of fun making objects and dressing each other up in different items, later making and programming their own bots to say things that were sonified through the speakers.
Visual programming languages can also be drawn. From the slub workshop at VIVO Mexico City, photo credit to AFADireccionCreativa.