Category Archives: Games

Thoughts on teaching programming with Minecraft and Python

Saturday saw the first dBsCode taster workshop, for budding programmers between 11 and 16. We set up 20 Raspberry Pis, networked them together, and used our new procedural Minecraft 3D shape primitives to build a number of projects in Python involving castles, spiders and an infinite house generator.

[image: dbsmcpi-castle]

[image: dbscode]
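
For a flavour of what a shape primitive looks like in code, here is a minimal sketch using the standard Minecraft Pi Python API (mcpi) – the cuboid() helper is my own illustration, not the exact dBsCode primitive:

from mcpi.minecraft import Minecraft
from mcpi import block

mc = Minecraft.create()  # connect to the local Minecraft Pi server

def cuboid(x, y, z, w, h, d, block_id):
    """Fill a solid box of blocks with one corner at (x, y, z)."""
    mc.setBlocks(x, y, z, x + w - 1, y + h - 1, z + d - 1, block_id)

# stamp a small stone keep next to the player
pos = mc.player.getTilePos()
cuboid(pos.x + 2, pos.y, pos.z + 2, 8, 6, 8, block.STONE.id)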

Networking was very important – it enabled the students to jump into each other’s games, which started as a major distraction but ended up being useful, as people could see what each other were doing and work together.

As part of this, some level of ‘griefing’ was present (where other players interfere in your world). Asking the students about this during the breaks, the consensus was that it’s simply part of the culture of Minecraft – I even suggested that victims pull their network cables out, but they saw that as completely unnecessary. Most of them took turns at both constructive, collaborative programming activity and the destructive, graffiti-like activity associated with griefing, and this was acceptable to them. Being able to use code to quickly rebuild structures after damage (or to remove intruders with a liberal procedural application of lava) was therefore quite an attractive feature of the code approach.

It was common to adapt the programming to fit their understanding of Minecraft, so a hybrid approach to building seemed natural – using Python to do the heavy lifting (building or excavating large areas) and the repetitive tasks, then doing the fiddly or more creative bits manually as normal. For example, one student used the sphere primitive to excavate huge caves underground, and then filled them with little hand-made shrines and sculptures.
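
A sphere primitive like that one is only a few lines of mcpi code; this sketch is my own reconstruction of the idea, not the student’s exact program:

from mcpi.minecraft import Minecraft
from mcpi import block

mc = Minecraft.create()

def sphere(cx, cy, cz, radius, block_id):
    """Set every block within radius of (cx, cy, cz) to block_id."""
    r2 = radius * radius
    for x in range(-radius, radius + 1):
        for y in range(-radius, radius + 1):
            for z in range(-radius, radius + 1):
                if x * x + y * y + z * z <= r2:
                    mc.setBlock(cx + x, cy + y, cz + z, block_id)

# carve out a cave below the player by filling a sphere with air
pos = mc.player.getTilePos()
sphere(pos.x, pos.y - 12, pos.z, 8, block.AIR.id)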

What was interesting was that, to some extent, the Minecraft ‘experts’ (many of whom already document their creations on YouTube channels) found this more challenging in many ways than those less attached to Minecraft culture, who were more accepting of new ways of doing things.

The manner of programming we used here – running Geany (a lightweight Python IDE) alongside Minecraft – was completely in line with livecoding practice, as programs interacted with the Minecraft world in realtime via network messages. The great thing about the rather slow network bottleneck is that you can walk around a structure while the program is creating it, which gives a much better understanding of how the process works than if it instantly popped into existence.

More procedural Minecraft architecture

More work on the Python teaching environment we’ll be using next week for the Minecraft workshop at dBsMusic. I’m working on various ideas for procedural architecture, using this as a way to demonstrate what programming is about – thinking procedurally/functionally. The code is online here – I’ll be adding some exercises and course materials over the next few days.

Edit: documentation on how to use it is now up, and install.sh should work on the Raspberry Pi. Exercises to follow.

[image: dbsmcpi-prims]

[image: dbsmcpi-houses2]
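
As a taste of the sort of exercise this leads to, here’s a minimal sketch of a house generator in the same mcpi style – the house() helper and block choices are my own illustration, not the dbsmcpi code itself:

import random
from mcpi.minecraft import Minecraft
from mcpi import block

mc = Minecraft.create()

def house(x, y, z, w, h, d):
    # build the shell, hollow out the inside, then cut a doorway
    mc.setBlocks(x, y, z, x + w, y + h, z + d, block.COBBLESTONE.id)
    mc.setBlocks(x + 1, y + 1, z + 1,
                 x + w - 1, y + h - 1, z + d - 1, block.AIR.id)
    mc.setBlocks(x + w // 2, y + 1, z, x + w // 2, y + 2, z, block.AIR.id)

# a row of randomly sized houses ("infinite" if you let it run)
pos = mc.player.getTilePos()
x = pos.x
for i in range(10):
    w, h, d = [random.randint(4, 8) for _ in range(3)]
    house(x, pos.y, pos.z + 4, w, h, d)
    x += w + 2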

Egglab – pattern generation obsession

I’m putting the final pieces together for the release of the all-new Project Nightjar game (due in the run-up to Easter, of course!) and the automatic pattern generation has been a focus right up to this stage. The challenge I like most about citizen science is that along with all the ‘normal’ game design creative restrictions (is it fun? will it work in the browser?) you also have to satisfy the fairly whopping constraints of the science itself: determining which decisions impact on the observations you are making, and being sure that they will be robust to peer review in the context of publication. I never had to worry about that with PlayStation games :)

[image: variation]

[image: pattern2gen]

With this game, similar to the last two, we want to analyse people’s ability to recognise types of pattern against a background image. Crucially, this is a completely different perception process from recognition of a learned pattern (a ‘search image’), so we don’t want to generate the exact same egg each time from the same description – we don’t want people to ‘learn’ them. This also makes sense in the natural context, of course, in that an individual bird’s eggs will not be identical, due to the many additional non-deterministic processes involved in creating the pattern.

The base images we are using are wrapped Perlin noise at different scales, with different thresholds applied. These are then rotated and combined with each other and with plain colours using the browser’s built-in composite operations. Ideally we would generate the noise each time we need it with a different random seed to make every egg unique, but this is way too slow for HTML5 Canvas (pixel processing in JavaScript is still painful at this scale). To get around this we pre-render a set of variations of the noise images: the genetic program picks one of four scales and one of two thresholds (or no threshold), and we randomly pick a new variation each time we render the egg. The image at the top shows the variation that happens across 6 example programs. Below are some of the noise images we’re using:

[image: noise-patterns]
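
In pseudo-Python, the scheme looks something like the sketch below – the names and the load_noise_image() helper are illustrative assumptions, but the split between what the genotype fixes (scale, threshold) and what is re-rolled per render (the variation) is the important part:

import random

SCALES = [1, 2, 4, 8]          # four noise scales
THRESHOLDS = [None, 0.4, 0.6]  # no threshold, plus two cut-offs
VARIATIONS = 8                 # pre-rendered images per scale/threshold pair

def load_noise_image(scale, threshold, variation):
    # stand-in for fetching one pre-rendered wrapped-noise image
    return "noise-s%s-t%s-v%d.png" % (scale, threshold, variation)

def render_noise_gene(gene):
    # the genotype fixes scale and threshold; the variation is picked
    # fresh every render, so no two eggs from the same genome are
    # pixel-identical
    scale, threshold = gene
    return load_noise_image(scale, threshold, random.randrange(VARIATIONS))

gene = (random.choice(SCALES), random.choice(THRESHOLDS))
print(render_noise_gene(gene))  # same gene, different image each call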

Egg camouflage evolution tests in different nest sites

I’ve spent some time testing Project Nightjar EggLab: clicking on algorithmically generated eggs on backgrounds taken from nightjar nest sites, and recording the time it takes to find each egg. It’s designed for lots of people to play in parallel, but I wanted to test it before coming up with more gameplay mechanic ideas.

The timing is used to rank the eggs: I keep the top 1024 individuals that took longest to find, and generate new ones from them. The idea is that successful traits will spread through the population and the average score will increase – from this small test that seems to be the case, with a slow but consistent rise over the latest 500 eggs:

[image: fitness graph]
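
The selection step itself is very simple – here’s a minimal sketch of the idea, where breed() stands in for the real crossover and mutation of pattern programs:

import random

POPULATION_SIZE = 1024

def breed(parent_a, parent_b):
    # placeholder for crossover + mutation of two pattern programs
    return random.choice([parent_a, parent_b])

def select_and_breed(population):
    # population is a list of (genotype, seconds_to_find) pairs;
    # the eggs that took longest to find count as the fittest
    ranked = sorted(population, key=lambda egg: egg[1], reverse=True)
    survivors = [genotype for genotype, t in ranked[:POPULATION_SIZE]]
    return breed(random.choice(survivors), random.choice(survivors))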

Most of the eggs are still really easy to see, but some of them take a few seconds, and every now and again there is a good one that takes longer. These are some nest sites of the fiery-necked nightjar, which seems to consistently favour leafy ground – the last one took me a while to spot:

[images: three fiery-necked nightjar nest sites]

These are the top 50 eggs for the fiery-necked population. It’s quite noisy, with false positives due to the fact that if you get distracted while playing, the egg will score highly (this is one of the things to fix):

[image: top 50 eggs – fiery-necked nightjar]

For comparison, here is the top 50 for the Mozambique nightjar:

[image: top 50 eggs – Mozambique nightjar]

These birds nest on a bigger variety of sites, including bare earth – here’s a good one:

[image: Mozambique nightjar nest site]

The other Project Nightjar citizen science games, where you can search for real nightjars and nest sites, can be found here.

Procedural landscape demo on OUYA/Android

A glitchy, procedural, infinite-ish landscape demo running on Android and OUYA. Use the left joystick to move around on OUYA, or swipe on Android devices with touchscreens. Here’s the apk, and the source is here.

[image: Screenshot_2014-01-06-07-18-45]

It’s great to be able to have a single binary that works across all these devices – from OUYA’s TV screen sizes to phones, and using the standard gesture interface at the same time as the OUYA controller.

The graphics are programmed in Jellyfish Lisp, using Perlin noise to create the landscape. The language is probably still a bit too close to the underlying bytecode in places, but the function calling is working and it’s getting easier to write and experiment with the code.

(define terrain
  '(let ((vertex positions-start)
         (flingdamp (vector 0 0 0))
         (world (vector 0 0 0)))

     ;; recycle a triangle which is off the screen
     (define recycle 
       (lambda (dir)         
         ;; shift along x and y coordinates:
         ;; set z to zero for each vertex
         (write! vertex       
                 (+ (*v (read vertex) 
                        (vector 1 1 0)) dir))
         (write! (+ vertex 1) 
                 (+ (*v (read (+ vertex 1)) 
                        (vector 1 1 0)) dir))
         (write! (+ vertex 2) 
                 (+ (*v (read (+ vertex 2)) 
                        (vector 1 1 0)) dir))

         ;; get the perlin noise values for each vertex
         (let ((a (noise (* (- (read vertex) world) 0.2)))
               (b (noise (* (- (read (+ vertex 1)) 
                               world) 0.2)))
               (c (noise (* (- (read (+ vertex 2))
                               world) 0.2))))

           ;; set the z coordinate for height
           (write! vertex 
                   (+ (read vertex) 
                      (+ (*v a (vector 0 0 8)) 
                         (vector 0 0 -4))))
           (write! (+ vertex 1) 
                   (+ (read (+ vertex 1)) 
                      (+ (*v b (vector 0 0 8)) 
                         (vector 0 0 -4))))
           (write! (+ vertex 2) 
                   (+ (read (+ vertex 2)) 
                      (+ (*v c (vector 0 0 8)) 
                         (vector 0 0 -4))))

           ;; recalculate normals
           (define n (normalise 
                      (cross (- (read vertex)
                                (read (+ vertex 2)))
                             (- (read vertex)
                                (read (+ vertex 1))))))

           ;; write to normal data
           (write! (+ vertex 512) n)
           (write! (+ vertex 513) n)
           (write! (+ vertex 514) n)

           ;; write the z height as texture coordinates
           (write! (+ vertex 1536) 
                   (*v (swizzle zzz a) (vector 0 5 0)))          
           (write! (+ vertex 1537) 
                   (*v (swizzle zzz b) (vector 0 5 0)))          
           (write! (+ vertex 1538) 
                   (*v (swizzle zzz c) (vector 0 5 0))))))

     ;; forever
     (loop 1
       ;; add inertia to the fling/gamepad joystick input
       (set! flingdamp (+ (* flingdamp 0.99)
                          (*v
                           (read reg-fling)
                           (vector 0.01 -0.01 0))))

       (define vel (* flingdamp 0.002))
       ;; update the world coordinates
       (set! world (+ world vel))

       ;; for each vertex
       (loop (< vertex positions-end)         

         ;; update the vertex position
         (write! vertex (+ (read vertex) vel))
         (write! (+ vertex 1) (+ (read (+ vertex 1)) vel))
         (write! (+ vertex 2) (+ (read (+ vertex 2)) vel))

         ;; check for out of area polygons to recycle 
         (cond
          ((> (read vertex) 5.0)
           (recycle (vector -10 0 0)))         
          ((< (read vertex) -5.0)
           (recycle (vector 10 0 0))))
         
         (cond
          ((> (swizzle yzz (read vertex)) 4.0)
           (recycle (vector 0 -8 0)))
          ((< (swizzle yzz (read vertex)) -4.0)
           (recycle (vector 0 8 0))))

         (set! vertex (+ vertex 3)))
       (set! vertex positions-start))))

This Lisp program compiles to 362 vectors of bytecode at startup, and runs well even on my cheap Android tablet. The speed seems close enough to native C++ to be worth the effort, and it’s much more flexible (i.e. future livecoding/JIT compilation possibilities). The memory layout is shown below: it packs executable instructions and model data into the same address space, and uses no memory allocation while running (no garbage collection, and not even any C mallocs). The memory size is configurable, but the nature of the system is such that it would be possible to put executable data into unused graphics sections (e.g. normals or vertex colours) if appropriate.

[image: jvm – memory layout diagram]
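
Roughly, the layout the terrain program above assumes looks like this – the region offsets (+512 for normals, +1536 for texture coordinates) come straight from the code, but the total size and where the bytecode sits are my guesses:

import numpy as np

# one flat block of 3-vectors holds everything: bytecode, vertex
# positions, normals and texture coordinates, with no allocation
# happening at runtime
MEMORY_VECTORS = 2048  # configurable; a guess at the demo's size
memory = np.zeros((MEMORY_VECTORS, 3), dtype=np.float32)

NORMALS_OFFSET = 512     # normals live at vertex + 512
TEXCOORDS_OFFSET = 1536  # texture coordinates at vertex + 1536

def write_vertex(vertex, pos, normal, texcoord):
    # mimic the write! calls above: one vertex touches three regions
    memory[vertex] = pos
    memory[vertex + NORMALS_OFFSET] = normal
    memory[vertex + TEXCOORDS_OFFSET] = texcoord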

OUYA development experiments

The OUYA is a tiny games console designed to promote indie games rather than traditional high-budget productions. It’s cheap compared to standard games hardware, and all the games are free to play, at least in demo form. It’s very easy to start making games for, as it’s based on Android – you can just plug in a USB cable and treat it the same as any other Android device. You also don’t need to sign anything to start building stuff – it’s just a case of adding one line to your AndroidManifest.xml to tell the OUYA that the program is a game:

 <category android:name="tv.ouya.intent.category.GAME"/>

and adding a 732×412 icon in “res/drawable-xhdpi/ouya_icon.png” so it shows up on the menu.
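
In context, that category line sits in the main activity’s intent-filter next to the usual launcher entries – a minimal sketch, with the activity name being a made-up example:

 <activity android:name=".GameActivity">
   <intent-filter>
     <action android:name="android.intent.action.MAIN"/>
     <category android:name="android.intent.category.LAUNCHER"/>
     <!-- the one OUYA-specific line -->
     <category android:name="tv.ouya.intent.category.GAME"/>
   </intent-filter>
 </activity>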

[image: 130327_ouya_0021]

There is a lot to like about the OUYA’s philosophy, so I tried out some graphics experiments with it to get better acquainted:

This program was made using Jellyfish, part of my increasingly odd rendering stack, which started out as a port of Fluxus to the PS2. It’s a type of microcode written inside TinyScheme and running in a separate virtual machine, which makes it possible to process a lot of geometry without garbage collection overheads. At some point I might write a compiler for this, but writing the code longhand for the moment means I can tweak the instruction set and get a better understanding of how to use it. Here is the microcode for the ribbons above; they run at 20,000 cycles per frame each (so about 1.2MHz):

;; register section
   8 20000 0 ;; control (pc, cycles, stack)
   mdl-size prim-tristrip 1 ;; graphics (size, primtype, renderhints)
   0 0 0 ;; pos
   0 0 0 ;; sensor addr

   ;; program data section
   mdl-start 0 0     ;; 4 address of current vertex
   mdl-start 0 0     ;; 5 address of accum vertex (loop)
   0 0 0             ;; 6 influence
   0 0 0             ;; temp

   ;; code section 
   ;; add up differences with every other vertex
   ldi  4  0         ;; load current vertex
   ldi  5  0         ;; load accum vertex
   sub  0  0         ;; get the difference
   lda  6  0         ;; load the accumulation
   add  0  0         ;; accumulate
   nrm  0  0         ;; normalise
   sta  6  0         ;; store accumulation
   add.x 5 1         ;; inc accum address

   ;; accumulation iteration 
   lda  5  0         ;; load accum address
   ldl  mdl-end 0    ;; load end address
   jlt  2  0         ;; exit if greater than model end (relative address)
   jmp  8  0         ;; return to accum-loop

   ;; end accum loop
   ;; push away from other verts
   lda  6  0         ;; load accum
   ldl -0.1 0        ;; reverse & make smaller
   mul 0 0

   ;; attract to next vertex
   ldi 4 0           ;; load current
   ldi 4 1           ;; load next
   sub 0 0           ;; get the difference
   ldl 0.4 0
   mul 0 0           ;; make smaller
   add 0 0           ;; add to accum result

   ;; do the animation
   ldi 4 0           ;; load current vertex
   add 0 0           ;; add the accumulation
   sti 4 0           ;; write to model data

   ;; reset the outer loop
   ldl 0 0           ;; load zero
   sta 6 0           ;; reset accum
   ldl mdl-start 0   
   sta 5 0           ;; reset accum address

   add.x 4 1         ;; inc vertex address
   lda 4  0          ;; load vertex address
   ldl mdl-end 0     ;; load end address
   jlt 2  0          ;; if greater than model (relative address)
   jmp 8  0          ;; do next vertex

   ;; end: reset current vertex back to start
   ldl mdl-start 0
   sta 4 0

   ;; waggle around the last point a bit
   lda mdl-end 0     ;; load vertex pos
   rnd 0 0           ;; load random vert
   ldl 0.5 0         
   mul 0 0           ;; make it smaller
   add 0 0           ;; add to vertex pos
   sta mdl-end 0     ;; write to model

   jmp 8 0

Visualising egg pattern genomes

A couple of screenshots from the upcoming Project Nightjar citizen science game – the genetic programming pattern generator is now working in a simple test framework, and even with myself as the only player at the moment, it’s gradually producing eggs that are harder and harder to find against one of the background images from the field site in South Africa. Today I’ve been working on a viewer for the pattern genome itself, which displays all the base images, the operations and the intermediate steps as it builds up the final image. Unlike any natural form of genetics, genetic programming is all about growing trees of functions connected together, and here we are interested in combining simple images using HTML5 canvas’s composite operations to make complex patterns.

[image: genome1]

[image: genome2]
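
To make the tree idea concrete, here’s a minimal Python sketch of such a genome – the base image names are made up, but the operation names are the browser composite operations used elsewhere in these posts:

import random

OPERATIONS = ["source-over", "source-atop", "destination-over",
              "destination-out", "lighter", "xor"]
BASE_IMAGES = ["noise-small", "noise-large", "colour-brown", "colour-grey"]

def random_tree(depth):
    # leaves are base images, internal nodes are composite operations
    if depth == 0:
        return random.choice(BASE_IMAGES)
    return [random.choice(OPERATIONS),
            random_tree(depth - 1),
            random_tree(depth - 1)]

genome = random_tree(3)
# e.g. ['xor', ['lighter', 'noise-small', 'colour-brown'], ...]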

This will be useful for debugging mutations, where the sub-trees get jumbled up, but as we’re building a citizen science game we’re also going to expose as much as possible about how it works – from game mechanics like this to the underlying camouflage theories we’ll be testing. If you recognise the graph drawing algorithm, I’ve been plundering a long-forgotten project: fastbreeder.

More procedurally rendered eggs in HTML5 canvas

The first Project Nightjar game was a big success, with 6,000 players in the first few days – so we’ll have lots of visual perception data to get through! Today I’ve been doing a bit more work on the egg generator for the next citizen science camouflage game:

[image: egglab3]

I’ve made 24 new, more naturalistic base images than the abstract ones I was testing with before, and implemented the start of the genetic programming system – each block of 4×4 eggs here is the set of children of a single randomly created parent, with each child created at a 1% mutation rate. The program trees themselves are 6 levels deep, so a maximum of 64 binary composite operations.
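
In miniature, the mutation step looks something like this – a sketch assuming the random_tree() helper from the genome sketch above:

import random

MUTATION_RATE = 0.01

def mutate(tree, depth):
    # at a 1% rate per node, jumble the genome by swapping the
    # whole sub-tree for a freshly grown one
    if random.random() < MUTATION_RATE:
        return random_tree(depth)
    if isinstance(tree, list):  # operation node: recurse on children
        op, left, right = tree
        return [op, mutate(left, depth - 1), mutate(right, depth - 1)]
    return tree                 # base image leaf: unchanged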

All the genetic programming effort will happen in HTML5, thus neatly scaling the processing with the number of players, which is going to be important if this game proves as popular as the last – all the server has to do then is keep a record of the genotypes (the program trees) and their corresponding fitness.

One catch with this approach is that the implementation of globalCompositeOperation in HTML5 – the core of the image synthesis technique I’m using – is far from perfect across browsers. Having the same genotype look different to different people would be a disaster, so I’m having to restrict the operations to the ones that are consistently supported: “source-over”, “source-atop”, “destination-over”, “destination-out”, “lighter” and “xor”.