Category Archives: Games

More PPU coding on the NES/Famicom

After getting sprites working in Lisp on the NES for our “What Remains” project, the next thing to figure out properly is the background tiles. With the sprites you simply have a block of memory you edit at any time, then copy the whole lot to the PPU each frame in one go – the tiles involve a bit more head scratching.

The PPU graphics chip on the NES was designed at a time when all TVs were cathode ray tubes, using an electron gun to build a picture up on a phosphor screen. As this scans back and forth across the screen the PPU is busy altering its signal to draw pixel colours. If you try to alter its memory while it's doing this you get glitches. However, it's not drawing all the time – the electron gun needs to reset to the top of the screen each frame, so you get a window of time (2273 cycles) to make changes to the PPU memory before it starts drawing the next frame.

(Trying out thematic images and some overlapping text via the display list)

The problem is that 2273 cycles is not very much – not nearly enough to run your game in, and only enough to update approximately 192 background tiles per frame, as DMA is a slow operation. It took me a while to figure this out, as I was trying to transfer an entire screenful in one go – which sort of works, but leaves the PPU in an odd state.

The solution is a familiar one from modern graphics hardware – a display list. This is a buffer you can add instructions to at any time in your game, which are then acted on only in the PPU access window. It separates the game code from the graphics DMA, and is very flexible. We might want to do different things here, so we can have a set of ‘primitives’ that run different operations. The buffer also lets us limit the bandwidth to fit the per-frame restriction, so the game can add a whole bunch of primitives in one go which are then gradually dispatched – you can see this in a lot of NES games, as it takes a few frames to do things like clear the screen.

There are two kinds of primitives in the What Remains prototype game engine so far. The first sets the tile data directly:


(display-list-add-byte 1)
(display-list-add-byte 2)
(display-list-add-byte 3)
(display-list-end-packet prim-tile-data 0 0 3)

This overwrites the first 3 tiles at the top left of the screen with patterns 1, 2 and 3. First you add bytes to a ‘packet’, which can have different meanings depending on the primitive used, then you end the packet with the primitive type constant, the high and low bytes of a 16 bit address offset for the PPU destination, and a size. The reason this is done in reverse is that the display list is a stack, read from the ‘top’, which is a lot faster – we can use a position index that is incremented when writing and decremented when reading.
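As a rough model of what’s going on – written here in plain Scheme rather than co2 to keep it short, with the exact header byte order being my assumption (the real engine uses a byte buffer and an index in 6502 memory) – the writer side looks something like this:

(define dlist (make-vector 256 0)) ;; the display list buffer
(define dlist-pos 0)               ;; incremented when writing, decremented when reading

(define (display-list-add-byte b)
  (vector-set! dlist dlist-pos b)
  (set! dlist-pos (+ dlist-pos 1)))

;; ending a packet pushes the header on top of the data bytes, so the
;; reader running in the PPU access window sees the primitive type,
;; destination and size before the data
(define (display-list-end-packet prim addr-hi addr-lo size)
  (display-list-add-byte size)
  (display-list-add-byte addr-lo)
  (display-list-add-byte addr-hi)
  (display-list-add-byte prim))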

We could clear a portion of the screen this way with a loop (a built in language feature in co2 Lisp) to add a load of zeros to the stack:


(loop n 0 255 (display-list-add-byte 0))
(display-list-end-packet prim-tile-data 0 0 256)

But this is very wasteful, as it fills up a lot of space in the display list (all of it as it happens). To get around this, I added another primitive called ‘value’ which does a kind of run length encoding (RLE):


(display-list-add-byte 128) ;; length
(display-list-add-byte 0) ;; value
(display-list-end-packet prim-tile-value 0 0 2)

With just 2 bytes we can clear 128 tiles – about the maximum we can do in one frame.
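Carrying on the plain Scheme sketch from above (ppu-write-byte! is a hypothetical stand-in for the register pokes, and this assumes the packet header has already been popped off), this is roughly why the value primitive is so cheap:

(define (display-list-read-byte)
  (set! dlist-pos (- dlist-pos 1))
  (vector-ref dlist dlist-pos))

;; a tile-value packet holds just a length and a value, but expands into
;; 'length' PPU writes when it is dispatched
(define (read-tile-value-packet)
  (let* ((value (display-list-read-byte))  ;; pushed last, so popped first
         (len (display-list-read-byte)))
    (let loop ((n 0))
      (when (< n len)
        (ppu-write-byte! value)
        (loop (+ n 1))))))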

Cricket Tales released

Cricket Tales is an ambitious citizen science project: 438 days of CCTV footage from the Wild Crickets Research group – the only record of the wild behaviour of insects of its kind. It turns out that insects have more complex lives and individuality than we thought, and the game is a way of helping uncover this more precisely. For Foam Kernow this was also a significant project, as the biggest production that all three of us have worked on together.


My favourite aspect of this project is that the movies are a strangely different way of viewing an ecosystem – tiny close up areas of a perfectly normal field in northern Spain. The footage runs 24 hours a day, with infrared at night, recording a couple of frames a second only when movement is detected. Some of the videos get triggered simply by the movement of shadows, but there are plenty of moments that we wouldn’t normally notice: worms and bugs of all kinds going about their lives, sudden appearances of larger animals or swarms of ants, condensation of dew at dawn. The crickets themselves mostly have tags stuck to them so we can tell which is which, but other than that this is their normal habitat and way of life. Compared to insects studied in lab conditions, it’s not surprising they act in a more complex way.

Screenshots from the Spanish version, as I’m particularly proud of that (my first experience using GNU gettext with Django).

We combined the task of watching the 1 minute long movies with the ability to build houses for the crickets – we needed to provide a way for people to leave something behind, something that marks progress on this gigantic collective task. You get to design a little house for each burrow, and your name gets recorded on the meadow until the next person takes over by watching more videos.


We’ve had plenty of conversations about what kind of people take part in this sort of citizen science activity, and what their motivations may be. We ask a couple of questions when people sign up, and this is something we are interested in doing more research on in general for our projects. In this case, we are interested in depth of involvement more than attracting thousands of brief encounters – it only takes a few motivated people to make the researchers’ jobs much easier and provide some data they need.

For me a bigger objective of Cricket Tales is to present more diverse and personal views of the world that surrounds us and tends to go unnoticed. Being asked to contemplate a tiny organism’s view of the world for a minute can be quite an eye opener.

A 6502 lisp compiler, sprite animation and the NES/Famicom

For our new project “What Remains”, we’re regrouping the Naked on Pluto team to build a game about climate change. In the spirit of the medium being the message, we’re interested in long term thinking as well as recycling e-waste – so in keeping with a lot of our work, we are unravelling the threads of technology. The game will run on the NES/Famicom console, which was originally released by Nintendo in 1986. This hardware is extremely resilient; the solid state game cartridges still work surprisingly well today, compared to fragile CDROMs or the world of online updates. Partly because of this, a flourishing scene of new players is now discovering these machines. I’m also interested that the older the machine you write software for, the more people have access to it via emulators (there are NES emulators for every mobile device, browser and operating system).

Our NES with everdrive flashcart and comparatively tiny sdcard for storing ROMs.

These ideas combine a couple of previous projects for me – Betablocker DS also uses Nintendo hardware, and although much more recent, the Gameboy DS has a similar philosophy and architecture to the NES. As with most machines of this era, NES games were mostly written in pure assembly – I had a go at this for the Speccy a while back, and while fun in a mildly perverse way, it requires so much forward planning that it doesn’t really encourage creative tweaking – or working collaboratively. In the meantime, for the weavingcodes project I’ve been dabbling with making odd lisp compilers, and found it very productive – so it makes sense to try one for a real processor this time, the 6502.

The NES console was one of the first to bring specialised processors from arcade machines into people’s homes. On older/cheaper 8 bit machines like the Speccy, you had to do everything on the single CPU, which meant most of the time was spent drawing pixels or dealing with sound. On the NES there is a “Picture Processing Unit” or PPU (a forerunner to the modern GPU), and an “Audio Processing Unit” or APU. As in modern consoles and PCs, these free the CPU up to orchestrate a game as a whole, only needing to sporadically update these co-processors when required.

You can’t write code that runs on the PPU or APU, but you can access their memory indirectly via registers and DMA. One of the nice things about writing our own language and compiler is that we can build optimised calls that do specific jobs. One area I’ve been thinking about a lot is sprites – the 64 8×8 pixel tiles that the PPU draws over the background tiles to provide you with animated characters.

Our sprite testing playpen using graphics plundered from Ys II: Ancient Ys Vanished.

The sprites are controlled by 256 bytes of memory that you copy (DMA) from the CPU to the PPU each frame. There are 4 bytes per sprite – 2 for x/y position, 1 for the pattern id and another for color and flipping control attributes. Most games made use of multiple sprites stuck together to get you bigger characters, in the example above there are 4 sprites for each 16×16 pixel character – so it’s handy to be able to group them together.
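For reference, the per-frame copy itself is tiny. Here’s a minimal sketch in the same (emit ...) style as the compiler code below – emit-sprite-dma! is a made-up name rather than part of the real engine, $2003 is the PPU’s sprite memory address register and $4014 is the OAM DMA register:

(define (emit-sprite-dma!)
  (append
   (emit "lda" "#$00")
   (emit "sta" "$2003")     ;; reset the PPU's sprite memory address
   (emit "lda" "#$02")      ;; high byte of the $200 page where the sprite data lives
   (emit "sta" "$4014")))   ;; writing the page number here DMAs all 256 bytes to the PPU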

Here’s an example of the compiler code generation to produce the 6502 assembly needed to animate 4 sprites with one command, by setting all their pattern IDs in one go – this manipulates memory which is later sent to the PPU.

(define (emit-animate-sprites-2x2! x)
  (append
   (emit-expr (list-ref x 2)) ;; compiles the pattern offset expression (leaves value in register a)
   (emit "pha")               ;; push the resulting pattern offset onto the stack
   (emit-expr (list-ref x 1)) ;; compile the sprite id expression (leaves value in a again)
   (emit "asl")               ;; *=2 (shift left)      
   (emit "asl")               ;; *=4 (shift left) - sprites are 4 bytes long, so = address
   (emit "tay")               ;; store offset calculation in y
   (emit "iny")               ;; +1 to get us to the pattern id byte position of the first sprite
   (emit "pla")               ;; pop the pattern memory offset back from the stack
   (emit "sta" "$200,y")      ;; sprite data is stored in $200, so add y to it for the first sprite
   (emit "adc" "#$01")        ;; add 1 to a to point to the next pattern location
   (emit "sta" "$204,y")      ;; write this to the next sprite (+ 4 bytes)
   (emit "adc" "#$0f")        ;; add 16 to a to point to the next pattern location
   (emit "sta" "$208,y")      ;; write to sprite 2 (+ 8 bytes)
   (emit "adc" "#$01")        ;; add 1 to a to point to the final pattern location
   (emit "sta" "$20c,y")))    ;; write to sprite 4 (+ 12 bytes)

The job of this function is to return a list of assembler instructions which are later converted into machine code for the NES. It compiles sub-expressions recursively where needed and (most importantly) maintains register state, so the interleaved bits of code don't interfere with each other and crash. (I learned about this stuff from Abdulaziz Ghuloum's amazing paper on compilers). The stack is important here, as the pha and pla push and pop information so we can do something completely different and come back to where we left off and continue.

The actual command is of the form:

(animate-sprites-2x2 sprite-id pattern-offset)

Where either argument can be a sub-expression of its own, e.g.:

(animate-sprites-2x2 sprite-id (+ anim-frame base-pattern))

This code uses a couple of assumptions for optimisation: firstly, that sprite information is stored starting at address $200 (quite common on the NES as this is the start of user memory, and it maps to a specific DMA address for sending to the PPU); secondly, that the pattern information in memory is laid out in a particular way. The offset of 16 for the 3rd sprite’s pattern is simply to allow the data to be easy to see in memory when using a paint package, as it means the sprites sit next to each other (along with their frames for animation) when editing the graphics:

(screenshot of the sprite pattern layout in the paint package)
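Put another way (a throwaway sketch in plain Scheme, not part of the compiler), a base pattern offset p ends up spread over the four sprites like this – the pattern table is 16 tiles wide in the paint package view, so +16 is the tile directly below:

(define (character-patterns p)
  (list p          ;; first sprite  ($200)
        (+ p 1)    ;; second sprite ($204)
        (+ p 16)   ;; third sprite  ($208)
        (+ p 17))) ;; fourth sprite ($20c)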

You can find the code and documentation for this programming language on gitlab.

Artificially evolved camouflage

As the egglab camouflage experiment continues, here are some recent examples after 40 or so generations. If you want to take part in a newer experiment, we are currently seeing if a similar approach can evolve motion dazzle camouflage in Dazzle Bug.

Each population of eggs is being evolved against a lot of background images, so it’s interesting to see the different strategies in use – it seems like colour is one of the first things to match, often with some dazzle to break up the outline. Later as you can see in some of these examples, there is some quite accurate background matching happening.

It’s important to say that all of this is done entirely by the perception of the tens of thousands of people playing the game – there is no analysis of the images at any point.

(images: evolved egg camouflage examples)

Red King: Host/Parasite co-evolution citizen science

A new project begins, on the subject of ecology and evolution of infectious disease. This one is a little different from a lot of Foam Kernow’s citizen science projects in that the subject is theoretical research – and involves mathematical simulations of populations of co-evolving organisms, rather than the direct study of real ones in field sites etc.

The simulation, or model, we are working with is concerned with the co-evolution of parasites and their hosts. Just as in more commonly known simulations of predators and prey, there are complex relationships between hosts and parasites – for example if parasites become too successful and aggressive the hosts start to die out, in turn reducing the parasite populations. Hosts can evolve to resist infection, but this has an overhead that starts to become a disadvantage when most of a population is free of parasites again.

Example evolution processes with different host/parasite trade-offs.

Over time these relationships shift and change, and this happens in different patterns depending on the starting conditions. Little is known about the categorisation of these patterns, or even the range of relationships possible. The models used to simulate them are still a research topic in their own right, so in this project we are hoping to explore different ways people can both control a simulation (perhaps with an element of visual live programming) and experience the results in a number of ways – via sonification, or a game world. The eventual, ambitious aim is to provide a way for people to feed their discoveries back into the research.


Hungry birds citizen science at the Paris Natural History Museum

Some photos of Mónica Arias running her “Hungry Birds” butterfly catching experiment at the Muséum national d’Histoire naturelle in Paris.


The Museum’s internet connectivity was challenging, so we ran the game server on a Raspberry Pi with an ad hoc wifi network and handled the data collection ourselves. The project is concerned with analysing pattern recognition and behaviour in predators. We’re using ten different wing patterns (or morphs), assigning one at random to be the toxic one, and looking at how long it takes people to learn which are edible.


AI as puppetry, and rediscovering a long forgotten game.

AI in games is a hot topic at the moment, but most examples of this are attempts to create human-like behaviour in characters – a kind of advanced puppetry. These characters are also generally designed beforehand rather than reacting and learning from player behaviour, let alone being allowed to adapt in an open ended manner.

Rocketing around the gravitational wells.

Geo was a free software game I wrote around 10 years ago which I’ve recently rediscovered. I made it a couple of years after working for William Latham’s Computer Artworks – and it was obviously influenced by that experience. At the time it was a little demanding for graphics hardware, but it turns out that in the intervening years processing power has caught up with it.

This is a game set in a large section of space inhabited by lifeforms comprised of triangles, squares and pentagons. Each lifeform exerts a gravitational pull and has the ability to reproduce. Its structure is defined by a simple genetic code which is copied to its descendants with small errors, giving rise to evolution. Your role is to collect keys which orbit around gravitational wells in order to progress to the next level, which is repopulated by copies of the most successful individuals from the previous level.

A simple first generation lifeform.

Each game starts with a random population, so the first couple of levels are generally quite simple, mostly populated by dormant or self destructive species – but after 4 or 5 generations the lifeforms start to reproduce, and by level 10 a phenotype (or species) will generally have emerged to become a highly invasive conqueror of space. It becomes a race against the clock to find all the keys before the gravitational effects are too much for your ship’s engines to escape, or your weapons to ‘prune’ the structure’s growth.

I’ve used similar evolutionary strategies in much more recent games, but they’ve required much more effort to get the evolution working (49,000 players have now contributed to egglab’s camouflage evolution for example).

A well defended 'globular' colony – a common phenotype to evolve.

What I like about this form of more humble AI (or artificial life) is that instead of a program trying to imitate another lifeform, it really just represents itself – challenging you to exist in its consistent but utterly alien world. I’ve always wondered why the dominant post-human theme of sentient AI is a supercomputer deliberately designed, usually by a millionaire or huge company. It seems to me far more likely that some form of life will arise – perhaps it even already exists – in the wild variety of online spambots and malware mainly talking to themselves, and will go unnoticed by us, probably forever. We had a little indication of this when the facebook bots in the Naked on Pluto game started having autonomous conversations with other online spambots on their blog.

A densely packed 'crystalline' colony structure.

Less speculatively, what I’ve enjoyed most about playing this simple game is exploring and attempting to shape the possibilities of the artificial life, while observing and categorising the common solutions that emerge during separate games – cases of parallel evolution. I’ve tweaked the between-levels fitness function a little, but most of the evolution tends to occur ‘darwinistically’ while you are playing – simply, the lifeforms that reproduce most effectively survive.


An efficient and highly structured 'solar array' phenotype which I’ve seen emerge twice with different genotypes.

You can get the updated source here; it only requires GLUT and ALUT (a cross platform audio API). At one time it compiled on Windows, and it should build on OSX quite easily – I may distribute binaries at some point if I get time.

A ‘block grid’ phenotype which is also common.

New camouflage pattern engine

One of the new projects we have at Foam Kernow is an ambitious new extension of the egglab player driven camouflage evolution game, with Laura Kelley and Anna Hughes at Cambridge Uni.

As part of this we are expanding the patterns possible with the HTML5 canvas based pattern synthesiser to include geometric designs. Anna and Laura are interested in how camouflage has evolved to disrupt the perception of movement, so we need a citizen science game system similar to the eggs, but with different shapes that move at different speeds.

Here are some test mutations of un-evolved random starting genomes:

(images: three test mutations)

This is an example pattern program:

(image of an example pattern program)