Testing how my face gets expressed in eigenface space as I change my expression. The face on the left is my face as synthesised by the trained algorithm. Ideally there would be no expression on any of the faces in the face database, and my synthesised eigenface would stay the same no matter what silly face I pulled – I want to be able to distinguish between users regardless of their expression. The catch, of course, is that people's appearance has a lot to do with their habitual expression, so the algorithm picks up on some of mine – particularly the grimacing, for some reason :)
Of course, with a set of training images that do contain expressions, it would be possible to track expression as well – which I might try soon.
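For anyone curious what "expressed in eigenface space" means in practice, here's a minimal sketch of the standard eigenface pipeline in NumPy – not my actual code, and the function names are just illustrative. Faces are assumed aligned and flattened into vectors; PCA (via SVD) gives the eigenfaces, and a face is synthesised by projecting onto them and reconstructing from the weights.

```python
import numpy as np

def train_eigenfaces(faces, n_components):
    """faces: (n_samples, n_pixels) array of flattened training faces."""
    mean_face = faces.mean(axis=0)
    centered = faces - mean_face
    # SVD of the centered data; rows of vt are the principal
    # components, i.e. the eigenfaces.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean_face, vt[:n_components]

def synthesise(face, mean_face, eigenfaces):
    """Project a face into eigenface space, then reconstruct it."""
    weights = eigenfaces @ (face - mean_face)   # coordinates in eigenface space
    reconstruction = mean_face + eigenfaces.T @ weights
    return reconstruction, weights

# Toy demo with random "faces" (20 samples, 64 pixels each).
rng = np.random.default_rng(0)
train = rng.standard_normal((20, 64))
mean_face, eigenfaces = train_eigenfaces(train, n_components=5)
recon, weights = synthesise(train[0], mean_face, eigenfaces)
print(weights.shape)  # the face's 5 coordinates in eigenface space
```

The weight vector is what stays (ideally) constant per user: recognition compares weights, and the reconstruction is the synthesised face like the one on the left.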