I thought I’d bring to your attention some work we’ve done to get the Sun SPOT and Wonderland working together. Last year we released source code that lets users employ a Sun SPOT as a controller, using the accelerometer in the Sun SPOT to guide their avatar. To be frank, though, the code was very well hidden.
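For a sense of how tilt-to-movement control can work, here is a minimal sketch. The thresholds and command names are hypothetical, and on a real Sun SPOT the tilt values would come from the demo board's accelerometer rather than being passed in directly:

```java
// Hypothetical sketch: turning accelerometer tilt into avatar movement
// commands. The dead zone and command names are invented for illustration.
public class TiltController {

    // Ignore small tilts (values roughly in g, -1..1) so the avatar
    // doesn't drift when the SPOT is held approximately level.
    private static final double DEAD_ZONE = 0.2;

    /** Map X/Y tilt to a movement command for the avatar. */
    public static String command(double tiltX, double tiltY) {
        if (Math.abs(tiltX) < DEAD_ZONE && Math.abs(tiltY) < DEAD_ZONE) {
            return "stop";
        }
        // Favor the axis with the larger tilt.
        if (Math.abs(tiltY) >= Math.abs(tiltX)) {
            return tiltY > 0 ? "forward" : "backward";
        }
        return tiltX > 0 ? "right" : "left";
    }

    public static void main(String[] args) {
        System.out.println(command(0.0, 0.0));  // stop
        System.out.println(command(0.1, 0.8));  // forward
    }
}
```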
So, we’ve updated the code and added the ability to control some primitive avatar gestures using the switches on a Sun SPOT. I can imagine that this doesn’t necessarily sound terribly impressive, but… the reason we’re pursuing this is that some of our colleagues in the MiRTLE project at the University of Essex in the UK have been working on connecting a thumb-sized combined ‘bio-sensor’ to a Sun SPOT. Here’s an illustration of an early prototype, connected to a rev B Sun SPOT. (The bio-sensor combines a galvanic skin response sensor, a temperature sensor and an infrared pulse sensor.)
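The switch-to-gesture idea can be sketched as a simple lookup. The gesture names here are hypothetical (the actual Wonderland gesture identifiers may differ), and on a real Sun SPOT a method like this would be invoked from a switch listener registered with the demo-board API:

```java
// Hypothetical sketch: mapping Sun SPOT switch presses to avatar gestures.
// The gesture names are invented for illustration; on the device this
// would be driven by switch-pressed events from the demo board.
public class GestureMapper {

    // One gesture per switch index (SW1 = 0, SW2 = 1).
    private static final String[] GESTURES = { "Wave", "Nod" };

    /** Map a switch index to a gesture name, or null if unmapped. */
    public static String gestureFor(int switchIndex) {
        if (switchIndex < 0 || switchIndex >= GESTURES.length) {
            return null;
        }
        return GESTURES[switchIndex];
    }

    public static void main(String[] args) {
        System.out.println("SW1 -> " + gestureFor(0)); // Wave
        System.out.println("SW2 -> " + gestureFor(1)); // Nod
    }
}
```

Pressing both switches together could be mapped to further gestures in the same way.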
The goal is to use the bio-sensor to sense the user’s emotional state (in terms of arousal and valence) and then use that information to change your avatar’s appearance, posture, movement and so on. And why would we want to do this? Well, we think that one of the current problems with using virtual worlds for education is that there’s no implicit non-verbal communication (beyond what users explicitly type, such as emoticons and the like). Our hypothesis is that we can replicate some of this non-verbal communication using this kind of technology.
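As a rough illustration of the pipeline, here is a sketch of turning normalized sensor readings into an arousal/valence estimate and then into an avatar posture. The weighting constants and posture names are entirely invented; the actual MiRTLE mapping from GSR, temperature and pulse data is the subject of their research, not shown here:

```java
// Hypothetical sketch: bio-sensor readings -> arousal/valence -> posture.
// All constants and posture names are invented for illustration only.
public class EmotionMapper {

    /**
     * Crude arousal estimate from normalized (0..1) galvanic skin
     * response and pulse-rate readings. Higher GSR and faster pulse
     * are both commonly associated with higher arousal.
     */
    public static double arousal(double gsr, double pulse) {
        return Math.min(1.0, 0.6 * gsr + 0.4 * pulse);
    }

    /**
     * Pick an avatar posture from arousal (0..1) and valence (-1..1).
     * This is a toy quadrant mapping, not the project's actual model.
     */
    public static String posture(double arousal, double valence) {
        if (arousal > 0.5) {
            return valence >= 0 ? "excited" : "agitated";
        }
        return valence >= 0 ? "relaxed" : "bored";
    }

    public static void main(String[] args) {
        double a = arousal(0.8, 0.9);
        System.out.println(posture(a, 0.5));   // excited
        System.out.println(posture(0.2, -0.5)); // bored
    }
}
```

Estimating valence from these sensors is the harder problem; the quadrant mapping above simply shows where such an estimate would plug in.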
Oh, and we’ve updated the source code and tidied it up so everyone can use it. Check out the sunSpot directory of the CVS repository in the Wonderland Incubator project. And to see it in action, take a look at the video.
Thanks to our colleagues Xristos Kalkanis and Malcolm Lear in the Department of Computing and Electronic Systems at the University of Essex for their help.