Avatar Control with Microsoft Kinect

August 1, 2013

By Dominik Alessandri & Christian Wietlisbach
Hochschule Luzern Technik & Architektur

As part of our bachelor thesis, we developed a module called ‘kinect-control’. This module lets you control your avatar with body gestures.

The module runs mainly on the client; any necessary information is transferred from the server to the client as needed. All the user needs is a connected Kinect device and the Kinect SDK Beta 2 installed, which means the module is only available for Windows x86/x64 clients. All other requirements are shipped from the server to the client at login, making the module easy for interested parties to use.

To set up a Wonderland server to use the kinect-control module, the administrator needs to change two files on the server:

/.wonderland-server/0.5/run/deploy/wonderland-web-front.war/app/win32/wonderland_native.jar
/.wonderland-server/0.5/run/deploy/wonderland-web-front.war/app/win64/wonderland_native.jar

These two files must be replaced. The extended files can be downloaded from

http://147.88.213.71:8080/modules/KinectControl/win32/wonderland_native.jar
http://147.88.213.71:8080/modules/KinectControl/win64/wonderland_native.jar

These new files contain the DLL ‘KinectDLL.dll’, which is necessary for the connection between Open Wonderland and the Kinect.
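
The DLL is presumably called from the Java client through JNI. As a purely illustrative sketch, with class and method names invented for this post rather than taken from the module, such a bridge could look like this:

    // Hypothetical sketch of a JNI bridge to KinectDLL.dll. The class and
    // method names are assumptions for illustration, not the module's code.
    public class KinectBridge {

        static {
            // Load KinectDLL.dll from java.library.path
            // (shipped inside wonderland_native.jar).
            System.loadLibrary("KinectDLL");
        }

        // Start streaming skeleton data from the Kinect sensor.
        public native boolean startKinect();

        // Stop the sensor and release native resources.
        public native void stopKinect();

        // Tilt the sensor motor (the Kinect supports roughly -27 to +27 degrees).
        public native void setElevationAngle(int degrees);
    }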

On the client side, the first thing you need to do is connect the Kinect device to your Open Wonderland client. This is done using the Kinect Controller dialog: after installing the kinect-control module, click ‘Window -> Kinect Controller’ in Wonderland:

[Screenshot: the Kinect Controller window]

The dialog contains two buttons: ‘Start Kinect’ and ‘Stop Kinect’:

[Screenshot: Kinect Controller dialog with ‘Start Kinect’ and ‘Stop Kinect’ buttons]

If you have a Kinect device connected to your PC, you can click ‘Start Kinect’. After a moment, the Kinect device moves to its initial position and the window displays ‘Connection Works’:

[Screenshot: Kinect Controller dialog showing ‘Connection Works’]

You can adjust the angle of your Kinect device by moving the slider to the desired position:

[Screenshot: angle slider in the Kinect Controller dialog]

Now you are ready to move your avatar with your body. Place yourself in front of the Kinect device and control your avatar as follows (a sketch of how such poses can be detected appears after the list):

  • Walk: Just move your legs up and down
  • Turn right: Hold right arm to the right side
  • Turn left: Hold left arm to the left side
  • Fly up: Hold right arm up
  • Fly down: Hold left arm up
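
Internally, static poses like these can be recognized by comparing skeleton joint positions reported by the Kinect. The following is a rough illustration only; the Joint class and the threshold values are assumptions, not the module's actual code:

    // Illustrative pose checks on Kinect skeleton joints. The Joint class and
    // the thresholds are assumptions, not the module's actual code.
    public class PoseDetector {

        /** Minimal joint: x grows to the skeleton's right, y grows upward (meters). */
        public static class Joint {
            public final float x, y;
            public Joint(float x, float y) { this.x = x; this.y = y; }
        }

        /** "Turn right": right hand held out roughly level with the shoulder. */
        public static boolean isRightArmToSide(Joint rightHand, Joint rightShoulder) {
            return rightHand.x > rightShoulder.x + 0.3f            // clearly out to the side
                && Math.abs(rightHand.y - rightShoulder.y) < 0.2f; // about shoulder height
        }

        /** "Fly up": right hand raised above the head. */
        public static boolean isRightArmUp(Joint rightHand, Joint head) {
            return rightHand.y > head.y;
        }
    }

A frame-by-frame check like this only works for static poses; a movement such as walking has to be matched over time, which is what the gesture recording described below is for.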

It is possible to extend the set of gestures recognized by this module. To do so, modify the file ‘gesturesBDA.txt’, located in ‘kinect-control-client.jar’ inside ‘kinect-control.jar’, using the software ‘KinectDTW’. Once this file contains your new gesture, you need to map it to a keyboard input.

The file ‘keyMapping.txt’ maps gestures to keyboard inputs. It is located on the server in /.wonderland-server/0.5/run/content/modules/installed/kinect-control/client/. The structure of the file is as follows:

[Name of gesture]=[isMouse[0/1]];[keyCode1[decimal]];[keyCode2[decimal]];[time[millis]]

Example 1:

@Run=0;16;87;2500
Description:
When the gesture @Run is recognized, keys 16 (Shift) and 87 (W) are held down for 2.5 seconds.

Example 2:

@Walk=0;87;0;3000
Description:
When the gesture @Walk is recognized, key 87 (W) is held down for 3 seconds.

For a list of all key codes, you can consult http://www.cambiaresearch.com/articles/15/javascript-char-codes-key-codes/.
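
For these examples, the decimal key codes also match the Java virtual-key constants in java.awt.event.KeyEvent (16 = VK_SHIFT, 87 = VK_W). To make the format concrete, here is a minimal sketch of how a mapping line could be parsed and turned into an actual key press using java.awt.Robot; the parsing follows the format described above, and this is an illustration rather than the module's actual code:

    import java.awt.AWTException;
    import java.awt.Robot;

    // Illustrative parsing of a keyMapping.txt line and the resulting key press.
    // Based on the documented format; not the module's actual code.
    public class KeyMappingExample {

        public static void main(String[] args) throws AWTException, InterruptedException {
            String line = "@Run=0;16;87;2500";          // gesture=isMouse;key1;key2;millis
            String[] parts = line.split("=")[1].split(";");

            boolean isMouse = parts[0].equals("1");
            int keyCode1 = Integer.parseInt(parts[1]);   // 16 = Shift
            int keyCode2 = Integer.parseInt(parts[2]);   // 87 = W; 0 means no second key
            long millis  = Long.parseLong(parts[3]);     // how long to hold the keys

            if (!isMouse) {
                Robot robot = new Robot();               // synthesizes native input events
                robot.keyPress(keyCode1);
                if (keyCode2 != 0) robot.keyPress(keyCode2);
                Thread.sleep(millis);                    // hold for the configured duration
                if (keyCode2 != 0) robot.keyRelease(keyCode2);
                robot.keyRelease(keyCode1);
            }
        }
    }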

You will need the following files to get the kinect module running:

A video of the running module can be seen on YouTube:


Code contributions

January 7, 2011
[Image: the new gesture HUD, with avatars cheering]

After a week on vacation, I was excited to find an unexpected holiday gift in my inbox: code contributions.

From fixes for typos in the web UI, to animation updates, voice bridge updates, and even a whole new gesture HUD (!), my inbox was brimming with patches.

I’ve tried to acknowledge contributions in the various change logs (which you can find here, here and here), but I wanted to say a more public “thank you!” to all the contributors.

I would also like to add a word of encouragement to people who have already made changes to Open Wonderland code: please send us your patches! Whether they are simple or complicated, related to the core or to modules, we would all love to see the results of your hard work.

If you have a change you would like to contribute, I recommend you file a request for enhancement in the issue tracker, and attach your patch there to make it available to everyone. This will also allow the module owners to review the patch for inclusion in the Wonderland trunk (we just ask you to sign our contributor agreement).

Happy patching!

Jonathan Kaplan, Open Wonderland Architect


Mapping Emotions to Avatar Gestures

July 5, 2010

Here is another project from the Spatial Media Group at the University of Aizu brought to us by the same team responsible for the Wonderland-CVE Bridge described in a previous Wonderblog post.

Mapping Emotions to Avatar Gestures

By Rasika Ranaweera, Doctoral student, University of Aizu, Japan

Representing emotions in collaborative virtual environments is important to make them realistic. To display emotions in such environments, facial expressions of avatars have previously been used. Avatars in Wonderland can be animated with a limited set of predefined gestures, but there is nothing preventing the integration of new animations, whether with new artwork or by combining existing content. Our goal is to introduce emotional communication to Wonderland to make it more realistic and natural. In this case, we mapped existing gestures rather than introducing new ones. We also used user-friendly keywords to trigger avatar gestures; for example, :cheer: triggers the “cheer” animation.

[Image: avatars in a conversation with emotion-embedded text chat]

The following table shows the mapping from emoticons to gestures/animations in our system.

Emoticon   Emotion    Representation/Animation
:S         Anger      TakeDamage
:(         Dislike    No
3:(        Fear       TakeDamage
:D         Joy        Laugh
{:(        Sadness    GoHome
:O         Surprise   Cheer
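
As a rough sketch of how such a lookup might be implemented (the class and method names here are hypothetical, not the project's actual code), the chat text can be scanned for emoticons and the matching animation name returned:

    import java.util.LinkedHashMap;
    import java.util.Map;

    // Hypothetical sketch of the emoticon-to-animation lookup; names are
    // illustrative, not the project's actual code.
    public class EmotionMapper {

        // Table above: emoticon -> Wonderland animation. Longer emoticons are
        // registered first so "3:(" and "{:(" are not matched as ":(".
        private static final Map<String, String> ANIMATIONS = new LinkedHashMap<>();
        static {
            ANIMATIONS.put("3:(", "TakeDamage"); // Fear
            ANIMATIONS.put("{:(", "GoHome");     // Sadness
            ANIMATIONS.put(":S",  "TakeDamage"); // Anger
            ANIMATIONS.put(":(",  "No");         // Dislike
            ANIMATIONS.put(":D",  "Laugh");      // Joy
            ANIMATIONS.put(":O",  "Cheer");      // Surprise
        }

        /** Returns the animation for the first emoticon found in the message, or null. */
        public static String animationFor(String chatMessage) {
            for (Map.Entry<String, String> entry : ANIMATIONS.entrySet()) {
                if (chatMessage.contains(entry.getKey())) {
                    return entry.getValue();
                }
            }
            return null;
        }

        public static void main(String[] args) {
            System.out.println(animationFor("We won :D")); // prints "Laugh"
        }
    }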

Watch the demo video to see the gesture mapping in action.

This project has been written up in the following paper:

Senaka Amarakeerthi, Rasika Ranaweera, Michael Cohen, and Nicholas Nagel. “Mapping selected emotions to avatar gesture.” In IWAC: 1st International Workshop on Aware Computing, Japan Society for Fuzzy Theory and Intelligent Informatics, Aizu-Wakamatsu, Sep. 2009.

