Remote Usability Testing using Wonderland

February 24, 2011

Kapil Chalil Madathil and Dr. Joel Greenstein conducted an interesting study analyzing the feasibility of using Open Wonderland for synchronous remote usability testing.

Kapil is currently a doctoral student at Clemson University working with Dr. Joel S. Greenstein. Dr. Greenstein is an Associate Professor in the Department of Industrial Engineering and the Director of the Human-Computer Systems Laboratory at Clemson University.

Here they share some excerpts from their work that will be published at the CHI 2011 conference in Vancouver, Canada.

CHI 2011 Preview:  A New Perspective to Remote Usability Testing using Wonderland

The emergence of high-speed Internet technologies has given rise to the concept of the global village and to next-generation web applications addressing its needs. In a scenario where usability evaluators, developers, and prospective users are located in different countries and time zones, conducting a traditional lab usability evaluation poses challenges from both cost and logistical perspectives. These concerns have led to research on remote usability evaluation, in which the user and the evaluators are separated. However, remote testing lacks the immediacy and sense of “presence” desired to support a collaborative testing process. Moreover, managing interpersonal dynamics across cultural and linguistic barriers may require approaches sensitive to the cultures involved. Three-dimensional (3D) virtual world applications may address some of these concerns.

Collaborative engineering was redefined when Open Wonderland integrated high-fidelity voice-based communication, immersive audio, and screen-sharing tools into virtual worlds. Such 3D virtual worlds mirror the collaboration that occurs among participants and experts when all are physically present, potentially enabling usability tests to be conducted more effectively when the facilitator and participant are in different locations.

We developed a virtual three-dimensional usability testing laboratory using the Open Wonderland toolkit.

We then conducted a study to compare the effectiveness of synchronous usability testing in a 3D virtual usability testing lab with two other synchronous usability testing methods: the traditional lab approach and WebEx, a web-based conferencing and screen sharing approach.

The study was conducted with 48 participants in total: 36 test participants and 12 test facilitators. The test participants were asked to complete five tasks on a simulated e-commerce website. The three methodologies were compared with respect to the following dependent variables: the time taken to complete the tasks, the usability defects identified, the severity of these defects, and subjective ratings from the NASA-TLX (NASA Task Load Index), presence, and post-test subjective satisfaction questionnaires.
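For readers unfamiliar with the NASA-TLX measure mentioned above, the standard weighted scoring procedure combines six 0–100 subscale ratings using weights derived from 15 pairwise comparisons (the weights therefore sum to 15). The sketch below illustrates that general procedure only; the participant data shown are hypothetical and are not from this study.

```python
# Illustrative sketch of the standard NASA-TLX weighted scoring procedure.
# The ratings and weights below are hypothetical example values.

SUBSCALES = ["mental", "physical", "temporal",
             "performance", "effort", "frustration"]

def tlx_workload(ratings, weights):
    """Overall NASA-TLX workload on a 0-100 scale.

    Each 0-100 subscale rating is weighted by the number of times the
    participant chose that subscale in the 15 pairwise comparisons,
    so the weights must total 15.
    """
    assert sum(weights.values()) == 15, "pairwise weights must total 15"
    total = sum(ratings[s] * weights[s] for s in SUBSCALES)
    return total / 15

# Hypothetical data for one participant
ratings = {"mental": 70, "physical": 20, "temporal": 55,
           "performance": 40, "effort": 65, "frustration": 50}
weights = {"mental": 4, "physical": 1, "temporal": 3,
           "performance": 2, "effort": 3, "frustration": 2}

print(tlx_workload(ratings, weights))  # prints 56.0
```

Per-condition workload comparisons like those reported below are made on these per-participant scores.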

The three methodologies agreed closely in terms of the total number of defects identified, the number of high-severity defects identified, and the time taken to complete the tasks. However, there was a significant difference in the workload experienced by the test participants and facilitators, with the traditional lab condition imposing the least workload and the virtual lab and WebEx conditions imposing similar levels. It was also found that the test participants experienced greater involvement and a more immersive experience in the virtual world condition than in the WebEx condition. The ratings for the virtual world condition were not significantly different from those in the traditional lab condition. The results of this study suggest that participants were productive in and enjoyed the virtual lab condition, indicating the potential of a virtual world-based approach as an alternative to conventional approaches for synchronous usability testing.

We will be presenting the full details of our study at CHI 2011 in Vancouver, Canada.

Hope to see you there!

Kapil Chalil Madathil and Dr. Joel S. Greenstein

