Locomotion Behavior and Avatars in Immersive Virtual Environments

It has been shown that people perform differently in Virtual Environments (VEs) than in the real world, at least for egocentric distance judgments. In addition, most immersive VEs currently contain little to no social content, because rendering people and dynamic interactive behavior in real time is complex. In this research we work to enrich the content of VEs by adding virtual avatars, beginning with an investigation of the impact of a fully articulated self-avatar on egocentric distance perception.
Our research aim is to improve the content and realism of immersive VE experiences by studying human perception and behavior. To date we have conducted two experiments using an egocentric distance judgment paradigm, in which individuals view a target location in virtual space and then close their eyes and walk to the previously seen target. In the first experiment we used this paradigm to evaluate biomechanical differences between walking toward a seen or previously seen target in the virtual world and in the real world. Using a large tracking hall equipped with 16 Vicon cameras, we found that when wearing a head-mounted display (HMD) people walk more slowly, with a shorter stride length and a wider step width [1]. In the second experiment we evaluated the impact of a fully articulated self-avatar on subsequent egocentric distance judgments. We found that viewing an articulated self-avatar, as compared to a line on the floor indicating one's location in space, improved egocentric distance judgments within an HMD-VE (see Figures 1 and 2) [2].
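To illustrate the kind of gait analysis involved, the sketch below computes walking speed, stride length, and step width from a sequence of heel-strike positions. It is only a minimal, hypothetical example and not the analysis pipeline used in [1]: the HeelStrike data structure, the simplified lateral step-width measure (assuming walking along the x-axis), and the demo values are illustrative assumptions.

```python
# Hypothetical sketch (not the published analysis code): basic gait parameters
# from a sequence of tracked heel-strike positions, assuming each strike is
# recorded as (time_s, x_m, y_m, foot) with foot in {"L", "R"} and walking
# along the x-axis. Stride length is the distance between successive strikes
# of the same foot; step width is simplified to the lateral offset between a
# strike and the preceding strike of the opposite foot; speed is forward
# distance divided by elapsed time.

from dataclasses import dataclass
from typing import Dict, List


@dataclass
class HeelStrike:
    time_s: float
    x_m: float   # position along the walking direction
    y_m: float   # lateral position
    foot: str    # "L" or "R"


def gait_parameters(strikes: List[HeelStrike]) -> Dict[str, float]:
    """Return mean stride length, mean step width, and walking speed."""
    strides, widths = [], []
    for i in range(2, len(strikes)):
        same_foot_prev = strikes[i - 2]   # previous strike of the same foot
        opposite_prev = strikes[i - 1]    # intervening strike of the other foot
        cur = strikes[i]
        strides.append(((cur.x_m - same_foot_prev.x_m) ** 2 +
                        (cur.y_m - same_foot_prev.y_m) ** 2) ** 0.5)
        widths.append(abs(cur.y_m - opposite_prev.y_m))
    duration = strikes[-1].time_s - strikes[0].time_s
    distance = strikes[-1].x_m - strikes[0].x_m
    return {
        "stride_length_m": sum(strides) / len(strides),
        "step_width_m": sum(widths) / len(widths),
        "speed_m_per_s": distance / duration,
    }


if __name__ == "__main__":
    # Illustrative, made-up data: alternating left/right strikes along the x-axis.
    demo = [
        HeelStrike(0.0, 0.00, 0.10, "L"),
        HeelStrike(0.6, 0.65, -0.10, "R"),
        HeelStrike(1.2, 1.30, 0.10, "L"),
        HeelStrike(1.8, 1.95, -0.10, "R"),
        HeelStrike(2.4, 2.60, 0.10, "L"),
    ]
    print(gait_parameters(demo))
```

In practice, heel strikes would be detected from the Vicon foot-marker trajectories rather than entered by hand, and the step-width definition would account for the actual walking direction.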
 
In future work we plan to further investigate why a self-avatar influences egocentric distance judgments in a VE; specifically, we will examine the impact of near versus far space and the degree of articulation of the avatar. In addition, we will develop a library of animations and avatars that can be used to dynamically add animated characters to the virtual world, as sketched below. Adding this social content to a VE should enable research on many topics, including but not limited to social behavior, embodied cognition, and the adaptation of specific motor and social behaviors to modified visual information about one's movements and appearance.
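A minimal sketch of what such an animation/avatar library might look like is given below. All class and method names (AvatarLibrary, AnimationClip, spawn, and so on) are illustrative assumptions about one possible design, not an existing or planned API.

```python
# Hypothetical sketch only: a registry that maps named animation clips to
# avatars so that animated characters can be spawned into a scene at run time.
# Every name here is an illustrative assumption, not an existing API.

from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class AnimationClip:
    name: str            # e.g. "walk", "wave"
    duration_s: float


@dataclass
class Avatar:
    name: str
    clips: Dict[str, AnimationClip] = field(default_factory=dict)

    def add_clip(self, clip: AnimationClip) -> None:
        self.clips[clip.name] = clip


@dataclass
class AvatarLibrary:
    avatars: Dict[str, Avatar] = field(default_factory=dict)
    scene: List[Tuple[str, str, Tuple[float, float, float]]] = field(default_factory=list)

    def register(self, avatar: Avatar) -> None:
        self.avatars[avatar.name] = avatar

    def spawn(self, avatar_name: str, clip_name: str,
              position: Tuple[float, float, float]) -> None:
        """Place a registered avatar into the scene playing the named clip."""
        avatar = self.avatars[avatar_name]
        if clip_name not in avatar.clips:
            raise KeyError(f"{avatar_name} has no clip '{clip_name}'")
        self.scene.append((avatar_name, clip_name, position))


if __name__ == "__main__":
    library = AvatarLibrary()
    pedestrian = Avatar("pedestrian")
    pedestrian.add_clip(AnimationClip("walk", 1.2))
    library.register(pedestrian)
    library.spawn("pedestrian", "walk", (2.0, 0.0, 5.0))
    print(library.scene)
```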

References

[1] Mohler B, Campos J, Weyel M and Bülthoff HH (2007) Gait parameters while walking in a head-mounted display virtual environment and the real world. 13th Eurographics Symposium on Virtual Environments and 10th Immersive Projection Technology Workshop (IPT-EGVE 2007), Eurographics Association, Aire-la-Ville, Switzerland, 85-88.

[2] Mohler B, Bülthoff HH, Thompson WB and Creem-Regehr SH (2008) A full-body avatar improves egocentric distance judgments in an immersive virtual environment. 5th Symposium on Applied Perception in Graphics and Visualization (APGV 2008), ACM Press, New York, NY, USA, 194-197.
