Accessing social information during joint action coordination

Understanding human-human interaction is an emerging research field. To date, however, there has been relatively little empirical evidence regarding the perceptual and cognitive mechanisms involved in such interactions. One interesting aspect of human interaction is the role of sensory feedback during the performance of coordinated tasks. Classically, human interaction was assumed to rely mainly on higher-level cognitive processes (e.g., inference), whereas more recent evidence suggests that lower-level perceptual processes may play an important role (e.g., [1]). In particular, several findings point to a direct link between performing an action (e.g., grasping) and perceiving that same action [2]. These findings suggest that information about an interaction partner's body might play an important role in human interaction.
 
Our goal is to investigate the importance of different sources of sensory information for joint-action performance under conditions that allow good external validity. In particular, we use Virtual Reality (VR) technology, which allows a high level of experimental control over visual information and precise measurements, while at the same time providing a high degree of realism and enabling natural behavior.
In a dyadic stretcher-carrying task, we investigated the contributions of different sources of visual information during joint-action coordination. Using VR allowed us to manipulate the visual information about the task and about the interaction partner. In all conditions, subjects physically carried a stretcher through a computer-generated maze, which was always visible. In the three test conditions, subjects perceived either no additional visual information about the stretcher or the joint-action partner (invisible condition), visual information about the interaction partner only (partner condition), or visual information about the stretcher only (stretcher condition). To assess the effect of these different sources of visual information on joint-action performance, we compared participants' walking trajectories in these conditions to their trajectories in a condition providing visual information about both the stretcher and the interaction partner (control condition).
The walking trajectories in the test conditions did not differ significantly from those in the control condition. Hence, visual information about the stretcher and the interaction partner did not measurably affect joint-action behavior.
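The abstract does not detail the trajectory analysis; the following is a minimal sketch of one plausible way to quantify and compare walking trajectories across conditions, assuming each trial's path is resampled to matched points and a maze centerline is available. All names and the simulated data are illustrative, not taken from the study.

import numpy as np
from scipy import stats

def lateral_deviation(path, centerline):
    """Mean point-wise Euclidean distance of an (N, 2) path from the (N, 2) centerline."""
    return np.linalg.norm(path - centerline, axis=1).mean()

# Hypothetical data: per-subject mean paths (200 samples of x, z) per condition.
rng = np.random.default_rng(0)
centerline = np.zeros((200, 2))
control_paths = rng.normal(scale=0.1, size=(12, 200, 2))
invisible_paths = rng.normal(scale=0.1, size=(12, 200, 2))

control_dev = np.array([lateral_deviation(p, centerline) for p in control_paths])
invisible_dev = np.array([lateral_deviation(p, centerline) for p in invisible_paths])

# Paired comparison: did removing visual information change path deviation?
t, p = stats.ttest_rel(invisible_dev, control_dev)
print(f"invisible vs. control: t = {t:.2f}, p = {p:.3f}")

Analogous paired comparisons against the control condition could be run for the partner and stretcher conditions.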

References

Streuber S, Chatziastros A, Mohler BJ and Bülthoff HH (August 2008) Joint and individual walking in an immersive collaborative virtual environment. 5th Symposium on Applied Perception in Graphics and Visualization (APGV 2008), ACM, New York, NY, USA, 191.
