Human Multisensory Perception
Research report 2004 - Cyberneum
To perceive the environment, the brain draws on multiple sources of sensory information from several modalities, including vision, touch, and audition. Some of these sources provide information about the same object property or event; for example, the size of an object can be both seen with the eyes and felt with the hands. Such cues are referred to as redundant sources of sensory information. In this report we show how the human brain uses such redundant sensory information to interact with the environment in a purposive fashion. We further describe the role that prior knowledge about statistical regularities in the world plays, and how such knowledge can affect the process of perception. As a model for describing these multisensory interactions we apply Bayesian Decision Theory (BDT).
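The kind of redundant-cue combination described above is often modeled as precision-weighted averaging of independent Gaussian estimates: each cue is weighted by its inverse variance, and a Gaussian prior can enter the computation as one additional "cue". The following is a minimal illustrative sketch of this idea; the function name and all numerical values (visual and haptic size estimates, their noise levels, and the prior) are made up for the example and are not taken from the report.

```python
import math

def fuse_gaussian_cues(estimates, sigmas):
    """Precision-weighted (maximum-likelihood) fusion of independent
    Gaussian estimates. Returns the fused mean and its standard deviation."""
    precisions = [1.0 / s**2 for s in sigmas]       # reliability of each cue
    total = sum(precisions)
    mean = sum(p * x for p, x in zip(precisions, estimates)) / total
    sigma = math.sqrt(1.0 / total)                   # fused uncertainty
    return mean, sigma

# Hypothetical visual and haptic size estimates (cm) with different noise:
size, sigma = fuse_gaussian_cues([5.2, 4.8], [0.5, 1.0])
# The fused estimate lies closer to the more reliable (visual) cue,
# and its uncertainty is lower than that of either cue alone.

# A Gaussian prior over object size (here: mean 4.0, sigma 2.0) can be
# treated as one more source of information in the same computation:
size_post, sigma_post = fuse_gaussian_cues([5.2, 4.8, 4.0], [0.5, 1.0, 2.0])
```

In this sketch the prior pulls the estimate slightly toward the expected size and further reduces the posterior uncertainty, which is the qualitative behavior BDT predicts when prior knowledge is informative.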