With the CyberPod, we combined virtual reality technologies with a moving-base motion simulator. The platform could be moved in six degrees of freedom by its actuated legs.

With the MPI CyberPod we could study the interplay between the human visual, auditory, vestibular, and neuromuscular systems. The centerpiece of the MPI CyberPod was a hexapod Stewart platform (Bosch eMotion 1500) with six degrees of freedom. Mounted on the motion base was a 2.5 m by 2.2 m aluminum platform.
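A hexapod Stewart platform reaches all six degrees of freedom by varying the lengths of its six legs, which connect fixed base anchors to anchors under the moving platform. The sketch below shows the inverse kinematics for a generic hexapod: given a desired platform pose, each leg length is the distance between its base anchor and its (rotated and translated) platform anchor. The anchor layout and dimensions are illustrative placeholders, not the actual Bosch eMotion 1500 geometry.

```python
import numpy as np

def stewart_leg_lengths(base_pts, plat_pts, translation, rpy):
    """Inverse kinematics of a generic hexapod: required leg lengths
    for a desired platform pose (roll, pitch, yaw in radians)."""
    roll, pitch, yaw = rpy
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    # ZYX (yaw-pitch-roll) rotation matrix
    R = np.array([
        [cy*cp, cy*sp*sr - sy*cr, cy*sp*cr + sy*sr],
        [sy*cp, sy*sp*sr + cy*cr, sy*sp*cr - cy*sr],
        [-sp,   cp*sr,            cp*cr],
    ])
    # Vector of each leg: from base anchor to posed platform anchor
    legs = translation + plat_pts @ R.T - base_pts
    return np.linalg.norm(legs, axis=1)

# Hypothetical anchor layout: anchors on circles of radius 1.2 m (base)
# and 1.0 m (platform), paired at 60-degree intervals.
ang = np.deg2rad(np.arange(6) * 60.0)
base = np.stack([1.2*np.cos(ang), 1.2*np.sin(ang), np.zeros(6)], axis=1)
plat = np.stack([1.0*np.cos(ang + 0.3), 1.0*np.sin(ang + 0.3), np.zeros(6)], axis=1)

# Neutral pose 1.5 m above the base, plus a small roll command
lengths = stewart_leg_lengths(base, plat, np.array([0.0, 0.0, 1.5]), (0.05, 0.0, 0.0))
print(np.round(lengths, 3))
```

In a real controller this mapping runs at the servo rate: the motion-cueing output pose is converted to six leg-length commands, which the actuators track.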

This platform was designed and produced in-house, with a focus on a high dynamic response, which required high structural stiffness. For visualization, the platform featured a removable projection screen located 1.1 m from the participant, with a field of view of approximately 95°×53°. Instead of the projection screen, the platform could also be used with our head-mounted display systems and an optical tracking system that measured head poses and the simulator's position in the room. Auditory stimulation could be provided through either a surround-sound system or noise-cancelling headphones.

The characteristics of the motion system were objectively measured using a standardized approach. The system time delay was below 20 ms, far below the 150 ms required for flight simulators. The system bandwidth depended on the exact mechanical configuration and equipment mounted to the platform, but exceeded 30 Hz for the most basic configuration, consisting of only a seat on the platform.
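One common way to quantify such a time delay objectively is to cross-correlate the commanded motion signal with the measured platform response and read off the lag at the correlation peak. The sketch below illustrates this on synthetic data; it is a generic technique, not the lab's actual measurement tooling or signals.

```python
import numpy as np

def estimate_delay(cmd, meas, fs):
    """Estimate system time delay (seconds) by cross-correlating the
    commanded and measured motion signals sampled at fs Hz.
    Positive result means the measured signal lags the command."""
    cmd = cmd - cmd.mean()
    meas = meas - meas.mean()
    xcorr = np.correlate(meas, cmd, mode="full")
    lag = np.argmax(xcorr) - (len(cmd) - 1)
    return lag / fs

# Synthetic check: a 2 Hz sine command reproduced 15 ms later, at 1 kHz.
fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
cmd = np.sin(2 * np.pi * 2.0 * t)
meas = np.sin(2 * np.pi * 2.0 * (t - 0.015))
delay = estimate_delay(cmd, meas, fs)
```

The bandwidth figure is obtained analogously in the frequency domain, by sweeping sinusoidal commands and finding where the measured response rolls off.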

Examples of fundamental research performed on the CyberPod simulator so far include a study on the accumulation of sensory information over time in the perception of rotation, an fNIRS neuroimaging study on decision making in self-motion perception, and a study on how visual and inertial information are combined in the perception of verticality. The CyberPod simulator has further been used to develop motion cueing algorithms based on Model Predictive Control methods, and in a project aimed at mitigating motion sickness in future autonomous vehicles using peripheral visual cues.
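The core idea behind MPC-based motion cueing can be sketched in one axis: over a receding horizon, choose platform accelerations that track the reference (vehicle) acceleration while penalizing platform excursion, so the platform "washes out" toward its center within its limited workspace. The following is a minimal unconstrained sketch solved as least squares; the dynamics, weights, and horizon are illustrative assumptions, not the CyberPod's actual algorithm or tuning.

```python
import numpy as np

def mpc_cue(a_ref, x0, dt=0.01, horizon=50, w_pos=50.0, w_vel=5.0):
    """One-axis MPC motion-cueing sketch. State x = [position, velocity],
    input u = platform acceleration. At each step, minimize
    sum (u_j - a_ref_j)^2 + w_pos*p_j^2 + w_vel*v_j^2 over the horizon,
    apply the first input, and recede (illustrative weights/horizon)."""
    A = np.array([[1.0, dt], [0.0, 1.0]])   # double-integrator dynamics
    B = np.array([0.5 * dt * dt, dt])
    n = len(a_ref)
    x = np.array(x0, float)
    applied = []
    for k in range(n):
        N = min(horizon, n - k)
        Apow = [np.eye(2)]                   # powers of A for prediction
        for _ in range(N):
            Apow.append(A @ Apow[-1])
        rows, rhs = [], []
        for j in range(N):                   # input-tracking terms
            r = np.zeros(N); r[j] = 1.0
            rows.append(r); rhs.append(a_ref[k + j])
        W = np.diag([np.sqrt(w_pos), np.sqrt(w_vel)])
        for j in range(1, N + 1):            # state-penalty terms
            free = Apow[j] @ x               # response with zero input
            for s in range(2):
                r = np.zeros(N)
                for i in range(j):
                    r[i] = (Apow[j - 1 - i] @ B)[s]
                rows.append(W[s, s] * r)
                rhs.append(-W[s, s] * free[s])
        u = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)[0]
        x = A @ x + B * u[0]                 # apply first input, recede
        applied.append(u[0])
    return np.array(applied), x
```

Tracking a sustained 1 m/s² reference for 2 s this way keeps the platform excursion far below the 2 m a naive double integration would require, at the cost of only partially reproducing the low-frequency acceleration; a real implementation adds actuator constraints and a perception model to the cost.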
