Implicit measurements of eye movements help us make stronger inferences about the psychological processes that underlie task completion across a variety of experimental scenarios. Our eye-tracking facilities can be flexibly tailored to a range of experimental demands.
An EyeLink II (2000 Hz, SR Research) head-mounted eye-tracker is used for experiments that require high temporal resolution (e.g., saccade velocity measurements, micro-saccade detection) as well as high accuracy (~0.3°). It is often used in conjunction with high-frequency CRT and TFT displays (100-120 Hz refresh rate). The observer's head is supported by a chin-rest to ensure high tracking accuracy. This set-up is employed to analyze low-level gaze behavior (e.g., saccade kinematics) during object recognition and steering tasks.
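Saccade velocity measurements of this kind usually rest on a velocity-based event criterion applied to the sampled gaze trace. The following is a minimal illustrative sketch, not the lab's actual analysis pipeline; the sampling rate and the 30 deg/s threshold are hypothetical placeholders.

```python
import numpy as np

def detect_saccades(x_deg, y_deg, fs=500.0, vel_threshold=30.0):
    """Flag samples whose angular velocity exceeds a fixed threshold.

    x_deg, y_deg  : gaze position traces in degrees of visual angle
    fs            : sampling rate in Hz (hypothetical value)
    vel_threshold : velocity criterion in deg/s (30 deg/s is a common choice)
    Returns a boolean array marking candidate saccade samples.
    """
    vx = np.gradient(x_deg) * fs      # horizontal velocity, deg/s
    vy = np.gradient(y_deg) * fs      # vertical velocity, deg/s
    speed = np.hypot(vx, vy)          # combined angular speed
    return speed > vel_threshold

# A fixation, a rapid 10° jump, then a second fixation:
x = np.concatenate([np.zeros(50), np.linspace(0.0, 10.0, 10), np.full(50, 10.0)])
mask = detect_saccades(x, np.zeros_like(x), fs=500.0)
```

Real micro-saccade detection typically uses adaptive, noise-scaled thresholds rather than a fixed one, but the structure of the computation is the same.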
Remote eye-tracking is employed in the HeliLab using two linked FaceLab (Seeingmachines) eye-trackers. The observer's gaze vector is computed by estimating head orientation from transient facial features and pupil orientation with remote cameras (up to 1.5 m, 60 Hz, ~1.0° accuracy). This system therefore minimizes occlusion of the scene by eye-tracking equipment (-> HeliLab). Gaze information can also be computed in real time, with a latency of 50 ms, for adaptive systems (e.g., gaze-contingent displays).
A separate remote eye-tracking system is the Tobii T60 XL. This optical system employs corneal reflection tracking together with dark- and bright-pupil tracking at a sampling rate of 60 Hz. A chin-rest is not required for accurate tracking (up to 0.5°), which increases observer comfort during experiments. The complete tracking system is integrated into a high-resolution 24" TFT display and supports easy stimulus preparation and analysis (more advanced experiments are prepared using E-Prime and MATLAB extensions). This allows for uncomplicated and quick set-up of the system, which is useful for pilot experiments, demonstrations, or student internships. We currently use the Tobii eye-tracker to study gaze behavior during observation of faces.
The BackProjection Large Screen Display laboratory is also equipped with a real-time system for unrestrained gaze-tracking. In-house software (libGaze) combines data from a motion-tracking system (VICON®) and a head-mounted eye-tracker (EyeLink II) to compute the user's current gaze vector in real time, at a sampling rate of 120 Hz. This system allows full user mobility and scales to larger display spaces; it has been adopted by our collaborators at Konstanz University to research human-computer interaction on their Powerwall (8 m wide). At the MPI for Biological Cybernetics, this system has been used for research on natural scene perception and gaze-assisted interfaces.
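The core geometric step in such a system is rotating the eye-in-head gaze direction (from the eye-tracker) into world coordinates using the head pose (from the motion tracker). A minimal sketch of that step follows; the function name and coordinate conventions are illustrative assumptions, not libGaze's actual API, and a real system would also account for calibration offsets and the eye's position relative to the head marker.

```python
import numpy as np

def world_gaze_vector(head_rotation, eye_in_head):
    """Rotate an eye-in-head gaze direction into world coordinates.

    head_rotation : 3x3 rotation matrix describing head orientation,
                    as delivered by a motion-tracking system
    eye_in_head   : gaze direction in head coordinates, from the eye-tracker
    Returns the unit gaze direction in world coordinates.
    """
    v = head_rotation @ np.asarray(eye_in_head, dtype=float)
    return v / np.linalg.norm(v)

# Example: a 90° head turn about the vertical (z) axis carries a gaze
# direction along +x into one along +y.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
gaze_world = world_gaze_vector(R, [1.0, 0.0, 0.0])
```

Because the head pose and the eye direction come from separate devices, the two data streams must also be time-aligned before this rotation is applied; that synchronization is what makes the combined 120 Hz real-time output non-trivial.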