Contact

Nina Flad

Address: Spemannstr. 38
72076 Tübingen
Room number: 105
Phone: +49 7071 601 602
Fax: +49 7071 601 616
E-Mail: nina.flad

Position: PhD Student

Supervised by: Lewis Chuang, PhD


Research
I am interested in how humans seek out and process visual information from different sources during manual steering. For example, steering a car for heading and speed requires us to constantly scan our environment with eye movements: we have to monitor the distance to the car ahead as well as the road curvature and peripheral road signs. I am further interested in how the characteristics of these sources, such as their updating frequency and spatial content, influence visual scanning behavior.


For this purpose, I record EEG and eye-movement behavior during visual scanning tasks. Eye movements inform us about gaze behavior, that is, what the user is looking at and thus which information they are trying to obtain from their surroundings. EEG gives insight into cognitive states such as attention and workload, namely the extent to which the fixated information is actually perceived and processed.

Introduction
Humans constantly move their eyes to gather visual information from their surroundings. At the same time, information is updated at different rates across the visual scene and at different perceptual qualities. When piloting an aerial vehicle, for example, optic flow from the outside world might indicate ego-speed more frequently than the airspeed indicator does, but it is unlikely to present speed as accurately as the instrument itself. Thus, a human observer has to adapt their information-seeking behavior to the information-providing characteristics of the environment as well as to the demands of the task.


Goal
I want to evaluate how different properties of information channels during a control task (e.g., their updating frequencies, their relevance to the main task, or couplings between channels) influence how the operator retrieves and processes information.


Methods
I collect EEG, EOG and (remote, infra-red) eye-tracker data during a visual scanning task. A typical visual scanning task requires participants to look at multiple regions of interest and to respond to the appearance of visual targets. Because the eyes act as electrical dipoles, their movements can contaminate EEG measurements, so it is crucial to correct for these artifacts, for example with regression models or independent component analysis (ICA).
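
As an illustration, ocular-artifact removal with ICA could look like the following minimal sketch in MNE-Python. The toolbox choice, file name and EOG channel name are assumptions for illustration, not details taken from this page.

```python
# Hypothetical sketch: removing ocular artifacts from EEG with ICA (MNE-Python).
import mne
from mne.preprocessing import ICA

raw = mne.io.read_raw_fif("scanning_task_raw.fif", preload=True)  # placeholder file
raw.filter(l_freq=1.0, h_freq=40.0)  # band-pass filtering helps the ICA decomposition

ica = ICA(n_components=20, random_state=42)
ica.fit(raw)

# Flag independent components that correlate with the EOG channel
eog_indices, eog_scores = ica.find_bads_eog(raw, ch_name="EOG001")  # placeholder channel
ica.exclude = eog_indices

raw_clean = ica.apply(raw.copy())  # EEG with ocular components removed
```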

Some EEG features reflect attention and workload. Based on this, the EEG data are segmented relative to the onsets of eye-movement events to indicate how fixated information is cognitively processed in the brain. Currently, I investigate event-related potentials time-locked to the onset of fixation on a target, to gain insight into the processing that takes place after fixation.
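
Such fixation-locked segmentation could be sketched as follows; this is a hypothetical example with simulated data, where the channel names, sampling rate and fixation times are placeholders rather than details from the actual experiment.

```python
# Hypothetical sketch: epoching EEG relative to fixation onsets to obtain
# fixation-related potentials.
import numpy as np
import mne

# Simulated stand-in for an artifact-corrected EEG recording
info = mne.create_info(ch_names=["Cz", "Pz"], sfreq=500.0, ch_types="eeg")
raw_clean = mne.io.RawArray(np.random.randn(2, 10000) * 1e-6, info)

# Fixation onsets (sample indices) as detected in the eye-tracking stream
fixation_samples = np.array([1500, 3200, 5100])
events = np.column_stack([
    fixation_samples,
    np.zeros_like(fixation_samples),
    np.ones_like(fixation_samples),  # event code 1 = fixation on a target
])

epochs = mne.Epochs(raw_clean, events, event_id={"fixation": 1},
                    tmin=-0.2, tmax=0.8, baseline=(-0.2, 0.0), preload=True)
frp = epochs.average()  # average fixation-related potential
print(frp)
```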


Results and findings
Using Gaussian mixture modeling, I have demonstrated that EOG data can discriminate between up to four different regions of interest, with a spatial accuracy comparable to that of video-based eye tracking. This is a crucial first step towards segmenting the EEG according to the onsets of fixated stimuli.
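
A minimal version of this classification step might look like the sketch below, which fits a Gaussian mixture to two-dimensional EOG features. The feature layout (one horizontal and one vertical amplitude per fixation) and the simulated data are assumptions for illustration.

```python
# Hypothetical sketch: assigning fixations to regions of interest by fitting
# a Gaussian mixture model to two-channel EOG amplitudes.
import numpy as np
from sklearn.mixture import GaussianMixture

# Placeholder data: one row per fixation, columns = (horizontal, vertical) EOG amplitude
rng = np.random.default_rng(0)
eog_features = rng.normal(size=(200, 2))

gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=0)
labels = gmm.fit_predict(eog_features)  # one of four regions of interest per fixation
print(labels[:10])
```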

Nina is a PhD student at the Max Planck Institute for Biological Cybernetics. She has a Master’s degree in Neural Information Processing (Graduate Training Centre of Neuroscience, Tübingen) and a Bachelor's degree in Bioinformatics.


Conference papers (4):

Flad N, Ditz JC, Schmidt A, Bülthoff HH and Chuang LL (October 2016) Data-driven approaches to unrestricted gaze-tracking benefit from saccade filtering. Second Workshop on Eye Tracking and Visualization (ETVIS 2016).
Flad N, Bülthoff HH and Chuang LL (August 2015) Combined use of eye-tracking and EEG to understand visual information processing. International Summer School on Visual Computing (VCSS 2015), Fraunhofer Verlag, Stuttgart, Germany, 115-124.
