Katharina Dobs


Position: Guest Scientist

Research Group: RECCAT
Supervisors: Isabelle Bülthoff and Johannes Schultz

I am interested in how humans process facial motion and, in particular, in the role it plays in conveying a person's identity. The faces we encounter every day are typically in motion, so facial motion is assumed to contribute, at least to some extent, to the processing of identity. Compared to research on static faces, however, the study of dynamic faces is a relatively young field with many open questions.

During my doctoral studies, I investigated the role of facial motion and its interaction with facial form in face perception. To this end, I applied neuroscientific methods, in particular psychophysics and neuroimaging.

Investigating identity information in facial motion


The faces we encounter every day typically move. Previous studies have shown that facial motion, in addition to facial form, can carry information about the identity of a person [1,2], yet the exact role of facial motion as a cue to identity is still unclear [3].



The overall goal is to understand when and how facial motion contributes to person recognition. We hypothesize that humans' sensitivity to identity information in facial motion varies with the type of facial movement (e.g., basic emotions versus conversational expressions). The results should further advance our understanding of how we perceive faces in real life.



We assessed human observers’ sensitivity to identity information in different types of facial movements. To separate form from motion cues, we used a recent facial motion capture and animation system [4,5] and animated a single avatar head with facial movements recorded from four different actors. The facial movements occurred in three social contexts: (1) emotional (e.g., anger), (2) emotional in a social interaction (e.g., being angry at someone) and (3) social interaction (e.g., saying goodbye to someone). Using a delayed matching-to-sample task (see Fig. 1), we tested in which context human observers can discriminate unfamiliar persons based only on their facial motion.



Fig. 1: The trial procedure of the experiment; exemplarily shown for the emotional context. Observers first watched an animation of a facial expression (Sample; e.g., angry), followed by two animations displaying a different facial movement (Matching stimuli; e.g., happy). Observers were asked to choose which of the matching stimuli was performed by the same actor as the sample.
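The logic of one such trial can be sketched as follows. This is a hypothetical illustration, not the authors' experiment code; the actor labels, expression names, and the restriction to two matching stimuli per trial follow the caption above, while everything else (names, data structure) is assumed for the example.

```python
import random

# Hypothetical labels: four motion-captured actors animated a single avatar.
ACTORS = ["actor1", "actor2", "actor3", "actor4"]
# Example expressions from the emotional context (see Fig. 1 caption).
EXPRESSIONS = ["angry", "happy"]

def make_trial():
    """Build one delayed matching-to-sample trial.

    The sample and the matching stimuli show *different* expressions,
    so only the actor's idiosyncratic motion style links them.
    """
    sample_actor = random.choice(ACTORS)
    distractor = random.choice([a for a in ACTORS if a != sample_actor])
    sample_expr, match_expr = random.sample(EXPRESSIONS, 2)
    # Two matching stimuli: same expression, different actors.
    matching = [(sample_actor, match_expr), (distractor, match_expr)]
    random.shuffle(matching)
    correct = matching.index((sample_actor, match_expr))
    return {"sample": (sample_actor, sample_expr),
            "matching": matching,
            "correct": correct}
```

The key design choice this sketch captures is that form cues are held constant (one avatar) and expression cues are made uninformative (matching stimuli always differ in expression from the sample), leaving motion style as the only usable identity cue.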



Observers were able to discriminate identities based on emotional facial movements occurring in a social interaction (Fig. 2, middle), but not based on basic emotional facial expressions (Fig. 2, left). Sensitivity was highest for non-emotional, speech-related movements occurring in a social interaction (Fig. 2, right).



Fig. 2: Behavioral results. Mean sensitivity (d’) across observers (n = 14) as a function of context. A sensitivity of 0 indicates chance level. Error bars indicate 95% confidence interval (CI).
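The results are reported as the signal-detection sensitivity index d', with 0 indicating chance. One common convention for a two-alternative matching task like this one (an assumption here; the text does not state which formula was used) is d' = √2 · z(proportion correct), where z is the inverse of the standard normal CDF:

```python
import math
from statistics import NormalDist

def dprime_2afc(prop_correct):
    """Sensitivity d' from proportion correct in a two-alternative task.

    Uses d' = sqrt(2) * z(prop_correct), the standard 2AFC convention;
    chance performance (0.5 correct) maps to d' = 0.
    """
    return math.sqrt(2) * NormalDist().inv_cdf(prop_correct)
```

For example, 75% correct responses correspond to a d' of roughly 0.95, while 50% correct (guessing) gives exactly 0, matching the chance level marked in Fig. 2.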

Our findings reveal that human observers can recognize unfamiliar persons from conversational and speech-related movements but not from the way they perform basic emotional facial expressions. We hypothesize that these differences are due to how these movements are executed: basic emotions are performed quite stereotypically, whereas conversational and speech-related movements are performed more idiosyncratically.



[1] Hill H and Johnston A (2001). Categorizing sex and identity from the biological motion of faces. Current Biology 11 880-885.

[2] Knappmeyer B, Thornton IM and Bülthoff HH (2003). The use of facial motion and facial form during the processing of identity. Vision Research 43 1921-1936.

[3] O'Toole AJ, Roark DA and Abdi H (2002). Recognizing moving faces: A psychological and neural synthesis. Trends in Cognitive Sciences 6 261-266.

[4] Curio C, Breidt M, Kleiner M, Vuong QC, Giese MA and Bülthoff HH (2006). Semantic 3D motion retargeting for facial animation. 3rd Symposium on Applied Perception in Graphics and Visualization (APGV '06), ACM Press, New York, NY, USA, 77-84.

[5] Dobs K, Bülthoff I, Breidt M, Vuong QC, Curio C and Schultz J (2014). Quantifying human sensitivity to spatio-temporal information in dynamic faces. Vision Research 100 78-87.

Current Position

Since 2015: Postdoctoral Researcher at the Brain and Cognition Research Center (CerCo), CNRS, Toulouse, France. Advisor: Leila Reddy



2010 - 2014: Ph.D. Candidate at Max Planck Institute for Biological Cybernetics, Tübingen, Germany (Dept. Human Perception, Cognition and Action). Advisors: Isabelle Bülthoff, Johannes Schultz

2002 - 2008: Diploma in Psychology, Philipps-University Marburg, Germany. Advisors: Frank Rösler, Kerstin Jost

2002 - 2007: Diploma in Computer Science, Philipps-University Marburg, Germany. Advisors: Manfred Sommer, David Kämpf


Research and Teaching Experience

2014 - 2015: Postdoctoral researcher / guest scientist at the Max Planck Institute for Biological Cybernetics. Advisor: Isabelle Bülthoff

2013: JSPS fellow at Gardner Research Team, RIKEN BSI, Japan, conducting an fMRI study on attentional modulation of facial motion and form processing. Advisor: Justin Gardner

2011: Supervised Kathryn Bonnen (Michigan State University) during her internship project "Physical and perceptual analysis of the 3D face database"

2004 – 2008: Student Research Assistant at the Cognitive Psychophysiology Lab, Philipps-University of Marburg, Germany. Advisor: Frank Rösler

2006: Visiting Research Assistant at the Laboratory of Systems Neurodynamics, University of Virginia, USA. Advisor: William B Levy


Fellowships, Grants and Awards

2015 - 2017: Postdoctoral Fellowship of the German Research Foundation (DFG)

2015: Best Dissertation Award 2015 from the Max Planck Institute for Biological Cybernetics and Förderverein für neurowissenschaftliche Forschung e.V.

2015: Travel award from the German Academic Exchange Service

2014: Invited speaker at the symposium "The perception of faces" in the English Lake District, UK, funded by the Rank Prize Funds

2013: JSPS (Japan Society for the Promotion of Science) Research Fellowship

2012: VSS Student Travel Award winner


Work Experience

2009 - 2010: IT Consultant / Software Engineer at PRODYNA AG, Frankfurt, Germany

2008 - 2009: Freelance Software Engineer in London, UK


Posters (6):

Dobs K, Schultz J, Bülthoff I and Gardner JL (November 2013): Attending to expression or identity of dynamic faces engages different cortical areas, 43rd Annual Meeting of the Society for Neuroscience (Neuroscience 2013), San Diego, CA, USA.
Dobs K, Bülthoff I, Breidt M, Vuong QC, Curio C and Schultz JW (August 2013): Quantifying Human Sensitivity to Spatio-Temporal Information in Dynamic Faces, 36th European Conference on Visual Perception (ECVP 2013), Bremen, Germany, Perception, 42(ECVP Abstract Supplement) 197.
Dobs K, Bülthoff I, Curio C and Schultz J (August 2012): Investigating factors influencing the perception of identity from facial motion, 12th Annual Meeting of the Vision Sciences Society (VSS 2012), Naples, FL, USA, Journal of Vision, 12(9) 35.
Dobs K, Kleiner M, Bülthoff I, Schultz J and Curio C (September 2011): Investigating idiosyncratic facial dynamics with motion retargeting, 34th European Conference on Visual Perception (ECVP 2011), Toulouse, France, Perception, 40(ECVP Abstract Supplement) 115.

Last updated: Tuesday, 18 November 2014