Human-Computer Interaction and Cognitive Systems


SimTech Building
Pfaffenwaldring 5a
70569 Stuttgart


Eugenia Komnik
Tel. +49.711.685-60049
Fax +49.711.685-51082
Room 01.021


Murielle Naud-Barthelmeß
Tel. +49.711.685-60096
Room 01.031


Further Information

  • MobileHCI’18 (best paper award): Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors
  • MobileHCI’18 (best paper honourable mention award): The Past, Present, and Future of Gaze-enabled Handheld Mobile Devices: Survey and Lessons Learned
  • ETRA’18: Fixation Detection for Head-Mounted Eye Tracking Based on Visual Similarity of Gaze Targets
  • ETRA’18 (best paper award): Error-Aware Gaze-Based Interfaces for Robust Mobile Gaze Interaction
  • ETRA’18: Revisiting Data Normalization for Appearance-Based Gaze Estimation
  • ETRA’18: Robust Eye Contact Detection in Natural Multi-Person Interactions Using Gaze and Speaking Behaviour
  • ETRA’18: A novel approach to single camera, glint-free 3D eye model fitting including corneal refraction
  • ETRA’18 (best presentation award): Learning to Find Eye Region Landmarks for Remote Gaze Estimation in Unconstrained Settings
  • ETRA’18: Hidden Pursuits: Evaluating Gaze-selection via Pursuits when the Stimulus Trajectory is Partially Hidden
  • CHI’18: Training Person-Specific Gaze Estimators from Interactions with Multiple Devices
  • CHI’18 (best paper honourable mention award): Which one is me? Identifying Oneself on Public Displays
  • CHI’18: Understanding Face and Eye Visibility in Front-Facing Cameras of Smartphones used in the Wild
  • Eurographics’18 (best paper honourable mention award): GazeDirector: Fully Articulated Eye Gaze Redirection in Video
  • IUI’18: Detecting Low Rapport During Natural Interactions in Small Groups from Non-Verbal Behavior
  • IEEE TPAMI’18: MPIIGaze: Real-World Dataset and Deep Appearance-Based Gaze Estimation
  • MUM’17 (best paper honourable mention award): They are all after you: Investigating the Viability of a Threat Model that involves Multiple Shoulder Surfers
  • PACM IMWUT’17 (distinguished paper award): InvisibleEye: Mobile Eye Tracking Using Multiple Low-Resolution Cameras and Learning-Based Gaze Estimation
  • PACM IMWUT’17: EyePACT: Eye-Based Parallax Correction on Touch-Enabled Interactive Displays
  • UIST’17 (best paper honourable mention award): Everyday Eye Contact Detection Using Unsupervised Gaze Target Discovery
  • UIST’17: EyeScout: Active Eye Tracking for Position and Movement Independent Gaze Interaction with Large Public Displays
  • CVPRW’17: It’s Written All Over Your Face: Full-Face Appearance-Based Gaze Estimation
  • CVPR’17 (spotlight presentation): Gaze Embeddings for Zero-Shot Image Classification
  • ECCV’16: A 3D Morphable Eye Region Model for Gaze Estimation
  • UIST’16 (best paper honourable mention award): AggreGaze: Collective Estimation of Audience Attention on Public Displays
  • UbiComp’16: TextPursuits: Using Text for Pursuits-Based Interaction and Calibration on Public Displays
  • ETRA’16 (emerging investigator award): Learning an appearance-based gaze estimator from one million synthesised images
  • CHI’16 (best paper honourable mention award): Spatio-Temporal Modeling and Prediction of Visual Attention in Graphical User Interfaces
  • UIST’15 (best paper award): Orbits: Enabling Gaze Interaction in Smart Watches using Moving Targets
  • UbiComp’15: Discovery of Everyday Human Activities From Long-Term Visual Behaviour Using Topic Models
  • ICCV’15: Rendering of Eyes for Eye-Shape Registration and Gaze Estimation
  • CVPR’15: Appearance-Based Gaze Estimation in the Wild

Research Topics

  • Intelligent user interfaces
  • Mobile human-computer interaction
  • Computational modelling of users and user behaviour
  • Ubiquitous and wearable interactive systems
  • Usable security and privacy
  • Eye tracking

In teaching, the department contributes to the Bachelor's and Master's degree programmes in the area of human-computer interaction.

Current course offerings in C@MPUS

All members of this department.

Projects

  • ERC Starting Grant "ANTICIPATE: Anticipatory Human-Computer Interaction" (2019-2023)
  • Intel Visual Computing Institute "Perceptual Rendering for Immersive Displays" (2016-2017)
  • Japan Science and Technology Agency (JST) CREST "Collective Visual Sensing" (2014-2020)