Human-Computer Interaction and Cognitive Systems

Address

SimTech Building
Pfaffenwaldring 5a
70569 Stuttgart

Secretary

Eugenia Komnik
Tel. +49.711.685-60049
Room 01.021

Financial management

Murielle Naud-Barthelmeß
Tel. +49.711.685-60069
Room 01.031


More information

  • MobileHCI’18 (best paper award): Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors
  • MobileHCI’18 (best paper honourable mention award): The Past, Present, and Future of Gaze-enabled Handheld Mobile Devices: Survey and Lessons Learned
  • ETRA’18: Fixation Detection for Head-Mounted Eye Tracking Based on Visual Similarity of Gaze Targets
  • ETRA’18 (best paper award): Error-Aware Gaze-Based Interfaces for Robust Mobile Gaze Interaction
  • ETRA’18: Revisiting Data Normalization for Appearance-Based Gaze Estimation
  • ETRA’18: Robust Eye Contact Detection in Natural Multi-Person Interactions Using Gaze and Speaking Behaviour
  • ETRA’18: A novel approach to single camera, glint-free 3D eye model fitting including corneal refraction
  • ETRA’18 (best presentation award): Learning to Find Eye Region Landmarks for Remote Gaze Estimation in Unconstrained Settings
  • ETRA’18: Hidden Pursuits: Evaluating Gaze-selection via Pursuits when the Stimulus Trajectory is Partially Hidden
  • CHI’18: Training Person-Specific Gaze Estimators from Interactions with Multiple Devices
  • CHI’18 (best paper honourable mention award): Which one is me? Identifying Oneself on Public Displays
  • CHI’18: Understanding Face and Eye Visibility in Front-Facing Cameras of Smartphones used in the Wild
  • Eurographics’18 (best paper honourable mention award): GazeDirector: Fully Articulated Eye Gaze Redirection in Video
  • IUI’18: Detecting Low Rapport During Natural Interactions in Small Groups from Non-Verbal Behavior
  • IEEE TPAMI’18: MPIIGaze: Real-World Dataset and Deep Appearance-Based Gaze Estimation
  • MUM’17 (best paper honourable mention award): They are all after you: Investigating the Viability of a Threat Model that involves Multiple Shoulder Surfers
  • PACM IMWUT’17 (distinguished paper award): InvisibleEye: Mobile Eye Tracking Using Multiple Low-Resolution Cameras and Learning-Based Gaze Estimation
  • PACM IMWUT’17: EyePACT: Eye-Based Parallax Correction on Touch-Enabled Interactive Displays
  • UIST’17 (best paper honourable mention award): Everyday Eye Contact Detection Using Unsupervised Gaze Target Discovery
  • UIST’17: EyeScout: Active Eye Tracking for Position and Movement Independent Gaze Interaction with Large Public Displays
  • CVPRW’17: It’s Written All Over Your Face: Full-Face Appearance-Based Gaze Estimation
  • CVPR’17 (spotlight presentation): Gaze Embeddings for Zero-Shot Image Classification
  • ECCV’16: A 3D Morphable Eye Region Model for Gaze Estimation
  • UIST’16 (best paper honourable mention award): AggreGaze: Collective Estimation of Audience Attention on Public Displays
  • UbiComp’16: TextPursuits: Using Text for Pursuits-Based Interaction and Calibration on Public Displays
  • ETRA’16 (emerging investigator award): Learning an appearance-based gaze estimator from one million synthesised images
  • CHI’16 (best paper honourable mention award): Spatio-Temporal Modeling and Prediction of Visual Attention in Graphical User Interfaces
  • UIST’15 (best paper award): Orbits: Enabling Gaze Interaction in Smart Watches using Moving Targets
  • UbiComp’15: Discovery of Everyday Human Activities From Long-Term Visual Behaviour Using Topic Models
  • ICCV’15: Rendering of Eyes for Eye-Shape Registration and Gaze Estimation
  • CVPR’15: Appearance-Based Gaze Estimation in the Wild
Research areas
  • Intelligent user interfaces
  • Mobile human-computer interaction
  • Computational user (behaviour) modelling
  • Ubiquitous and wearable interactive systems
  • Usable security and privacy
  • Eye tracking
Postdocs/PhD students
We are always looking for outstanding postdoctoral researchers and prospective PhD students who are interested in our research areas. You will find information about current projects on our web pages. Perhaps you even have your own exciting research project in mind that you could pursue here? There is always room for new ideas and projects, and we are happy to discuss them with you. If you are highly motivated, capable of addressing and solving scientifically difficult problems, and interested in doing research in a young, internationally oriented research team, please send your application to hcics-application@vis.uni-stuttgart.de. Please include the following information in your application (preferably in a single PDF document):
  • CV
  • Research statement
  • Full publication list
  • List of referees
  • Postdoc applicants: your three most interesting research papers (if permitted by the publisher; links otherwise)
  • PhD applicants: your most interesting research paper (if permitted/applicable)
  • PhD applicants only: transcripts of your Bachelor's/Master's program

Bachelor and Master theses
If you are interested in working on a research project with us that leads to a Bachelor's or Master's thesis, we would like to hear from you. We usually have a number of open Bachelor's and Master's thesis projects available. Our websites should give you an impression of possible thesis topics, and we can also propose a project based on your preferences. If you already have an idea for a project, we are happy to discuss that as well. In either case, please send an email to hcics-application@vis.uni-stuttgart.de. To help us get to know you, it would be helpful to also include the following documents. If you are interested in working with a specific researcher in our group, please state so in your email.
  • CV
  • Transcripts of your Bachelor's/Master's program (recommended but optional)
  • High school documents (Abitur; recommended but optional)

Hiwi/Student assistant
We have a limited number of student assistant (Hiwi) positions in our lab. These jobs usually involve work as part of a research project. If you are interested in joining our group, please let us know by writing an email to hcics-hiwi-application@vis.uni-stuttgart.de.

We contribute teaching on human-computer interaction to the Bachelor's and Master's programs in computer science.

Currently offered classes in C@MPUS

Projects
  • ERC Starting Grant "ANTICIPATE: Anticipatory Human-Computer Interaction" (2019-2023)
  • Intel Visual Computing Institute "Perceptual Rendering for Immersive Displays" (2016-2017)
  • Japan Science and Technology Agency (JST) CREST "Collective Visual Sensing" (2014-2020)