Human-Computer Interaction and Cognitive Systems

Head: Prof. Dr. Andreas Bulling
SimTech-Gebäude, Pfaffenwaldring 5a, 70569 Stuttgart

  • ETRA’20 (best paper award): Combining Gaze Estimation and Optical Flow for Pursuits Interaction
  • ETRA’19 (best paper award): Privacy-Aware Eye Tracking Using Differential Privacy
  • ETRA’19 (best video award): PrivacEye: Privacy-Preserving Head-Mounted Eye Tracking Using Egocentric Scene Image and Eye Movement Features
  • ComCo’19 (best poster award): Predicting Gaze Patterns: Text Saliency for Integration into Machine Learning Tasks
  • MobileHCI’18 (best paper award): Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors
  • MobileHCI’18 (best paper honourable mention award): The Past, Present, and Future of Gaze-enabled Handheld Mobile Devices: Survey and Lessons Learned
  • ETRA’18: Fixation Detection for Head-Mounted Eye Tracking Based on Visual Similarity of Gaze Targets
  • ETRA’18 (best paper award): Error-Aware Gaze-Based Interfaces for Robust Mobile Gaze Interaction
  • ETRA’18: Revisiting Data Normalization for Appearance-Based Gaze Estimation
  • ETRA’18: Robust Eye Contact Detection in Natural Multi-Person Interactions Using Gaze and Speaking Behaviour
  • ETRA’18: A novel approach to single camera, glint-free 3D eye model fitting including corneal refraction
  • ETRA’18 (best presentation award): Learning to Find Eye Region Landmarks for Remote Gaze Estimation in Unconstrained Settings
  • ETRA’18: Hidden Pursuits: Evaluating Gaze-selection via Pursuits when the Stimulus Trajectory is Partially Hidden
  • CHI’18: Training Person-Specific Gaze Estimators from Interactions with Multiple Devices
  • CHI’18 (best paper honourable mention award): Which one is me? Identifying Oneself on Public Displays
  • CHI’18: Understanding Face and Eye Visibility in Front-Facing Cameras of Smartphones used in the Wild
  • Eurographics’18 (best paper honourable mention award): GazeDirector: Fully Articulated Eye Gaze Redirection in Video
  • IUI’18: Detecting Low Rapport During Natural Interactions in Small Groups from Non-Verbal Behavior
  • IEEE TPAMI’18: MPIIGaze: Real-World Dataset and Deep Appearance-Based Gaze Estimation
  • MUM’17 (best paper honourable mention award): They are all after you: Investigating the Viability of a Threat Model that involves Multiple Shoulder Surfers
  • PACM IMWUT’17: InvisibleEye: Mobile Eye Tracking Using Multiple Low-Resolution Cameras and Learning-Based Gaze Estimation
  • PACM IMWUT’17: EyePACT: Eye-Based Parallax Correction on Touch-Enabled Interactive Displays
  • UIST’17 (best paper honourable mention award): Everyday Eye Contact Detection Using Unsupervised Gaze Target Discovery
  • UIST’17: EyeScout: Active Eye Tracking for Position and Movement Independent Gaze Interaction with Large Public Displays
  • CVPRW’17: It’s Written All Over Your Face: Full-Face Appearance-Based Gaze Estimation
  • CVPR’17 (spotlight presentation): Gaze Embeddings for Zero-Shot Image Classification
  • ECCV’16: A 3D Morphable Eye Region Model for Gaze Estimation
  • UIST’16 (best paper honourable mention award): AggreGaze: Collective Estimation of Audience Attention on Public Displays
  • UbiComp’16: TextPursuits: Using Text for Pursuits-Based Interaction and Calibration on Public Displays
  • ETRA’16 (emerging investigator award): Learning an appearance-based gaze estimator from one million synthesised images
  • CHI’16 (best paper honourable mention award): Spatio-Temporal Modeling and Prediction of Visual Attention in Graphical User Interfaces
  • UIST’15 (best paper award): Orbits: Enabling Gaze Interaction in Smart Watches using Moving Targets
  • UbiComp’15: Discovery of Everyday Human Activities From Long-Term Visual Behaviour Using Topic Models
  • ICCV’15: Rendering of Eyes for Eye-Shape Registration and Gaze Estimation
  • CVPR’15: Appearance-Based Gaze Estimation in the Wild

  • Intelligent user interfaces
  • Mobile human-computer interaction
  • Computational user (behavior) modelling
  • Ubiquitous and wearable interactive systems
  • Usable security and privacy
  • Eye tracking

In teaching, the department contributes to the Bachelor’s and Master’s degree programs in the area of human-computer interaction.

Current course offerings in C@MPUS

Postdocs/PhD students
We are always looking for outstanding postdoctoral researchers and prospective PhD students who are interested in our research areas. You will find information about current projects on our webpages. Perhaps you even have your own exciting research project in mind that you could pursue here? There is always room for new ideas and projects, and we are happy to discuss them with you. If you are highly motivated, capable of addressing and solving scientifically difficult problems, and interested in doing research in a young, internationally oriented research team, please send your application to hcics-application@vis.uni-stuttgart.de. Please include the following information in your application (preferably in a single PDF document):

  • CV
  • Research statement
  • Full publication list
  • List of referees
  • Postdoc applicants: your three most interesting research papers (if permitted by the publisher; links otherwise)
  • PhD applicants: your most interesting research paper (if permitted/applicable)
  • PhD applicants only: transcripts of your Master’s/Bachelor’s program


Bachelor and Master theses
If you are interested in working with us on a research project that results in a Bachelor’s or Master’s thesis, we would like to hear from you. We usually have several open Bachelor’s and Master’s thesis projects available. Our websites should give you an impression of possible thesis topics, and we can also propose a project based on your preferences. If you already have an idea for a project, we are happy to discuss that as well. In either case, please send an email to hcics-application@vis.uni-stuttgart.de. To help us get to know you, it would be helpful to include the following documents. If you are interested in working with a specific researcher in our group, please state that in your email.

  • CV
  • Transcripts of your Master’s/Bachelor’s program (suggested but optional)
  • High school documents (Abitur; suggested but optional)

Furthermore, every semester we organize an open theses event. At this event, you will learn about our overall research goals and areas, the ongoing individual PhD research projects, and, most importantly, the open thesis topics currently available in our group.


Hiwi/Student assistant
There are a limited number of student assistant positions in our lab. These positions usually involve work as part of a research project. If you are interested in joining our group, please let us know by writing an email to hcics-hiwi-application@vis.uni-stuttgart.de.

  • ERC Starting Grant "ANTICIPATE: Anticipatory Human-Computer Interaction" (2019-2023)
  • CRC / Transregio 161 "Quantitative Methods for Visual Computing", subproject A08 (2019-2023)
  • Cluster of Excellence 2075 "Data-Integrated Simulation Science (SimTech)", project network 7 "Adaptive Simulation and Interaction" (2019-2023)
  • Intel Visual Computing Institute "Perceptual Rendering for Immersive Displays" (2016-2017)
  • Japan Science and Technology Agency (JST) CREST "Collective Visual Sensing" (2014-2020)

Team

Contact


Andreas Bulling

Prof. Dr.

Professor of Human-Computer Interaction and Cognitive Systems


Daniela Milanese


Administrative Assistant
