Gaze Estimation in the Dark with Generative Adversarial Networks

Jung Hwa Kim, Jin Woo Jeong

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

6 Scopus citations

Abstract

In this paper, we propose to utilize generative adversarial networks (GANs) to achieve successful gaze estimation in interactive multimedia environments with low-light conditions, such as a digital museum or exhibition hall. The proposed approach uses a GAN to enhance user images captured under low-light conditions, thereby recovering information that gaze estimation depends on. The recovered images are then fed into a CNN architecture to estimate the direction of the user's gaze. Preliminary experimental results on a modified MPIIGaze dataset demonstrated an average performance improvement of 6.6 under various low-light conditions, which is a promising step for further research.
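The abstract describes a two-stage pipeline: a GAN first enhances the low-light eye image, and the enhanced image is then passed to a CNN for gaze estimation. The minimal sketch below only illustrates that data flow; the gamma-based enhancer and the intensity-asymmetry gaze readout are hypothetical stand-ins for the paper's trained GAN generator and CNN, and the patch size merely mimics the 36x60 MPIIGaze eye-patch format.

```python
import random

def enhance_low_light(eye_patch):
    """Stand-in for the paper's GAN generator: a simple gamma
    correction lifts a darkened patch back toward a usable
    brightness range (the real enhancer is a learned GAN)."""
    return [[min(255, int(255 * (p / 255) ** (1 / 2.2))) for p in row]
            for row in eye_patch]

def estimate_gaze(eye_patch):
    """Stand-in for the CNN gaze head: returns a (pitch, yaw)
    pair from a toy intensity asymmetry; the real estimator is
    a trained network operating on the enhanced patch."""
    h, w = len(eye_patch), len(eye_patch[0])
    left = sum(p for row in eye_patch for p in row[:w // 2])
    right = sum(p for row in eye_patch for p in row[w // 2:])
    top = sum(p for row in eye_patch[:h // 2] for p in row)
    bottom = sum(p for row in eye_patch[h // 2:] for p in row)
    n = h * w / 2
    return (bottom - top) / n / 255, (right - left) / n / 255

random.seed(0)
# Simulate a 36x60 MPIIGaze-style eye patch captured in low light.
dark = [[random.randint(10, 40) for _ in range(60)] for _ in range(36)]
restored = enhance_low_light(dark)       # stage 1: image enhancement
pitch, yaw = estimate_gaze(restored)     # stage 2: gaze estimation
```

The point of the sketch is the separation of concerns: the enhancement stage is independent of the estimator, so the same downstream gaze network can run on both well-lit and recovered images.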

Original language: English
Title of host publication: Proceedings ETRA 2020 Adjunct - ACM Symposium on Eye Tracking Research and Applications, ETRA 2020
Editors: Stephen N. Spencer
Publisher: Association for Computing Machinery
ISBN (Electronic): 9781450371353
DOIs
State: Published - 2 Jun 2020
Event: 2020 ACM Symposium on Eye Tracking Research and Applications, ETRA 2020 - Stuttgart, Germany
Duration: 2 Jun 2020 - 5 Jun 2020

Publication series

Name: Eye Tracking Research and Applications Symposium (ETRA)

Conference

Conference: 2020 ACM Symposium on Eye Tracking Research and Applications, ETRA 2020
Country/Territory: Germany
City: Stuttgart
Period: 2/06/20 - 5/06/20

Keywords

  • deep learning
  • GAN
  • gaze estimation
  • low-light environment
