Robust and Accurate Eye-gaze Tracking and Its Applications
Material Type  
 Dissertation
Control Number  
0015490812
International Standard Book Number  
9781085564632
Dewey Decimal Classification Number  
621
Main Entry-Personal Name  
Wang, Kang.
Publication, Distribution, etc. (Imprint)  
[S.l.] : Rensselaer Polytechnic Institute, 2019
Publication, Distribution, etc. (Imprint)  
Ann Arbor : ProQuest Dissertations & Theses, 2019
Physical Description  
201 p.
General Note  
Source: Dissertations Abstracts International, Volume: 81-02, Section: B.
General Note  
Advisor: Ji, Qiang.
Dissertation Note  
Thesis (Ph.D.)--Rensselaer Polytechnic Institute, 2019.
Restrictions on Access Note  
This item must not be sold to any third party vendors.
Summary, Etc.  
Eye-gaze plays a crucial role in our everyday life. It is an effective way to perceive the world around us, express our intent and emotions, and communicate with each other. Eye-gaze has been applied to a wide range of fields, from advertising and biometrics to the gaming industry and medical diagnosis. Despite significant progress, existing eye-gaze tracking systems often require complex hardware setups as well as significant user involvement, and are limited to constrained environments. The goal of this thesis is to develop advanced eye-gaze tracking technologies that overcome these barriers, so that eye-gaze tracking can be performed accurately and non-intrusively in a natural environment.

First, existing eye-gaze tracking systems typically require an explicit personal calibration that not only degrades the user experience but also makes it difficult to perform natural eye-gaze tracking. To eliminate this requirement, we introduce a novel approach that combines a top-down saliency map with a bottom-up gaze distribution map. By minimizing the KL-divergence between the two maps, personal calibration can be performed implicitly, without the user's explicit collaboration. Next, we eliminate the use of the saliency map altogether by leveraging several constraints available during natural eye-gaze tracking to estimate the personal eye parameters.

Second, existing eye-gaze tracking systems often require complex hardware setups, including infrared lights, stereo cameras, or dedicated systems. In this thesis, we introduce two systems that need neither complex hardware nor infrared illumination. The first is based on a Kinect sensor: we propose a 3D head-eye model to effectively recover the 3D eye-gaze with the help of the depth information from the Kinect. The second system requires only an ordinary web camera: with the proposed 3D eye-face model, we can estimate the 3D eye-gaze from detected 2D facial landmarks.

Third, existing eye-gaze tracking methods suffer from poor generalization. We propose three methods to address this limitation. The first model encodes eye geometry knowledge with a probabilistic graphical model and captures the relationship between eye-gaze and eye shape through a deep neural network. Because eye geometry knowledge applies across different subjects, head poses, and environments, the proposed model achieves better generalization performance. For the second model, we introduce a Bayesian framework that consists of a learning-based landmark estimator and a model-based gaze estimator. The Bayesian framework allows predicting landmarks with multiple sets of model parameters and hence can further improve generalization performance. The third model leverages the idea of Bayesian adversarial learning, where the model learned from the source domain can better adapt to new domains such as new subjects, head poses, and environments.

Finally, we propose to incorporate eye movement dynamics to improve existing static eye-gaze tracking. By analyzing the patterns of different types of eye movements, including fixation, saccade, and smooth pursuit, we are able to combine these top-down gaze transition priors with our bottom-up gaze predictions to enable robust and accurate online eye-gaze tracking.
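To make the implicit-calibration idea above concrete, the following is a minimal illustrative sketch (not taken from the thesis): it aligns a bottom-up gaze distribution map with a top-down saliency map by grid-searching a personal gaze offset that minimizes the KL-divergence between the two maps. The Gaussian-blob gaze map, the simple 2D offset parameterization, and the function names (kl_divergence, gaze_map, calibrate) are all assumptions made here for illustration only.

import numpy as np

# Illustrative sketch of KL-divergence-based implicit calibration.
# Assumption: personal calibration is reduced to a 2D gaze offset here;
# the thesis estimates actual personal eye parameters.

def kl_divergence(p, q, eps=1e-12):
    # KL(P || Q) between two normalized 2D probability maps.
    p = p / (p.sum() + eps)
    q = q / (q.sum() + eps)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def gaze_map(gaze_points, offset, shape, sigma=5.0):
    # Build a bottom-up gaze distribution map: shift each raw gaze point
    # by the candidate personal offset and accumulate Gaussian blobs.
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    q = np.zeros(shape)
    for gx, gy in gaze_points:
        cx, cy = gx + offset[0], gy + offset[1]
        q += np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))
    return q

def calibrate(saliency_map, gaze_points, search=range(-20, 21, 2)):
    # Grid-search the offset whose gaze map best matches the saliency map,
    # i.e. minimizes KL(saliency || gaze).
    best_offset, best_kl = (0, 0), np.inf
    for dx in search:
        for dy in search:
            q = gaze_map(gaze_points, (dx, dy), saliency_map.shape)
            kl = kl_divergence(saliency_map, q)
            if kl < best_kl:
                best_offset, best_kl = (dx, dy), kl
    return best_offset, best_kl

In the thesis the calibration targets the personal eye parameters of a 3D gaze model rather than a simple 2D offset, and the saliency map is later replaced by constraints available during natural eye-gaze tracking; the sketch only conveys the KL-minimization idea.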
Subject Added Entry-Topical Term  
Artificial intelligence
Subject Added Entry-Topical Term  
Computer engineering
Added Entry-Corporate Name  
Rensselaer Polytechnic Institute Electrical Engineering
Host Item Entry  
Dissertations Abstracts International. 81-02B.
Host Item Entry  
Dissertations Abstracts International.
Electronic Location and Access  
This material can be viewed after logging in.
Control Number  
joongbu:566912
Holdings
Reg No. | Call No. | Location | Status | Lend Info
TQ0006932 | T | Online full text | Available for viewing/printing | Available for viewing/printing
