TurkerGaze: Crowdsourcing Saliency with Webcam based Eye Tracking

arXiv preprint, April 2015

Pingmei Xu, Krista A. Ehinger, Yinda Zhang,
Adam Finkelstein, Sanjeev R. Kulkarni, Jianxiong Xiao
Abstract

Traditional eye tracking requires specialized hardware, so collecting gaze data from many observers is expensive, tedious, and slow. As a result, existing saliency prediction datasets are orders of magnitude smaller than typical datasets for other visual recognition tasks. The small size of these datasets limits the potential for training data-intensive algorithms and causes overfitting in benchmark evaluation. To address this deficiency, this paper introduces a webcam-based gaze tracking system that supports large-scale, crowdsourced eye tracking deployed on Amazon Mechanical Turk (AMTurk). Through a combination of careful algorithm and gaming protocol design, our system obtains eye tracking data for saliency prediction comparable to data gathered in a traditional lab setting, at lower cost and with less effort on the part of the researchers. Using this tool, we build a saliency dataset for a large number of natural images. We will open-source our tool and provide a web server where researchers can upload their images to obtain eye tracking results from AMTurk.
Citation

Pingmei Xu, Krista A. Ehinger, Yinda Zhang, Adam Finkelstein, Sanjeev R. Kulkarni, and Jianxiong Xiao.
"TurkerGaze: Crowdsourcing Saliency with Webcam based Eye Tracking."
arXiv:1504.06755, April 2015.

BibTeX

@techreport{Xu:2015:TCS,
   author = "Pingmei Xu and Krista A. Ehinger and Yinda Zhang and Adam Finkelstein
      and Sanjeev R. Kulkarni and Jianxiong Xiao",
   title = "{TurkerGaze}: Crowdsourcing Saliency with Webcam based Eye Tracking",
   institution = "arXiv preprint",
   year = "2015",
   month = apr,
   number = "1504.06755"
}