- Aggregate gaze visualization with real-time heatmaps. Andrew T. Duchowski, Margaux M. Price, Miriah Meyer, Pilar Orero. 13-20 [doi]
- A method to construct an importance map of an image using the saliency map model and eye movement analysis. Akira Egawa, Susumu Shirayama. 21-28 [doi]
- Measuring and visualizing attention in space with 3D attention volumes. Thies Pfeiffer. 29-36 [doi]
- Automatic analysis of 3D gaze coordinates on scene objects using data from eye-tracking and motion-capture systems. Kai Essig, Daniel Dornbusch, Daniel Prinzhorn, Helge Ritter, Jonathan Maycock, Thomas Schack. 37-44 [doi]
- Eye tracker data quality: what it is and how to measure it. Kenneth Holmqvist, Marcus Nyström, Fiona Mulvey. 45-52 [doi]
- A probabilistic approach for the estimation of angle kappa in infants. Dmitri Model, Moshe Eizenman. 53-58 [doi]
- Augmenting the robustness of cross-ratio gaze tracking methods to head movement. Flavio Luiz Coutinho, Carlos Hitoshi Morimoto. 59-66 [doi]
- Impact of subtle gaze direction on short-term spatial information recall. Reynold J. Bailey, Ann McNamara, Aaron Costello, Srinivas Sridharan, Cindy Grimm. 67-74 [doi]
- Subtle gaze manipulation for improved mammography training. Srinivas Sridharan, Ann McNamara, Cindy Grimm. 75-82 [doi]
- What do you want to do next: a novel approach for intent prediction in gaze-based interaction. Roman Bednarik, Hana Vrzakova, Michal Hradis. 83-90 [doi]
- Gaze guided object recognition using a head-mounted eye tracker. Takumi Toyama, Thomas Kieninger, Faisal Shafait, Andreas Dengel. 91-98 [doi]
- Audio description as an aural guide of children's visual attention: evidence from an eye-tracking study. Izabela Krejtz, Agnieszka Szarkowska, Krzysztof Krejtz, Agnieszka Walczak, Andrew T. Duchowski. 99-106 [doi]
- Let's look at the cockpit: exploring mobile eye-tracking for observational research on the flight deck. Nadir Weibel, Adam Fouse, Colleen Emmenegger, Sara Kimmich, Edwin Hutchins. 107-114 [doi]
- Multi-mode saliency dynamics model for analyzing gaze and attention. Ryo Yonetani, Hiroaki Kawashima, Takashi Matsuyama. 115-122 [doi]
- A robust realtime reading-skimming classifier. Ralf Biedert, Jörn Hees, Andreas Dengel, Georg Buscher. 123-130 [doi]
- Designing gaze-based user interfaces for steering in virtual environments. Sophie Stellmach, Raimund Dachselt. 131-138 [doi]
- Eye-based head gestures. Diako Mardanbegi, Dan Witzner Hansen, Thomas Pederson. 139-146 [doi]
- Simple gaze gestures and the closure of the eyes as an interaction technique. Henna Heikkilä, Kari-Jouko Räihä. 147-154 [doi]
- Self-localization using fixations as landmarks. Lisa M. Tiberio, Roxanne L. Canosa. 155-160 [doi]
- Measuring cognitive workload across different eye tracking hardware platforms. Michael Bartels, Sandra P. Marshall. 161-164 [doi]
- Parallel scan-path visualization. Michael Raschke, Xuemei Chen, Thomas Ertl. 165-168 [doi]
- Permutation test for groups of scanpaths using normalized Levenshtein distances and application in NMR questions. Hui Tang, Joseph J. Topczewski, Anna M. Topczewski, Norbert J. Pienta. 169-172 [doi]
- Robust real-time pupil tracking in highly off-axis images. Lech Swirski, Andreas Bulling, Neil A. Dodgson. 173-176 [doi]
- Detection of smooth pursuits using eye movement shape features. Mélodie Vidal, Andreas Bulling, Hans Gellersen. 177-180 [doi]
- Parsing visual stimuli into temporal units through eye movements. Carlo Robino, Sofia Crespi, Ottavia Silva, Claudio de'Sperati. 181-184 [doi]
- Methodological triangulation to assess sign placement. Simon J. Büchner, Jan Malte Wiener, Christoph Hölscher. 185-188 [doi]
- Goal-driven and bottom-up gaze in an active real-world search task. Tom Foulsham, Alan Kingstone. 189-192 [doi]
- Using ScanMatch scores to understand differences in eye movements between correct and incorrect solvers on physics problems. Adrian Madsen, Adam M. Larson, Lester C. Loschky, N. Sanjay Rebello. 193-196 [doi]
- Visual attention patterns during program debugging with an IDE. Prateek Hejmady, N. Hari Narayanan. 197-200 [doi]
- Towards robust gaze-based objective quality measures for text. Ralf Biedert, Andreas Dengel, Mostafa Elshamy, Georg Buscher. 201-204 [doi]
- Error characterization and compensation in eye tracking systems. Juan J. Cerrolaza, Arantxa Villanueva, Maria Villanueva, Rafael Cabeza. 205-208 [doi]
- Shifts in reported gaze position due to changes in pupil size: ground truth and compensation. Jan Drewes, Guillaume S. Masson, Anna Montagnini. 209-212 [doi]
- Automatic acquisition of a 3D eye model for a wearable first-person vision device. Akihiro Tsukada, Takeo Kanade. 213-216 [doi]
- Evaluation of pupil center-eye corner vector for gaze estimation using a web cam. Laura Sesma, Arantxa Villanueva, Rafael Cabeza. 217-220 [doi]
- Ego-motion compensation improves fixation detection in wearable eye tracking. Thomas B. Kinsman, Karen Evans, Glenn Sweeney, Tommy Keane, Jeff B. Pelz. 221-224 [doi]
- Gaze input for mobile devices by dwell and gestures. Morten Lund Dybdal, Javier San Agustin, John Paulin Hansen. 225-228 [doi]
- Gaze gestures or dwell-based interaction? Aulikki Hyrskykari, Howell O. Istance, Stephen Vickers. 229-232 [doi]
- The validity of using non-representative users in gaze communication research. Howell O. Istance, Stephen Vickers, Aulikki Hyrskykari. 233-236 [doi]
- Eye typing of Chinese characters. Zhen Liang, Qiang Fu, Zheru Chi. 237-240 [doi]
- The potential of dwell-free eye-typing for fast assistive gaze communication. Per Ola Kristensson, Keith Vertanen. 241-244 [doi]
- Analysing the potential of adapting head-mounted eye tracker calibration to a new user. Benedict Fehringer, Andreas Bulling, Antonio Krüger. 245-248 [doi]
- Long range eye tracking: bringing eye tracking into the living room. Craig Hennessey, Jacob Fiset. 249-252 [doi]
- A general framework for extension of a tracking range of user-calibration-free remote eye-gaze tracking systems. Dmitri Model, Moshe Eizenman. 253-256 [doi]
- Mathematical model for wide range gaze tracking system based on corneal reflections and pupil using stereo cameras. Takashi Nagamatsu, Michiya Yamamoto, Ryuichi Sugano, Junzo Kamahara. 257-260 [doi]
- Towards pervasive eye tracking using low-level image features. Yanxia Zhang, Andreas Bulling, Hans Gellersen. 261-264 [doi]
- A GPU-accelerated software eye tracking system. Jeffrey B. Mulligan. 265-268 [doi]
- Extending the visual field of a head-mounted eye tracker for pervasive eye-based interaction. Jayson Turner, Andreas Bulling, Hans Gellersen. 269-272 [doi]
- Gaze tracking in wide area using multiple camera observations. Akira Utsumi, Kotaro Okamoto, Norihiro Hagita, Kazuhiro Takahashi. 273-276 [doi]
- Eye tracking on unmodified common tablets: challenges and solutions. Corey Holland, Oleg V. Komogortsev. 277-280 [doi]
- Comparison of eye movement filters used in HCI. Oleg Spakov. 281-284 [doi]
- Bayesian online clustering of eye movement data. Enkelejda Tafaj, Gjergji Kasneci, Wolfgang Rosenstiel, Martin Bogdan. 285-288 [doi]
- The precision of eye-trackers: a case for a new measure. Pieter J. Blignaut, Tanya René Beelders. 289-292 [doi]
- TrackStick: a data quality measuring tool for Tobii eye trackers. Pieter J. Blignaut, Tanya René Beelders. 293-296 [doi]
- Entropy-based correction of eye tracking data for static scenes. Samuel John, Erik Weitnauer, Hendrik Koesling. 297-300 [doi]
- A flexible gaze tracking algorithm evaluation workbench. Detlev Droege, Dietrich Paulus. 301-304 [doi]
- An eye tracking dataset for point of gaze detection. Christopher McMurrough, Vangelis Metsis, Jonathan Rich, Fillia Makedon. 305-308 [doi]
- Measuring gaze overlap on videos between multiple observers. Geoffrey Tien, M. Stella Atkins, Bin Zheng. 309-312 [doi]
- Towards location-aware mobile eye tracking. Peter Kiefer, Florian Straub, Martin Raubal. 313-316 [doi]
- Identifying parameter values for an I-VT fixation filter suitable for handling data sampled with various sampling frequencies. Anneli Olsen, Ricardo Matos. 317-320 [doi]
- Comparison of eye movement metrics recorded at different sampling rates. Andrew D. Ouzts, Andrew T. Duchowski. 321-324 [doi]
- On the conspicuity of 3-D fiducial markers in 2-D projected environments. Andrew D. Ouzts, Andrew T. Duchowski, Toni Gomes, Rupert A. Hurley. 325-328 [doi]
- Voice activity detection from gaze in video mediated communication. Michal Hradis, Shahram Eivazi, Roman Bednarik. 329-332 [doi]
- Incorporating visual field characteristics into a saliency map. Hideyuki Kubota, Yusuke Sugano, Takahiro Okabe, Yoichi Sato, Akihiro Sugimoto, Kazuo Hiraki. 333-336 [doi]
- Measuring the performance of gaze and speech for text input. Tanya René Beelders, Pieter J. Blignaut. 337-340 [doi]
- Typing with eye-gaze and tooth-clicks. Xiaoyu Zhao, Elias Daniel Guestrin, Dimitry Sayenko, Tyler Simpson, Michel J. A. Gauthier, Milos R. Popovic. 341-344 [doi]
- The effect of clicking by smiling on the accuracy of head-mounted gaze tracking. Ville Rantanen, Jarmo Verho, Jukka Lekkala, Outi Tuisku, Veikko Surakka, Toni Vanhala. 345-348 [doi]
- Using eye gaze and speech to simulate a pointing device. Tanya René Beelders, Pieter J. Blignaut. 349-352 [doi]
- Dynamic context switching for gaze based interaction. Antonio Diaz Tula, Filipe M. S. de Campos, Carlos H. Morimoto. 353-356 [doi]
- Investigating gaze-supported multimodal pan and zoom. Sophie Stellmach, Raimund Dachselt. 357-360 [doi]
- Universal eye-tracking based text cursor warping. Ralf Biedert, Andreas Dengel, Christoph Käding. 361-364 [doi]
- Gaming with gaze and losing with a smile. Anders Møller Nielsen, Anders Lerchedahl Petersen, John Paulin Hansen. 365-368 [doi]
- Content based recommender system by using eye gaze data. Daniela Giordano, Isaak Kavasidis, Carmelo Pino, Concetto Spampinato. 369-372 [doi]
- Using eye-tracking data for automatic film comic creation. Masahiro Toyoura, Tomoya Sawada, Mamoru Kunihiro, Xiaoyang Mao. 373-376 [doi]
- Gaze behaviour of expert and novice microneurosurgeons differs during observations of tumor removal recordings. Shahram Eivazi, Roman Bednarik, Markku Tukiainen, Mikael von und zu Fraunberg, Ville Leinonen, Juha E. Jääskeläinen. 377-380 [doi]
- An eye-tracking study on the role of scan time in finding source code defects. Bonita Sharif, Michael Falcone, Jonathan I. Maletic. 381-384 [doi]
- Reading and estimating gaze on smart phones. Ralf Biedert, Andreas Dengel, Georg Buscher, Arman Vartan. 385-388 [doi]
- Revisiting Russo and Leclerc. Poja Shams, Erik Wästlund, Lars Witell. 389-392 [doi]
- Learning eye movement patterns for characterization of perceptual expertise. Rui Li, Jeff B. Pelz, Pengcheng Shi, Cecilia Ovesdotter Alm, Anne R. Haake. 393-396 [doi]
- Visual attention to television programs with a second-screen application. Michael E. Holmes, Sheree Josephson, Ryan E. Carney. 397-400 [doi]
- Prisoners and chickens: gaze locations indicate bounded rationality. Peter G. Mahon, Roxanne L. Canosa. 401-404 [doi]
- Saccadic delays on targets while watching videos. M. Stella Atkins, Xianta Jiang, Geoffrey Tien, Bin Zheng. 405-408 [doi]
- How to measure monitoring performance of pilots and air traffic controllers. Catrin Hasse, Dietrich Grasshoff, Carmen Bruder. 409-412 [doi]
- Exploring the effects of visual cognitive load and illumination on pupil diameter in driving simulators. Oskar Palinko, Andrew L. Kun. 413-416 [doi]