- On the ease and efficiency of human-computer interfaces. Shumin Zhai. 9-10 [doi]
- Longitudinal evaluation of discrete consecutive gaze gestures for text entry. Jacob O. Wobbrock, James Rubinstein, Michael W. Sawyer, Andrew T. Duchowski. 11-18 [doi]
- Now Dasher! Dash away!: longitudinal study of fast text entry by eye gaze. Outi Tuisku, Päivi Majaranta, Poika Isokoski, Kari-Jouko Räihä. 19-26 [doi]
- Eye-S: a full-screen input modality for pure eye-based communication. Marco Porta, Matteo Turina. 27-34 [doi]
- Measurement of eye velocity using active illumination. Jeffrey B. Mulligan. 35-38 [doi]
- A method to study visual attention aspects of collaboration: eye-tracking pair programmers simultaneously. Sami Pietinen, Roman Bednarik, Tatiana Glotova, Vesa Tenhunen, Markku Tukiainen. 39-42 [doi]
- Testing for statistically significant differences between groups of scan patterns. Matthew K. Feusner, Brian Lukoff. 43-46 [doi]
- Improving hands-free menu selection using eyegaze glances and fixations. Geoffrey Tien, M. Stella Atkins. 47-50 [doi]
- Gazing with pEYEs: towards a universal input for various applications. Anke Huckauf, Mario H. Urbina. 51-54 [doi]
- Eye typing using word and letter prediction and a fixation algorithm. I. Scott MacKenzie, Xuang Zhang. 55-58 [doi]
- 3D point-of-gaze estimation on a volumetric display. Craig Hennessey, Peter D. Lawrence. 59 [doi]
- Limbus/pupil switching for wearable eye tracking under variable lighting conditions. Wayne J. Ryan, Andrew T. Duchowski, Stanley T. Birchfield. 61-64 [doi]
- Improving the accuracy of gaze input for interaction. Manu Kumar, Jeff Klingner, Rohan Puranik, Terry Winograd, Andreas Paepcke. 65-68 [doi]
- Measuring the task-evoked pupillary response with a remote eye tracker. Jeff Klingner, Rakshit Kumar, Pat Hanrahan. 69-72 [doi]
- Estimation of certainty for multiple choice tasks using features of eye-movements. Minoru Nakayama, Yosiyuki Takahasi. 73-76 [doi]
- Assessing usability with eye-movement frequency analysis. Minoru Nakayama, Makoto Katsukura. 77 [doi]
- Integrated speech and gaze control for realistic desktop environments. Emiliano Castellina, Fulvio Corno, Paolo Pellegrino. 79-82 [doi]
- Comparing behavioural and self-report measures of engagement with an embodied conversational agent: a first report on eye tracking in Second Life. Sara Dalzel-Job, Craig Nicol, Jon Oberlander. 83-85 [doi]
- A head-mounted sensor-based eye tracking device: eye touch system. Cihan Topal, Ömer Nezih Gerek, Atakan Dogan. 87-90 [doi]
- Evaluating requirements for gaze-based interaction in a see-through head mounted display. Sven-Thomas Graupner, Michael Heubner, Sebastian Pannasch, Boris Velichkovsky. 91-94 [doi]
- One-point calibration gaze tracking based on eyeball kinematics using stereo cameras. Takashi Nagamatsu, Junzo Kamahara, Takumi Iko, Naoki Tanaka. 95-98 [doi]
- Temporal eye-tracking data: evolution of debugging strategies with multiple representations. Roman Bednarik, Markku Tukiainen. 99-102 [doi]
- EyeSecret: an inexpensive but high performance auto-calibration eye tracker. Zhang Yun, Zhao Xin-Bo, Zhao Rong-Chun, Zhou Yuan, Zou Xiao-Chun. 103-106 [doi]
- KiEV: a tool for visualization of reading and writing processes in translation of text. Oleg Spakov, Kari-Jouko Räihä. 107-110 [doi]
- The incomplete fixation measure. Frederick Shic, Brian Scassellati, Katarzyna Chawarska. 111-114 [doi]
- Voluntary pupil size change as control in eyes only interaction. Inger Ekman, Antti Poikola, Meeri Mäkäräinen, Tapio Takala, Perttu Hämäläinen. 115-118 [doi]
- Contact-analog information representation in an automotive head-up display. Tony Poitschke, Markus Ablaßmeier, Gerhard Rigoll, Stanislavs Bardins, Stefan Kohlbecher, Erich Schneider. 119-122 [doi]
- Effects of time pressure and text complexity on translators' fixations. Selina Sharmin, Oleg Spakov, Kari-Jouko Räihä, Arnt Lykke Jakobsen. 123-126 [doi]
- Real-time simulation of visual defects with gaze-contingent display. Margarita Vinnikov, Robert S. Allison, Dominik Swierad. 127-130 [doi]
- Comparison of eye movements in searching for easy-to-find and hard-to-find information in a hierarchically organized information structure. Yoshiko Habuchi, Muneo Kitajima, Haruhiko Takeuchi. 131-134 [doi]
- Calibration-free eye tracking by reconstruction of the pupil ellipse in 3D space. Stefan Kohlbecher, Stanislavs Bardins, Klaus Bartl, Erich Schneider, Tony Poitschke, Markus Ablaßmeier. 135-138 [doi]
- Spatialchromatic foveation for gaze contingent displays. Sheng Liu, Hong Hua. 139-142 [doi]
- Using semantic content as cues for better scanpath prediction. Moran Cerf, E. Paxon Frady, Christof Koch. 143-146 [doi]
- Eye2i: coordinated multiple views for gaze data. Harri Rantala. 147 [doi]
- GInX: gaze based interface extensions. Thiago S. Barcelos, Carlos Hitoshi Morimoto. 149-152 [doi]
- An online noise filter for eye-tracker data recorded in a virtual environment. Sylvain Chartier, Patrice Renaud. 153-156 [doi]
- Cross-race recognition deficit and visual attention: do they all look (at faces) alike? Sheree Josephson, Michael E. Holmes. 157-164 [doi]
- The visual span of chess players. Pieter J. Blignaut, Tanya René Beelders, C.-Y. So. 165-171 [doi]
- Deixis and gaze in collaborative work at a distance (over a shared map): a computational model to detect misunderstandings. Mauro Cherubini, Marc-Antoine Nüssli, Pierre Dillenbourg. 173-180 [doi]
- 3D point-of-regard, position and head orientation from a portable monocular video-based eye tracker. Susan M. Munn, Jeff B. Pelz. 181-188 [doi]
- A robust 3D eye gaze tracking system using noise reduction. Jixu Chen, Yan Tong, Wayne D. Gray, Qiang Ji. 189-196 [doi]
- A new wireless search-coil system. Dale Roberts, Mark Shelhamer, Aaron Wong. 197-204 [doi]
- Noise tolerant selection by gaze-controlled pan and zoom in 3D. Dan Witzner Hansen, Henrik H. T. Skovsgaard, John Paulin Hansen, Emilie Møllenbach. 205-212 [doi]
- Looking my way through the menu: the impact of menu design and multimodal input on gaze-based menu selection. Yvonne Kammerer, Katharina Scheiter, Wolfgang Beinhauer. 213-220 [doi]
- Snap clutch, a moded approach to solving the Midas touch problem. Howell O. Istance, Richard Bates, Aulikki Hyrskykari, Stephen Vickers. 221-228 [doi]
- Eye movement prediction by Kalman filter with integrated linear horizontal oculomotor plant mechanical model. Oleg Komogortsev, Javed I. Khan. 229-236 [doi]
- Analysis of subject-dependent point-of-gaze estimation bias in the cross-ratios method. Elias Daniel Guestrin, Moshe Eizenman, Jeffrey J. Kang, Erez Eizenman. 237-244 [doi]
- Remote gaze estimation with a single camera based on facial-feature tracking without special calibration actions. Hirotake Yamazoe, Akira Utsumi, Tomoko Yonezawa, Shinji Abe. 245-250 [doi]
- A software framework for simulating eye trackers. Martin Böhme, Michael Dorr, Mathis Graw, Thomas Martinetz, Erhardt Barth. 251-258 [doi]
- Taxonomic study of polynomial regressions applied to the calibration of video-oculographic systems. Juan J. Cerrolaza, Arantxa Villanueva, Rafael Cabeza. 259-266 [doi]
- Remote point-of-gaze estimation requiring a single-point calibration for applications with infants. Elias Daniel Guestrin, Moshe Eizenman. 267-274 [doi]