- An eye on input: research challenges in using the eye for computer input control. I. Scott MacKenzie. 11-12 [doi]
- Homography normalization for robust gaze estimation in uncalibrated setups. Dan Witzner Hansen, Javier San Agustin, Arantxa Villanueva. 13-20 [doi]
- Head-mounted eye-tracking of infants' natural interactions: a new method. John M. Franchak, Kari S. Kretch, Kasey C. Soska, Jason S. Babcock, Karen E. Adolph. 21-27 [doi]
- User-calibration-free remote gaze estimation system. Dmitri Model, Moshe Eizenman. 29-36 [doi]
- Eye movement as an interaction mechanism for relevance feedback in a content-based image retrieval system. Yun Zhang, Hong Fu, Zhen Liang, Zheru Chi, David Dagan Feng. 37-40 [doi]
- Content-based image retrieval using a combination of visual features and eye tracking data. Zhen Liang, Hong Fu, Yun Zhang, Zheru Chi, David Dagan Feng. 41-44 [doi]
- Have you seen any of these men?: looking at whether eyewitnesses use scanpaths to recognize suspects in photo lineups. Sheree Josephson, Michael E. Holmes. 49-52 [doi]
- Estimation of viewer's response for contextual understanding of tasks using features of eye-movements. Minoru Nakayama, Yuko Hayashi. 53-56 [doi]
- Biometric identification via an oculomotor plant mathematical model. Oleg V. Komogortsev, Sampath Jayarathna, Cecilia R. Aragon, Mechehoul Mahmoud. 57-60 [doi]
- Saliency-based decision support. Roxanne L. Canosa. 61-63 [doi]
- Qualitative and quantitative scoring and evaluation of the eye movement classification algorithms. Oleg V. Komogortsev, Sampath Jayarathna, Do Hyong Koh, Sandeep A. Munikrishne Gowda. 65-68 [doi]
- An interactive interface for remote administration of clinical tests based on eye tracking. Alberto Faro, Daniela Giordano, Concetto Spampinato, D. De Tommaso, S. Ullo. 69-72 [doi]
- Visual attention for implicit relevance feedback in a content-based image retrieval. Alberto Faro, Daniela Giordano, C. Pino, Concetto Spampinato. 73-76 [doi]
- Evaluation of a low-cost open-source gaze tracker. Javier San Agustin, Henrik H. T. Skovsgaard, Emilie Møllenbach, Maria Barret, Martin Tall, Dan Witzner Hansen, John Paulin Hansen. 77-80 [doi]
- An open source eye-gaze interface: expanding the adoption of eye-gaze in everyday applications. Craig Hennessey, Andrew T. Duchowski. 81-84 [doi]
- Using eye tracking to investigate important cues for representative creature motion. Meredith McLendon, Ann McNamara, Tim McLaughlin, Ravindra Dwivedi. 85-88 [doi]
- Eye and pointer coordination in search and selection tasks. Hans-Joachim Bieg, Lewis L. Chuang, Roland W. Fleming, Harald Reiterer, Heinrich H. Bülthoff. 89-92 [doi]
- Pies with EYEs: the limits of hierarchical pie menus in gaze control. Mario H. Urbina, Maike Lorenz, Anke Huckauf. 93-96 [doi]
- Measuring vergence over stereoscopic video with a remote eye tracker. Brian Daugherty, Andrew T. Duchowski, Donald H. House, Celambarasan Ramasamy. 97-100 [doi]
- Group-wise similarity and classification of aggregate scanpaths. Thomas Grindinger, Andrew T. Duchowski, Michael W. Sawyer. 101-104 [doi]
- Inferring object relevance from gaze in dynamic scenes. Melih Kandemir, Veli-Matti Saarinen, Samuel Kaski. 105-108 [doi]
- Advanced gaze visualizations for three-dimensional virtual environments. Sophie Stellmach, Lennart Nacke, Raimund Dachselt. 109-112 [doi]
- The use of eye tracking for PC energy management. Vasily G. Moshnyaga. 113-116 [doi]
- Low-latency combined eye and head tracking system for teleoperating a robotic head in real-time. Stefan Kohlbecher, Klaus Bartl, Stanislavs Bardins, Erich Schneider. 117-120 [doi]
- Visual search in the (un)real world: how head-mounted displays affect eye movements, head movements and target detection. Tobit Kollenberg, Alexander Neumann, Dorothe Schneider, Tessa-Karina Tews, Thomas Hermann, Helge Ritter, Angelika Dierker, Hendrik Koesling. 121-124 [doi]
- Visual span and other parameters for the generation of heatmaps. Pieter J. Blignaut. 125-128 [doi]
- Robust optical eye detection during head movement. Jeffrey B. Mulligan, Kevin Gabayan. 129-132 [doi]
- What you see is where you go: testing a gaze-driven power wheelchair for individuals with severe multiple disabilities. Erik Wästlund, Kay Sponseller, Ola Pettersson. 133-136 [doi]
- A depth compensation method for cross-ratio based eye tracking. Flavio Luiz Coutinho, Carlos Hitoshi Morimoto. 137-140 [doi]
- Estimating cognitive load using remote eye tracking in a driving simulator. Oskar Palinko, Andrew L. Kun, Alexander Shyrokov, Peter A. Heeman. 141-144 [doi]
- Small-target selection with gaze alone. Henrik H. T. Skovsgaard, Julio C. Mateo, John M. Flach, John Paulin Hansen. 145-148 [doi]
- Measuring situation awareness of surgeons in laparoscopic training. Geoffrey Tien, M. Stella Atkins, Bin Zheng, Colin Swindells. 149-152 [doi]
- Quantification of aesthetic viewing using eye-tracking technology: the influence of previous training in apparel design. Juyeon Park, Emily Woods, Marilyn DeLong. 153-155 [doi]
- Estimating 3D point-of-regard and visualizing gaze trajectories under natural head movements. Kentaro Takemura, Yuji Kohashi, Tsuyoshi Suenaga, Jun Takamatsu, Tsukasa Ogasawara. 157-160 [doi]
- Natural scene statistics at stereo fixations. Yang Liu, Lawrence K. Cormack, Alan C. Bovik. 161-164 [doi]
- Development of eye-tracking pen display based on stereo bright pupil technique. Michiya Yamamoto, Takashi Nagamatsu, Tomio Watanabe. 165-168 [doi]
- Pupil center detection in low resolution images. Detlev Droege, Dietrich Paulus. 169-172 [doi]
- Using vision and voice to create a multimodal interface for Microsoft Word 2007. Tanya René Beelders, Pieter J. Blignaut. 173-176 [doi]
- Single gaze gestures. Emilie Møllenbach, Martin Lillholm, Alastair G. Gale, John Paulin Hansen. 177-180 [doi]
- Learning relevant eye movement feature spaces across users. Zakria Hussain, Kitsuchart Pasupa, John Shawe-Taylor. 181-185 [doi]
- Towards task-independent person authentication using eye movement signals. Tomi Kinnunen, Filip Sedlak, Roman Bednarik. 187-190 [doi]
- Gaze-based web search: the impact of interface design on search result selection. Yvonne Kammerer, Wolfgang Beinhauer. 191-194 [doi]
- Eye tracking with the adaptive optics scanning laser ophthalmoscope. Scott B. Stevenson, Austin Roorda, Girish Kumar. 195-198 [doi]
- Listing's and Donders' laws and the estimation of the point-of-gaze. Elias Daniel Guestrin, Moshe Eizenman. 199-202 [doi]
- Visual scanpath representation. Joseph H. Goldberg, Jonathan Helfman. 203-210 [doi]
- A vector-based, multidimensional scanpath similarity measure. Halszka Jarodzka, Kenneth Holmqvist, Marcus Nyström. 211-218 [doi]
- Scanpath comparison revisited. Andrew T. Duchowski, Jason Driver, Sheriff Jolaoso, William Tan, Beverly N. Ramey, Ami Robbins. 219-226 [doi]
- Scanpath clustering and aggregation. Joseph H. Goldberg, Jonathan Helfman. 227-234 [doi]
- Match-moving for area-based analysis of eye movements in natural tasks. Wayne J. Ryan, Andrew T. Duchowski, Ellen A. Vincent, Dina Battisto. 235-242 [doi]
- Interpretation of geometric shapes: an eye movement study. Miquel Prats, Steve Garner, Iestyn Jowers, Alison McKay, Nieves Pedreira. 243-250 [doi]
- User-calibration-free gaze tracking with estimation of the horizontal angles between the visual and the optical axes of both eyes. Takashi Nagamatsu, Ryuichi Sugano, Yukina Iwamoto, Junzo Kamahara, Naoki Tanaka. 251-254 [doi]
- Gaze estimation method based on an aspherical model of the cornea: surface of revolution about the optical axis of the eye. Takashi Nagamatsu, Yukina Iwamoto, Junzo Kamahara, Naoki Tanaka, Michiya Yamamoto. 255-258 [doi]
- The pupillometric precision of a remote video eye tracker. Jeff Klingner. 259-262 [doi]
- Contingency evaluation of gaze-contingent displays for real-time visual field simulations. Margarita Vinnikov, Robert S. Allison. 263-266 [doi]
- SemantiCode: using content similarity and database-driven matching to code wearable eyetracker gaze data. Daniel F. Pontillo, Thomas B. Kinsman, Jeff B. Pelz. 267-270 [doi]
- Context switching for fast key selection in text entry applications. Carlos Hitoshi Morimoto, Arnon Amir. 271-274 [doi]
- Fixation-aligned pupillary response averaging. Jeff Klingner. 275-282 [doi]
- Understanding the benefits of gaze enhanced visual search. Pernilla Qvarfordt, Jacob T. Biehl, Gene Golovchinsky, Tony Dunnigan. 283-290 [doi]
- Image ranking with implicit feedback from eye movements. David R. Hardoon, Kitsuchart Pasupa. 291-298 [doi]
- How the interface design influences users' spontaneous trustworthiness evaluations of web search results: comparing a list and a grid interface. Yvonne Kammerer, Peter Gerjets. 299-306 [doi]
- Space-variant spatio-temporal filtering of video for gaze visualization and perceptual learning. Michael Dorr, Halszka Jarodzka, Erhardt Barth. 307-314 [doi]
- Alternatives to single character entry and dwell time selection on eye typing. Mario H. Urbina, Anke Huckauf. 315-322 [doi]
- Designing gaze gestures for gaming: an investigation of performance. Howell O. Istance, Aulikki Hyrskykari, Lauri Immonen, Santtu Mansikkamaa, Stephen Vickers. 323-330 [doi]
- ceCursor, a contextual eye cursor for general pointing in windows environments. Marco Porta, Alice Ravarelli, Giovanni Spagnoli. 331-337 [doi]
- BlinkWrite2: an improved text entry method using eye blinks. Behrooz Ashtiani, I. Scott MacKenzie. 339-345 [doi]