- Theoretical foundations of multimodal interfaces and systems. Sharon L. Oviatt. 19-50 [doi]
- The impact of multimodal-multisensory learning on human performance and brain activation patterns. Karin H. James, Sophia Vinci-Booher, Felipe Munoz-Rubke. 51-94 [doi]
- Multisensory haptic interactions: understanding the sense and designing for it. Karon E. MacLean, Oliver S. Schneider, Hasti Seifi. 97-142 [doi]
- A background perspective on touch as a multimodal (and multisensor) construct. Ken Hinckley. 143-199 [doi]
- Understanding and supporting modality choices. Anthony Jameson, Per Ola Kristensson. 201-238 [doi]
- Using cognitive models to understand multimodal processes: the case for speech and gesture production. Stefan Kopp, Kirsten Bergmann. 239-276 [doi]
- Multimodal feedback in HCI: haptics, non-speech audio, and their applications. Euan Freeman, Graham A. Wilson, Dong-Bach Vo, Alexander Ng, Ioannis Politis, Stephen A. Brewster. 277-317 [doi]
- Multimodal technologies for seniors: challenges and opportunities. Cosmin Munteanu, Albert Ali Salah. 319-362 [doi]
- Gaze-informed multimodal interaction. Pernilla Qvarfordt. 365-402 [doi]
- Multimodal speech and pen interfaces. Philip R. Cohen, Sharon L. Oviatt. 403-447 [doi]
- Multimodal gesture recognition. Athanasios Katsamanis, Vassilis Pitsikalis, Stavros Theodorakis, Petros Maragos. 449-487 [doi]
- Audio and visual modality combination in speech processing applications. Gerasimos Potamianos, Etienne Marcheret, Youssef Mroueh, Vaibhava Goel, Alexandros Koumbaroulis, Argyrios Vartholomaios, Spyridon Thermos. 489-543 [doi]
- Perspectives on learning with multimodal technology. Karin Harman James, James C. Lester, Dan Schwartz, Katherine M. Cheng, Sharon L. Oviatt. 547-570 [doi]