The following publications are possibly variants of this publication:
- Evaluating importance of facial expression in American Sign Language and Pidgin Signed English animations. Matt Huenerfauth, Pengfei Lu, Andrew Rosenberg. assets 2011: 99-106 [doi]
- Data-Driven Synthesis and Evaluation of Syntactic Facial Expressions in American Sign Language Animation. Hernisa Kacorri. PhD thesis, City University of New York, USA, 2016. [doi]
- Models of linguistic facial expressions for American Sign Language animation. Hernisa Kacorri. sigaccess, 105:19-23, 2013. [doi]
- Measuring the Perception of Facial Expressions in American Sign Language Animations with Eye Tracking. Hernisa Kacorri, Allen Harper, Matt Huenerfauth. hci 2014: 553-563 [doi]
- Evaluation of a psycholinguistically motivated timing model for animations of American Sign Language. Matt Huenerfauth. assets 2008: 129-136 [doi]
- A Linguistically Motivated Model for Speed and Pausing in Animations of American Sign Language. Matt Huenerfauth. taccess, 2(2), 2009. [doi]
- Modeling and synthesizing spatially inflected verbs for American Sign Language animations. Matt Huenerfauth, Pengfei Lu. assets 2010: 99-106 [doi]
- Collecting and evaluating the CUNY ASL corpus for research on American Sign Language animation. Pengfei Lu, Matt Huenerfauth. csl, 28(3):812-831, 2014. [doi]
- Design and Evaluation of a User-Interface for Authoring Sentences of American Sign Language Animation. Abhishek Kannekanti, Sedeeq Al-khazraji, Matt Huenerfauth. hci 2019: 258-267 [doi]