Training an AI model on ASL and developing ASL Champ!: a person wearing a VR headset in front of dual monitors, and the learner's view of a virtual coffee shop featuring an animated signing character and menu (Alam et al., 2024).

Identifying behavioral patterns in action identification among people of differing hearing statuses: a scatter plot of action identification accuracy versus reaction time for Deaf, Hard of Hearing (HoH), and Hearing groups, with trend lines for each group (Willis et al., 2024).

Papers

Zhang, H., Shalev-Arkushin, R., Baltatzis, V., Gillis, C., Laput, G., Kushalnagar, R., Quandt, L. C., Findlater, L., Bedri, K., & Lea, C. (2025). Towards AI-driven sign language generation with non-manual markers. Full paper accepted to CHI ’25. Link

Alam, M. S., Palagano, J., Quandt, L. C. (2024). Insights from immersive learning: Using sentiment analysis and real-time narration to refine ASL instruction in virtual reality. ASSETS ’24: Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 108, 1-4. Link

 Willis, A. S., Leannah, C., Schwenk, M., Palagano, J., & Quandt, L. C. (2024). Differences in biological motion perception associated with hearing status and age of signed language exposure. Journal of Experimental Psychology: General, 153(10), 2378–2393. Link

Inan, M., Atwell, K., Sicilia, A., Quandt, L. C., & Alikhani, M. (2024). Generating signed language instructions in large-scale dialogue systems. In Proceedings of the 2024 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2024), 6, 140-154. Link

Alam, M. S., Lamberton, J., Wang, J., Leannah, C., Miller, S., Palagano, J., de Bastion, M., Smith, H., Malzkuhn, M., & Quandt, L. C. (2024). ASL Champ!: A virtual reality game with deep-learning driven sign recognition. Computers & Education: X Reality, 4. Link

Viegas, C., Inan, M., Quandt, L. C., & Alikhani, M. (2023). Including facial expressions in contextual embeddings for sign language generation. Paper presented at the 12th Joint Conference on Lexical and Computational Semantics (*SEM 2023). Link

Alam, M. S., De Bastion, M., Malzkuhn, M., & Quandt, L. C. (2023). Recognizing highly variable American Sign Language in virtual reality. Presented at the 8th International Workshop on Sign Language Translation and Avatar Technology (SLTAT). Link

May, L., Miller, S., Bakri, S., Quandt, L. C., & Malzkuhn, M. (2023). Designing access in sound art exhibitions: Centering deaf experiences in Musical Thinking. In Proceedings of the 2023 ACM CHI Conference on Human Factors in Computing Systems (CHI ’23). Link

Leannah, C., Willis, A. S., & Quandt, L. C. (2022). Perceiving fingerspelling via point-light displays: The stimulus and the perceiver both matter. PLOS ONE, 17(8), e0272838. Link

Quandt, L. C., Lamberton, J., Leannah, C., Willis, A., & Malzkuhn, M. (2022). Signing avatars in a new dimension: Challenges and opportunities in virtual reality. In Proceedings of the 7th International Workshop on Sign Language Translation and Avatar Technology (SLTAT). Link

Quandt, L. C., Willis, A. S., Schwenk, M., Weeks, K., & Ferster, R. (2022). Attitudes toward signing human avatars vary depending on hearing status and age of signed language exposure. Frontiers in Psychology. Link

Berteletti, I., Kimbley, S. E., Sullivan, S. J., Quandt, L. C., & Miyakoshi, M. (2022). Different language modalities, yet similar cognitive processes in arithmetic fact retrieval. Brain Sciences, 12(2), 145. Link

Quandt, L. C., Kubicek, E., Willis, A. S., & Lamberton, J. (2021). Enhanced biological motion perception in deaf native signers. Neuropsychologia. Link

Quandt, L. C. & Willis, A. S. (2021). Earlier and more robust sensorimotor discrimination of ASL signs in deaf signers during imitation. Language, Cognition and Neuroscience. 10.1080/23273798.2021.1925712. Link

Kubicek, E. & Quandt, L. C. (2021). A positive relationship between sign language comprehension and mental rotation abilities. Journal of Deaf Studies and Deaf Education, 26, 1-12. Link

Shao, Q., Sniffen, A., Blanchett, J., Hillis, M. E., Shi, X., Haris, T. K., Liu, J., Lamberton, J., Malzkuhn, M., Quandt, L. C., Mahoney, J., Kraemer, D. J. M., Zhou, X., & Balkcom, D. (2020). Teaching American Sign Language in mixed reality. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), 4, 152. Link

Quandt, L. C., Lamberton, J., Willis, A. S., Wang, J., Weeks, K., Kubicek, E., & Malzkuhn, M. (2020). Teaching ASL signs using signing avatars and immersive learning in virtual reality. In The 22nd International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS ’20), October 26–28, Virtual Event, Greece. Link

Kubicek, E. & Quandt, L. C. (2019). Sensorimotor system engagement during ASL sign perception: an EEG study in deaf signers and hearing non-signers. Cortex, 119, 457-469. Link

Quandt, L. C. & Kubicek, E. (2018). Sensorimotor characteristics of sign translations modulate EEG when deaf signers read English. Brain and Language, 187, 9-17. Link

Earlier work

Quandt, L. C., Lee, Y.-S., & Chatterjee, A. (2017). Neural bases of action abstraction. Biological Psychology, 129, 314-323.

Quandt, L. C., Cardillo, E. R., Kranjec, A., & Chatterjee, A. (2015). Fronto-temporal regions encode the manner of motion in spatial language. Neuroscience Letters, 609, 171-175. 

Quandt, L. C. & Chatterjee, A. (2015). Rethinking actions: Implementation and association. Wiley Interdisciplinary Reviews: Cognitive Science, 6, 483-490. 

Bunlon, F., Marshall, P. J., Quandt, L. C., & Bouquet, C. A. (2015). Influence of action-effect associations acquired by ideomotor learning on imitation. PLOS ONE, 10, e0121617. 

Drew, A. R., Quandt, L. C., & Marshall, P. J. (2015). Visual influences on sensorimotor EEG responses during observation of hand actions. Brain Research, 1597, 119-128. 

Quandt, L. C. & Marshall, P. J. (2014). The effect of action experience on sensorimotor EEG rhythms during action observation. Neuropsychologia, 56, 401-408. 

Quandt, L. C., Marshall, P. J., Bouquet, C. A., & Shipley, T. F. (2013). Somatosensory experiences modulate alpha and beta power during subsequent action observation. Brain Research, 1534, 55-65. 

Quandt, L. C., Marshall, P. J., Shipley, T. F., Beilock, S. L., & Goldin-Meadow, S. (2012). Sensitivity of alpha and beta oscillations to sensorimotor characteristics of action: An EEG study of action production and gesture observation. Neuropsychologia, 50, 2745-51. 

Quandt, L. C., Marshall, P. J., Bouquet, C. A., Young, T., & Shipley, T. F. (2011). Experience with novel actions modulates frontal alpha EEG desynchronization. Neuroscience Letters, 499, 37-41. 

Roussotte, F. F., Bramen, J. E., Nuñez, S. C., Quandt, L. C., Smith, L. M., O’Connor, M. J., Bookheimer, S. Y., & Sowell, E. R. (2011). Abnormal brain activation during working memory in children with prenatal exposure to drugs of abuse: The effects of methamphetamine, alcohol, and polydrug exposure. NeuroImage, 54, 3067-75. 

Carp, J., Halenar, M. J., Quandt, L. C., Sklar, A., & Compton, R. J. (2009). Perceived similarity and neural mirroring: Evidence from vicarious error processing. Social Neuroscience, 4, 85-96. 

Compton, R. J., Carp, J., Chaddock, L., Fineman, S. L., Quandt, L. C., & Ratliff, J. B. (2008). Trouble crossing the bridge: Interhemispheric communication of emotional images in anxiety. Emotion, 8, 684-92. 

Compton, R. J., Robinson, M. D., Ode, S., Quandt, L. C., Fineman, S. L., & Carp, J. (2008). Error monitoring ability predicts daily stress regulation. Psychological Science, 19, 702-8. 

Compton, R. J., Lin, M., Vargas, G., Carp, J., Fineman, S. L., & Quandt, L. C. (2008). Error-detection and post-error behavior in depressed undergraduates. Emotion, 8, 58-67.

Compton, R. J., Carp, J., Chaddock, L., Fineman, S. L., Quandt, L. C., & Ratliff, J. B. (2007). Error-monitoring in anxiety: Increased error-sensitivity or altered expectations? Brain and Cognition, 64, 247-56.