Activation of ASL signs during sentence reading for deaf readers: evidence from eye-tracking

Bibliographic Details
Published in: Bilingualism vol. 28, no. 1 (Jan 2025), p. 208
Main Author: Saunders, Emily
Other Authors: Mirault, Jonathan; Emmorey, Karen
Publisher: Cambridge University Press
Online Access: Citation/Abstract; Full Text; Full Text - PDF
Description
Abstract: Bilinguals activate both of their languages as they process written words, regardless of modality (spoken or signed); these effects have primarily been documented in single-word reading paradigms. We used eye-tracking to determine whether deaf bilingual readers (n = 23) activate American Sign Language (ASL) translations as they read English sentences. Sentences contained a target word preceded by one of two possible prime words: a related prime, whose ASL translation shared phonological parameters (location, handshape, or movement) with the ASL translation of the target, or an unrelated prime. The results revealed that first fixation durations and gaze durations (early processing measures) were shorter when target words were preceded by ASL-related primes, but prime condition did not impact later processing measures (e.g., regressions). Further, less-skilled readers showed a larger ASL co-activation effect. Together, the results indicate that ASL co-activation impacts early lexical access and can facilitate reading, particularly for less-skilled deaf readers.
ISSN: 1366-7289, 1469-1841
DOI: 10.1017/S1366728924000336
Source: Arts & Humanities Database