Multimodality Assistive Technology for Users with Dyslexia

dc.contributor.author Gordon, Shira
dc.contributor.department Mechanical Engineering
dc.contributor.majorProfessor James Oliver
dc.date 2019-09-22T17:32:34.000
dc.date.accessioned 2020-06-30T01:32:58Z
dc.date.available 2020-06-30T01:32:58Z
dc.date.copyright Tue Jan 01 00:00:00 UTC 2019
dc.date.issued 2019-01-01
dc.description.abstract <p>To assist dyslexic users with reading and writing, several approaches have been explored to convey text to users using text-to-speech (TTS) technology and to transcribe what the user dictates using speech-to-text (STT) technology. Currently available assistive technologies suffer from two limitations: they are compatible only with digital formats, and they require the user to dictate out loud, which creates social stigma. If we think beyond single-modality solutions and expand the possibilities to include solutions that are multimodal and multisensory, the door opens to creative ways to help people with dyslexia. An alternative approach would provide assistance with reading any text, regardless of format, in a way that is inaudible to anyone other than the user, and would allow a user to transcribe their thoughts without needing to speak out loud. This work explores a multimodal wearable device that assists users with dyslexia in reading and writing, using augmented reality for input, and using neuromuscular signals picked up by electrodes and bone conduction for output. The device would recognize text from digital, printed, or environmental sources, highlight the copy being read, and read that text aloud to the user through bone conduction output, so the sound would be audible only to the user. For transcription, electrodes in the device would pick up neuromuscular signals in the jaw and face that are triggered by internal verbalization, requiring users only to say words “in their head.” All of this would take place through a device that is barely distinguishable from an average pair of glasses, reducing the stigma a dyslexic user may experience when needing to read or write in social situations.</p>
dc.format.mimetype PDF
dc.identifier archive/lib.dr.iastate.edu/creativecomponents/181/
dc.identifier.articleid 1300
dc.identifier.contextkey 14362848
dc.identifier.s3bucket isulib-bepress-aws-west
dc.identifier.submissionpath creativecomponents/181
dc.identifier.uri https://dr.lib.iastate.edu/handle/20.500.12876/16714
dc.source.bitstream archive/lib.dr.iastate.edu/creativecomponents/181/Multimodality_Assistive_Technology_for_Users_with_Dyslexia_Gordon.pdf|||Fri Jan 14 21:36:56 UTC 2022
dc.subject.disciplines Engineering
dc.subject.keywords Multimodal
dc.subject.keywords Dyslexia
dc.subject.keywords Augmented Reality
dc.subject.keywords Mind Reading
dc.subject.keywords Assistive Technology
dc.title Multimodality Assistive Technology for Users with Dyslexia
dc.type article
dc.type.genre creativecomponent
dspace.entity.type Publication
relation.isOrgUnitOfPublication 6d38ab0f-8cc2-4ad3-90b1-67a60c5a6f59
thesis.degree.discipline Human Computer Interaction
thesis.degree.level creativecomponent
File: Multimodality_Assistive_Technology_for_Users_with_Dyslexia_Gordon.pdf (10.7 MB, Adobe Portable Document Format)