Understanding intuitive gestures in wearable mixed reality environments

dc.contributor.advisor James Lathrop
dc.contributor.author Doty, Karen
dc.contributor.department Department of Computer Science
dc.contributor.department Human Computer Interaction Program
dc.date 2020-06-26T20:08:48.000
dc.date.accessioned 2020-06-30T03:22:59Z
dc.date.available 2020-06-30T03:22:59Z
dc.date.copyright Fri May 01 00:00:00 UTC 2020
dc.date.embargo 2020-06-23
dc.date.issued 2020-01-01
dc.description.abstract Augmented and mixed reality experiences are increasingly accessible, in both professional and everyday settings, due to advances in technology. The technology takes multiple forms, including tablet-based augmented reality (AR) and mixed reality (MR) delivered through wearable heads-up displays (HUDs). Standards for usability best practices are still evolving for MR HUD two-dimensional user interfaces (2D UI) and three-dimensional user interfaces (3D UI), so research on these evolving practices can guide future development of MR HUD applications.

The objective of this dissertation is to understand which gestures users intuitively make to respond to an MR environment while wearing a HUD. The Microsoft HoloLens, a wearable HUD used for MR, provides two core gestures for interacting with holographic interfaces. Although the current gestures can be learned and used to produce successful outcomes, this dissertation provides a better understanding of which gestures are intuitive to new users of an MR environment.

To identify which gestures are intuitive, 74 participants with no prior MR experience attempted gestures within a wearable MR HUD environment. The results show that previous technology experience can influence gesture choice; however, gesture choice also depends on the goal of the interaction scenario. The results suggest that a greater number of programmed gestures are needed to make full use of the tools available in wearable MR HUDs: five new gestures should be created, three reflecting a connection between MR interaction and current gesture-based technology, and two reflecting a connection between MR gestures and daily movements in the physical world space.
dc.format.mimetype application/pdf
dc.identifier archive/lib.dr.iastate.edu/etd/18080/
dc.identifier.articleid 9087
dc.identifier.contextkey 18242788
dc.identifier.doi https://doi.org/10.31274/etd-20200624-259
dc.identifier.s3bucket isulib-bepress-aws-west
dc.identifier.submissionpath etd/18080
dc.identifier.uri https://dr.lib.iastate.edu/handle/20.500.12876/32263
dc.language.iso en
dc.source.bitstream archive/lib.dr.iastate.edu/etd/18080/Doty_iastate_0097E_18908.pdf|||Fri Jan 14 21:36:30 UTC 2022
dc.subject.keywords Augmented
dc.subject.keywords Gestures
dc.subject.keywords Holograms
dc.subject.keywords Technology
dc.subject.keywords Virtual
dc.subject.keywords Wearable
dc.title Understanding intuitive gestures in wearable mixed reality environments
dc.type thesis en_US
dc.type.genre thesis en_US
dspace.entity.type Publication
relation.isOrgUnitOfPublication f7be4eb9-d1d0-4081-859b-b15cee251456
relation.isOrgUnitOfPublication 0a2a34a1-a7ba-4f4d-ab3c-a37e52b24720
thesis.degree.discipline Human Computer Interaction
thesis.degree.level dissertation
thesis.degree.name Doctor of Philosophy
File
Original bundle:
  Name: Doty_iastate_0097E_18908.pdf
  Size: 14.05 MB
  Format: Adobe Portable Document Format
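
Note on the gestures referenced in the abstract: the two core gestures on the first-generation Microsoft HoloLens are "air tap" and "bloom" (the latter reserved by the system for opening the Start menu). Applications recognize the tap through the Windows Mixed Reality spatial input APIs. The following is a minimal C++/WinRT sketch, not drawn from the dissertation itself, showing how a tap recognizer is wired up; supporting additional programmed gestures of the kind the study recommends would mean extending this recognizer layer.

#include <winrt/Windows.Foundation.h>
#include <winrt/Windows.UI.Input.Spatial.h>

using namespace winrt;
using namespace winrt::Windows::UI::Input::Spatial;

int main()
{
    init_apartment();

    // Recognizer configured for the HoloLens tap ("air tap") gesture.
    SpatialGestureRecognizer recognizer{ SpatialGestureSettings::Tap };

    // Fires when the user completes an air tap on the targeted hologram.
    recognizer.Tapped([](SpatialGestureRecognizer const&,
                         SpatialTappedEventArgs const& args)
    {
        uint32_t taps = args.TapCount(); // 1 = single tap, 2 = double tap
        (void)taps;                      // respond to the tap here
    });

    // In a running app, raw hand interactions arrive via a
    // SpatialInteractionManager (created on a thread with a CoreWindow)
    // and are forwarded to the recognizer, e.g.:
    //
    //   manager.InteractionDetected(
    //       [&](auto const&, SpatialInteractionDetectedEventArgs const& e)
    //       { recognizer.CaptureInteraction(e.Interaction()); });
}

Because each recognizable gesture must be enumerated in SpatialGestureSettings at recognizer construction, adding the five gestures the study proposes would require new entries in the platform's gesture vocabulary, not merely application-side handling.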