Fusing Self-Reported and Sensor Data from Mixed-Reality Training

Date
2014-01-01
Authors
MacAllister, Anastacia
Richardson, Trevor
Gilbert, Stephen
Holub, Joseph
Thompson, Frederick
Radkowski, Rafael
Winer, Eliot
Davies, Paul
Terry, Scott
Department
Mechanical Engineering; Virtual Reality Applications Center; Electrical and Computer Engineering; Industrial and Manufacturing Systems Engineering; Human Computer Interaction
Abstract

Military and industrial use of smaller, more accurate sensors is allowing increasing amounts of data to be acquired at diminishing cost during training. Traditional human subject testing often collects qualitative data from participants through self-reported questionnaires. This qualitative information is valuable but often insufficient for assessing training outcomes. Quantitative information such as motion tracking data, communication frequency, and heart rate can supply the missing pieces in training outcome assessment. Successful fusion and analysis of qualitative and quantitative information sources is necessary for collaborative, mixed-reality, and augmented-reality training to reach its full potential. The challenge is determining a reliable framework for combining these multiple types of data. Methods were developed to analyze data acquired during a formal user study assessing the use of augmented reality as a delivery mechanism for digital work instructions. A between-subjects experiment was conducted to compare a desktop computer, a mobile tablet, and a mobile tablet with augmented reality as delivery methods for these instructions. Study participants were asked to complete a multi-step technical assembly. Participants' head position and orientation were tracked using an infrared tracking system. User interaction in the form of interface button presses was recorded and time-stamped on each step of the assembly. A trained observer took notes on task performance during the study through a set of camera views that recorded the work area. Finally, each participant completed pre- and post-study surveys involving self-reported evaluation. The combination of quantitative and qualitative data revealed trends, such as the most difficult tasks on each device, that would have been impossible to determine from self-reporting alone. This paper describes the methods developed to fuse the qualitative data with the quantified measurements recorded during the study.

Comments

This proceeding is published as Richardson, T., Gilbert, S., Holub, J., Thompson, F., MacAllister, A., Radkowski, R., Winer, E., Davies, P., Terry, S. (2014) "Fusing Self-Reported and Sensor Data from Mixed-Reality Training," The Interservice/Industry Training, Simulation & Education Conference (I/ITSEC). Paper No. 14158. Posted with permission.
