Impact of Motion and Visual Presentation on the Performance of a Vehicle Roll-Tilt Task in a Virtual Reality and Motion Simulator System

Date of Award


Degree Name

M.S. in Mechanical Engineering


Department of Mechanical and Aerospace Engineering


Megan Reissman


Abstract

Spaceflight and aircraft piloting are complex, high-pressure tasks with complicating factors that can hinder mission success. These factors must be studied and prepared for, with preventative measures implemented as needed. Of particular interest is human motor control during these spaceflight and piloting situations, which are categorized as human-in-machine systems. Here, human motor control is defined as the means by which the human body executes a movement task. This research study focuses on improving the understanding of human motor control performance during matching tasks in a virtual reality (VR) environment, specifically when those tasks are coupled to a motion simulator that enables matched or mismatched whole-body roll-tilt motion of the pilot, as well as during hypoxic conditions. Several of these couplings are directly applicable to the types of tasks that astronauts and pilots face. In an IRB-approved study, 15 participants were seated in a motion simulator chair paired with a custom virtual reality environment designed to visually present motor control roll-tilt tasks. Participants completed the tasks using a standard joystick, which changed the tilt angle presented in the virtual environment and sent movement commands to the motion simulator. Trials included tasks matching a visual roll-tilt target while experiencing matched motion simulator roll-tilt, visual roll-tilt with no motion simulator roll-tilt, and visual roll-tilt with mismatched motion simulator roll-tilt. Visual roll-tilt matching tasks were presented with both continuous and disappearing targets. Additionally, 5 participants who had previously performed the full set of visual matching tasks returned on a separate day to perform visual roll-tilt matching tasks while exposed to hypoxic (10% O2) and normoxic (21% O2) conditions. All of these visual matching tasks were paired with both matched motion simulator roll-tilt and no motion simulator roll-tilt.
Motion simulator roll-tilt had significant effects on final visual tilt error compared to no simulator roll-tilt motion during the primary part of the study, with matched simulator roll-tilt motion producing higher visual error than no simulator motion. During the visual-vestibular mismatch portion of the study, vestibular manipulation was found to significantly impact final visual tilt error. These results suggest that sensory integration of vestibular information that conflicts with visual information may reduce task performance. Within this study, it is postulated that participants tended to weight vestibular sensory information over visual information during the visual roll-tilt matching tasks, especially as the roll-tilt angle of the motion platform increased. Notably, participants showed evidence of this bias even when informed that their vestibular input might be inconsistent with their visual environment. Future work should include greater variation in the subject population, as well as an expanded hypoxic study, to support firmer conclusions.


Keywords

Aerospace Engineering, Biomechanics, Biomedical Research, Engineering, Kinesiology, Mechanical Engineering, Human Motor Control, Motion Simulator, Virtual Reality, Vestibular

Rights Statement

Copyright © 2022, author