Title: Investigation of Human First-Person Motion Guidance and Perception Behavior in a Simulation Environment
Author: Feit, Andrew J.
Date issued: 2018-03-07
Type: Dataset
Persistent link: https://hdl.handle.net/11299/194388
DOI: https://doi.org/10.13020/D63D6R
License: CC0 1.0 Universal (http://creativecommons.org/publicdomain/zero/1.0/)
Keywords: motion guidance; human perception; human guidance

Abstract:
Experiments consist of simulated first-person guidance tasks. The simulated environment allows precise control of the visual cues available to the subject, a repeatable environment configuration, and consistent vehicle dynamic response. During each trial, the subject uses a controller to move a vehicle through the environment. The objective is to travel from a specified start position to a goal corridor in minimal time while avoiding obstructions. As the subject navigates the scene, a gaze-tracking device records the subject's gaze direction. These data are used to determine which portions of the environment the subject focuses on during specific phases of the task.

To investigate guidance and perception behavior, an experimental system is used to observe this behavior in human subjects. The primary goal of this framework is to present a first-person guidance task to the subject and to observe the action-perception relationships the subject generates to complete the task. The first-person perspective requires the subject to learn at both the planning and guidance levels: at the planning level, to determine feasible routes based on visual cues, and at the guidance level, to learn optimal relationships between control actions and visual cue motion. These learning tasks are challenges also faced by autonomous systems using first-person sensors. Understanding how humans perform guidance from this perspective is therefore directly applicable to investigating human-inspired approaches to autonomous guidance.
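
The dataset description does not specify how recorded gaze directions are mapped to environment regions. As a purely illustrative sketch, one common approach is to cast the gaze ray against labeled scene surfaces and take the nearest hit. Everything below (function names, the surface representation, the example coordinates) is a hypothetical assumption for illustration, not part of this dataset.

# Hypothetical sketch: classify which labeled scene surface a gaze ray hits.
# Assumes gaze samples arrive as (origin, unit direction) in the world frame;
# none of these names or conventions come from the dataset itself.
import numpy as np

def first_hit(origin, direction, surfaces):
    """Return the label of the nearest surface intersected by the gaze ray.

    surfaces: list of (label, point_on_patch, unit_normal, half_extent)
    describing finite planar patches; returns None if nothing is hit.
    """
    best_label, best_t = None, np.inf
    for label, p0, n, half in surfaces:
        denom = np.dot(direction, n)
        if abs(denom) < 1e-9:               # ray parallel to the plane
            continue
        t = np.dot(p0 - origin, n) / denom  # ray parameter at plane crossing
        if t <= 0 or t >= best_t:           # behind the eye, or farther than best
            continue
        hit = origin + t * direction
        if np.max(np.abs(hit - p0)) <= half:  # crude bound around the patch center
            best_label, best_t = label, t
    return best_label

# Example: a wall patch ahead of the subject and a floor patch below.
surfaces = [
    ("wall",  np.array([0., 5., 1.]), np.array([0., -1., 0.]), 2.0),
    ("floor", np.array([0., 2., 0.]), np.array([0., 0., 1.]),  2.0),
]
gaze_origin = np.array([0., 0., 1.])
gaze_dir = np.array([0., 1., 0.])            # looking straight ahead
print(first_hit(gaze_origin, gaze_dir, surfaces))  # prints "wall"

Binning such per-sample labels by task phase (e.g., approach versus corridor entry) would yield the kind of phase-specific attention summary the abstract describes.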