Browsing by Subject "eye tracking"
Item
Associations Between Maternal Anxiety, Infant Attention and Amygdala Development in the First Three Years of Life (2024-07)
Lasch, Carolyn
The overarching aim of this dissertation was to better understand how socially salient information biases rapid visual orienting in infants and toddlers, and whether this bias is associated with other early correlates of later anxiety diagnoses, including infant temperament, maternal anxiety, and amygdala volume. This study represents an important contribution to the literature on the early development of attention biases and their associations with later anxiety, especially as it incorporates measures of neurodevelopment, caregiver psychopathology, and infant temperament to better understand overlapping and interacting risk factors for later-emerging anxiety. Both aims primarily use a sample of data from the Baby Connectome Project, taking advantage of accelerated longitudinal sampling to characterize attention orienting and amygdala development in the first three years of life. Aim 1 decomposed traditional measures of attention orienting bias into two separate measures (attention facilitation and orienting cost) to more precisely examine how socially salient stimuli such as fearful faces relate to vigilance and orienting in early development, and how visual competition may moderate these biases. Associations between early correlates of anxiety (infant temperament and maternal anxiety) and orienting biases were also examined. Findings indicated some unexpected associations, mainly reduced biases in infants with more anxious temperaments and in infants with higher-anxiety mothers. Findings also highlighted the degree to which stimuli in “competition with” (i.e., co-presented with) fearful stimuli can moderate orienting biases for infants with and without early risk factors for attention biases and anxiety.
Aim 2 examined whether and how attention orienting biases (attention facilitation and orienting cost), infant temperament, and maternal anxiety were reflected in right amygdala volume development in the first three years of life. Findings revealed that infant mean reaction time (but not attention bias scores) was associated with right amygdala volumes, such that infants with slower reaction times showed more rapid amygdala volume growth. Additionally, infant and toddler temperament were associated with larger right amygdala volumes over the 0- to 3-year period. These findings highlight very early associations between temperamental risk factors for later anxiety and altered neurodevelopment in regions associated with later anxiety. Taken together, these findings suggest that early risk factors for anxiety (especially infant temperament) have early-emerging associations with biased attention orienting and atypical neurodevelopment. Future studies are needed to extend these findings by examining possible complementary effects in the right visual hemifield, investigating attention orienting biases in later developmental periods, and further elucidating possible associations between infant attention orienting biases and amygdala resting-state functional connectivity.

Item
Data from: Head and eye movements of normal hearing and hearing impaired participants during three-party conversations (2021-03-04)
Lu, Hao; McKinney, Martin F; Zhang, Tao; Oxenham, Andrew J; University of Minnesota Auditory Perception and Cognition Laboratory
The data include head movement, eye gaze movement, and speech segments recorded from 10 young normal-hearing listeners, 10 older normal-hearing listeners, and 10 older hearing-impaired listeners during three-party group conversations. Each conversation lasted about 25 minutes while different levels of background noise were played: no noise, and 50, 60, and 70 dB SPL noise.
The data were released as supplemental material for the paper [insert paper info], and other researchers working on gaze-guided hearing aids can test their models on them.

Item
Head Mounted Eye Tracking Aid for Central Visual Field Loss (2016-07)
Gupta, Anshul
Age-Related Macular Degeneration results in central visual field loss (CFL) due to the formation of central blind spots, or scotomas. Activities such as reading are affected. We hypothesize that real-time remapping of information lost to CFL onto a functional portion of the retina will improve visual performance. We have developed two hardware prototypes using a head-mounted display, an integrated eye tracker, and a computer to remap and display images to the wearer in real time. To test them, in three different studies normally sighted subjects were asked to wear the head-mounted display with the built-in eye tracker. CFL was simulated by placing artificial circular scotomas ranging from 2° to 16° in diameter over the gaze position, and reading speed was measured in the remapped and unremapped conditions. We observed a statistically significant increase in mean reading speeds for the larger scotomas. These results indicate that the device shows promise for use with CFL patients.

Item
Modeling the Human Visuo-Motor System for Remote-Control Operation (2018-05)
Andersh, Jonathan
Successful operation of a teleoperated miniature rotorcraft relies on capabilities including guidance, trajectory following, feedback control, and environmental perception. In many operating scenarios, fragile automation systems are unable to provide adequate performance. In contrast, human-in-the-loop systems demonstrate the ability to adapt to changing and complex environments, stability in control response, high-level goal selection and planning, and the capacity to perceive and process large amounts of information.
Modeling the perceptual processes of the human operator provides the foundation necessary for a systems-based approach to the design of the control and display systems used by remotely operated vehicles. In this work we consider flight tasks for remotely controlled miniature rotorcraft operating in indoor environments. Operation of agile robotic systems in three-dimensional spaces requires a detailed understanding of the perceptual aspects of the problem as well as knowledge of the task and models of the operator response. When modeling the human in the loop, the dynamics of the vehicle, environment, and human perception-action are tightly coupled in space and time; the dynamic response of the overall system emerges from the interplay of perception and action. The main questions to be answered in this work are: i) what approach does the human operator implement when generating a control and guidance response? ii) how is information about the vehicle and environment extracted by the human? iii) can the gaze patterns of the pilot be decoded to provide information for estimation and control? This work differs from existing research by focusing on fast-acting dynamic systems in multiple dimensions and by investigating how gaze can be exploited to provide action-relevant information. To study human-in-the-loop systems, the development and integration of the experimental infrastructure are described. Using this infrastructure, a theoretical framework for computational modeling of the human pilot’s perception-action is proposed and verified experimentally. The benefits of the human visuo-motor model are demonstrated through application examples in which the perceptual and control functions of a teleoperation system are augmented to reduce workload and provide a more natural human-machine interface.