Browsing by Subject "visual attention"
Now showing 1 - 3 of 3
Item: Disassociating Sensory, Choice, and Attentional Signals to Understand Feature Based Perception and Learning in Small Populations of Intermediate Visual Cortex (2019-05). Moore, Elisabeth.

Perception is integral to how we interact with our visual environment. How perception changes with experience is a function of learning, while how it adapts on a flexible, immediate time scale to dynamic task demands is mediated by attention. Both of these cognitive phenomena underpin how we perceive and interact with the world around us. Visual perceptual learning (VPL) is the improvement in the ability to perceive our visual environment and is essential to how humans and other animals learn to interact with the world. Despite extensive research into the mechanisms of VPL, the neural mechanisms responsible for perceptual improvements remain controversial. A major challenge has been establishing that a particular physiological correlate of learning is actually responsible for learning, as opposed to merely reflecting changes in the properties or populations that are responsible. To address this issue, we employed a perceptual detection task in which neurons in a specific area, V4, are known to have task-related responses on a scale of tens of milliseconds that reliably predict the timing and precision of shape detection. We followed population responses using a chronically implanted electrode array while non-human primates learned to detect shapes degraded by noise. Consistent with previous results from single neurons and neuronal ensembles, we found that, after learning, variations in the local field potentials of individual electrodes over the course of tens of milliseconds reliably reflected the presentation of degraded shapes and also predicted the detection decisions made by the animal.
Moreover, we found that variations in the reliability of shape-related signals predicted the up-and-down fluctuations in performance seen over the course of learning in each animal. Together, these results demonstrate that population signals in area V4 are largely sufficient to explain the timing and reliability of shape detection and how detection performance increases as a consequence of training. Endogenous feature-based visual attention improves neural representations of the attended feature in a manner that depends on immediate task demands. How this happens in a specific population, and whether the populations involved overlap with those mediating perception, is not well understood. Because previous work in our laboratory found that feature-based attention is targeted to specific, task-appropriate neural populations in early visual cortex, we asked whether attention is similarly distributed in a task-specific way in V4, how this depends on attentional state, and whether such neurons also signal the readout of the perceptual choice, given that choice signals have consistently been found in this area. We designed a demanding stimulus discrimination task in which we directed subjects to attend to a specific feature during high-field fMRI scanning. The stimulus alternated continuously at varying frequencies in low- and high-level features (spatial frequency and shape, chosen for their expected sensory activation of V1 and V4, respectively). Voxels were measured at high resolution, sampling 1 mm of cortex, from V1 to V4, and the stimulus was presented near perceptual threshold in order to dissociate the stimulus from the choice.
We used a linear regression analysis to compare continuous BOLD modulation of individual voxels against regressors modeling the continuous stimulus presentation when a given feature was attended versus when it was not, and assessed how sensory and attention modulations overlapped with modulations related to the ongoing perceptual choice. We found clear sensory attention effects in V4 that were specific to certain populations; however, these did not appear to depend on initial sensitivity, and we did not find reliable choice signals or choice signals that overlapped with attention signals. We believe this may be due to the experimental design, and we recommend future approaches to dissociate sensory, attention, and choice signals in visual cortex.

Item: The Effects Of Selection History On Visual And Auditory Spatial Attention (2020-05). Addleman, Douglas.

Past research has demonstrated implicit, experience-driven effects on spatial attention in vision and audition. In particular, what and where an observer has attended in the past affects future attentional selection. For instance, attention while searching for an item is biased toward locations that contained recent targets (an effect called inter-trial location priming) as well as toward locations that contain targets more often than other regions over a span of time (an effect called location probability learning). In this dissertation, I present three studies investigating selection history effects and how they differ from the better-understood goal-driven form of attention. The first two studies investigate the relationship between spatial selection history and top-down attention during visual search. Study 1 investigated how attending to spatial locations during a visual search task for letters affected a secondary memory task for scenes presented underneath the search array.
Implicit location probability learning and goal-driven attention both affected search performance, but only goal-driven attention affected memory for scenes at attended locations. This suggests that implicit location probability learning has task-specific effects on attention, while goal-driven attention has task-general effects. Study 2 showed that, unlike goal-driven attention, implicit location probability learning shifts visuospatial attention only after search stimuli appear, not in anticipation of stimulus onset. Study 3 investigated short-term and long-term auditory selection history effects, finding long-term location probability learning but a striking lack of short-term inter-trial location priming. Taken together, this dissertation provides evidence for differences in the implementation of goal-driven and implicitly learned spatial attention that, while present in both vision and audition, manifest in modality-specific ways.

Item: Experience-guided Attention and Eye Movements: Coupled or Independent? (2023-05). Chen, Chen.

Daily activities often occur in familiar environments; locations that were important in the past remain significant in the future, making it crucial to fine-tune attention based on learning. Indeed, researchers have demonstrated that people can acquire an implicit spatial preference for locations that frequently contained a search target in the past (location probability learning), resulting in both faster responses and more frequent eye movements toward those locations. In this dissertation, I present three empirical studies that investigated experience-guided attention and its relationship with eye movements. Study 1 showed that location probability-guided attention is independent of goal-driven oculomotor control. Study 2 used eye tracking to restrict the visible region to the location opposite fixation, necessitating saccades away from the attended locations.
Participants still acquired an attentional preference for the high-probability region, suggesting that spatial alignment between attention and eye gaze is unnecessary for location probability learning. Study 3 examined the role of peripheral vision in learning. I simulated peripheral vision loss using gaze-contingent eye tracking. Results showed that implicit location probability learning was impaired in the absence of peripheral vision. Together, this dissertation provides evidence that experience-guided attention is a high-level process independent of oculomotor movements and is implicitly supported by peripheral vision.
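The location probability learning paradigm described in the second and third abstracts can be illustrated with a short trial-generation sketch: a search target appears in one "rich" quadrant on a fixed proportion of trials and in the remaining "sparse" quadrants otherwise. The quadrant count, trial count, and 50% rich-quadrant probability below are illustrative assumptions, not the parameters used in these dissertations.

```python
import random

def make_trial_sequence(n_trials=480, rich_quadrant=0, rich_prob=0.5, seed=1):
    """Assign each search trial's target to one of four quadrants.

    Hypothetical design: the target appears in the 'rich' quadrant on
    rich_prob of trials, and with equal probability in each of the three
    'sparse' quadrants on the rest.
    """
    rng = random.Random(seed)
    sparse = [q for q in range(4) if q != rich_quadrant]
    trials = []
    for _ in range(n_trials):
        if rng.random() < rich_prob:
            trials.append(rich_quadrant)
        else:
            trials.append(rng.choice(sparse))
    return trials

trials = make_trial_sequence()
counts = [trials.count(q) for q in range(4)]
# The rich quadrant accumulates roughly half of all targets, so observers
# who implicitly track this statistic come to prioritize it in search.
```

Because the bias is carried only by the long-run target distribution, not by any explicit cue, faster search in the rich quadrant is taken as evidence of implicit statistical learning rather than goal-driven orienting.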