Browsing by Subject "Blind"
Now showing 1 - 3 of 3
Item
Indoor Spatial Updating with Impaired Vision: Human Performance Data for 32 Normally Sighted Subjects, 16 Low Vision Subjects and 16 Blind Subjects (2016-09-21)
Legge, Gordon E; Granquist, Christina; Baek, Yihwa; Gage, Rachel; legge@umn.edu

Spatial updating is the ability to keep track of position and orientation while moving through an environment. We asked how normally sighted and visually impaired subjects compare in spatial updating and in estimating room dimensions. Groups of 32 normally sighted, 16 low vision and 16 blind subjects estimated the dimensions of six rectangular rooms. Updating was assessed by guiding the subjects along three-segment paths in the rooms. At the end of each path, they estimated the distance and direction to the starting location and to a designated target (a bean bag dropped at the first segment of their path). Spatial updating was tested in five conditions ranging from free viewing to full auditory and visual deprivation (see documentation for details). The normal and low-vision groups did not differ in their accuracy for judging room dimensions, and correlations between estimated size and physical size were high. Accuracy of low-vision performance was not correlated with acuity, contrast sensitivity or field status. Accuracy was lower for the blind subjects. The three groups were very similar in spatial-updating performance and exhibited only weak dependence on the viewing conditions. Conclusions: People with a wide range of low-vision conditions are able to judge room dimensions as accurately as people with normal vision. Blind subjects have difficulty in judging the dimensions of quiet rooms, but some information is available from echolocation.
Vision status has little impact on performance in simple spatial updating; proprioceptive and vestibular cues are sufficient.

Item
An Integrated Assistive System to Support Wayfinding and Situation Awareness for People with Vision Impairment (2016-05)
Liao, Chen-Fu

People with vision impairment usually use a white cane as their primary tool for wayfinding and obstacle detection. Environmental cues, though not always reliable, support their decision making at various levels of navigation and situation awareness. Because their spatial perception differs from that of sighted people, they often encounter physical as well as information barriers along a trip. To improve their mobility, accessibility, and confidence in using the transportation system, it is important to remove not only the physical barriers but also the information barriers that could impede mobility and undermine safety. Many assistive systems have been developed to help visually impaired users navigate and find their way, but most were not adopted, largely because they were inconvenient to use. In this research, we developed a mobile accessible information system that allows people with vision impairment to receive transportation information at key locations where decisions must be made. A smartphone-based personal assistive system, called MAPS (Mobile Accessible Pedestrian System), was developed to provide intersection geometry and signal timing information not available to people with vision impairment from other apps on the market. In addition, MAPS incorporates a geospatial database with Bluetooth beacon information, allowing it to provide navigation assistance, situation awareness, and wayfinding even when a GPS solution is not available.
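The abstract names an SVD-based regression with an Extended Kalman Filter for beacon positioning when GPS is unavailable, but the details live in the dissertation itself. As a minimal, assumption-laden sketch of the general technique (the path-loss calibration values, beacon layout, and ranges below are invented for illustration, and the EKF smoothing step is omitted): BLE signal strength can be mapped to a range estimate with a log-distance model, and three or more ranges yield a position fix via an SVD-based least-squares solve.

```python
import numpy as np

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, n=2.0):
    """Range estimate (m) from a BLE RSSI reading via the log-distance
    path-loss model: rssi = tx_power - 10 * n * log10(d).
    tx_power is the beacon's calibrated RSSI at 1 m (assumed value)."""
    return 10.0 ** ((tx_power_dbm - rssi_dbm) / (10.0 * n))

def trilaterate(beacons, ranges):
    """Least-squares 2-D position fix from >= 3 known beacon positions
    and range estimates. The quadratic range equations are linearized
    against the first beacon, giving A @ [x, y] = b, which is solved
    with NumPy's SVD-based least-squares routine."""
    beacons = np.asarray(beacons, dtype=float)
    d = np.asarray(ranges, dtype=float)
    # Subtract beacon 0's range equation from each of the others:
    # 2(xi - x0)x + 2(yi - y0)y = d0^2 - di^2 + |pi|^2 - |p0|^2
    A = 2.0 * (beacons[1:] - beacons[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(beacons[1:] ** 2, axis=1)
         - np.sum(beacons[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# A beacon heard at its calibrated 1 m power is ~1 m away.
print(round(rssi_to_distance(-59.0), 2))  # 1.0

# Three beacons at known corners; ranges computed from a true
# position of (3, 4) recover that position.
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
ranges = [5.0, 65.0 ** 0.5, 45.0 ** 0.5]
print(np.round(trilaterate(beacons, ranges), 2))  # [3. 4.]
```

In a real deployment, successive fixes of this kind would be noisy, which is presumably why the dissertation fuses them with an EKF rather than using raw solves.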
The MAPS app communicates with the traffic signal controller over a secured wireless link to obtain real-time Signal Phasing and Timing (SPaT) information, which it uses to inform visually impaired pedestrians of their current location and when to cross the street. A self-monitoring infrastructure using a network of Bluetooth Low Energy (BLE) beacons was developed to ensure the information integrity of the network. The key contributions of this dissertation include the development of:

- A smartphone-based navigation and decision support system that incorporates intersection geometry and traffic signal information for people with vision impairment;
- A simple user interface (a single or double tap on the smartphone screen) that is easy for the visually impaired to learn and use;
- Standardized message elements for an audible work zone bypass routing information system;
- A self-monitoring infrastructure using a network of commercial off-the-shelf (COTS) low-cost BLE beacons, including customized firmware that allows the beacons to monitor each other;
- A crowdsourcing approach that uses users' smartphones to monitor the status of BLE beacons and update the messages associated with them;
- A cloud-based geospatial database that supports navigation by incorporating BLE beacon localization information when a GPS solution is not available;
- A Singular Value Decomposition (SVD) based Multivariable Regression (MR) algorithm, combined with an Extended Kalman Filter (EKF), that uses beacon localization to provide a positioning solution on the smartphone even when a GPS solution is unavailable; and
- Statistical methodologies and wireless signal fingerprinting techniques to monitor the BLE beacons in a network and determine when a beacon has been moved or removed, or has disappeared.

The intent of MAPS is not to undermine the maintenance of skills and strategies that people with vision impairment have learned for navigation and wayfinding.
Instead, the system aims to support their wayfinding capability, extend mobility and accessibility, and improve safety for the blind and visually impaired. The self-monitoring infrastructure ensures that correct information is provided to users at the right location when needed. This thesis also introduces the idea of using the same system to warn sighted pedestrians who are approaching an intersection while distracted by their smartphones.

Item
Tactile Acuity of Young and Old Pianists [Minnesota Lab for Low-Vision Research, 2019] (2019-04-16)
Legge, Gordon E; Granquist, Christina; Lubet, Alex; Gage, Rachel; Xiong, Ying-Zi; yingzi@umn.edu; The Minnesota Laboratory for Low-Vision Research

A previous study from our lab demonstrated retention of high tactile acuity throughout the lifespan in blind subjects, in contrast to the typical decline found for sighted subjects (Legge, Madison, Vaughn, Cheong & Miller, 2008). We hypothesize that preserved tactile acuity in old age is due to lifelong experience with focused attention to touch, and not to blindness per se. Proficient pianists devote attention to touch--fingerings and dynamics--over years of practice. To test our hypothesis, we measured tactile acuity in groups of young and old normally sighted pianists and compared their results to those of the blind and sighted subjects in our 2008 study. The pianists, like the subjects in 2008, were tested on two tactile-acuity charts requiring active touch: one composed of embossed Landolt rings and the other of dot patterns similar to braille. On both tests, the pianists performed more like the blind subjects than the sighted subjects from our 2008 study. Our results are consistent with the hypothesis that lifelong experience with focused attention to touch preserves tactile acuity into old age for both blind and sighted subjects. We now release the data collected in the 2008 study and in the new study for sharing and replication purposes.