Browsing by Subject "Autonomous"
Now showing 1 - 2 of 2
Item
Autonomous Navigation On Urban Sidewalks Under Winter Conditions (2020-04)
Johnson, Reed

We describe a multi-step approach to facilitate autonomous navigation in snow by small vehicles in urban environments, allowing travel only on sidewalks and paved paths. Our objective is to have a vehicle autonomously navigate from point A on one urban block to point B on another block, crossing between blocks only at curb-cuts and stopping when pedestrians get in the way. A small mobile platform is first manually driven along the sidewalks to continuously record LIDAR and Global Navigation Satellite System (GNSS) data while little to no snow is on the ground. Our algorithm automatically post-processes the data to generate a labeled traversability map; during this automated process, areas such as grass, sidewalks, stationary obstacles, roads, and curb-cuts are identified. By differentiating between these areas using only LIDAR, the vehicle is later able to plan paths that stay on sidewalks or roads and avoid other areas. Our localization approach uses an Extended Kalman Filter to fuse the Lightweight and Ground-Optimized LIDAR Odometry and Mapping (LeGO-LOAM) approach with high-accuracy GNSS where available, allowing accurate localization even in areas with poor GNSS coverage, which is common in cities and under tree canopy. This localization approach is used during the data capture stage, prior to the post-processing stage in which labeled segmentation is performed, and again during real-time autonomous navigation, which is carried out using the ROS navigation stack. By combining LIDAR odometry with GNSS, the robot is able to localize under many different weather conditions, including snow and rain, where other algorithms (e.g., AMCL) will likely fail. The vehicle successfully planned and autonomously navigated a 1.6 km path in a snow-covered urban neighborhood. Our methodology enables autonomous navigation under most weather conditions and extends to applications such as autonomous wheelchair navigation.
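The abstract above describes fusing LeGO-LOAM LIDAR odometry with GNSS in an Extended Kalman Filter but does not spell the filter out. The following minimal 2D sketch is only an illustration of how such a fusion is commonly structured, with odometry increments driving the prediction step and GNSS fixes (already projected to local metres) driving the correction step; the class name, state layout, and noise values are assumptions, not the thesis implementation, which runs within the ROS navigation stack.

```python
import numpy as np

class OdomGnssEKF:
    """Minimal 2D EKF sketch: predict with LIDAR-odometry pose increments,
    correct with GNSS position fixes when a good fix is available."""

    def __init__(self):
        self.x = np.zeros(3)                   # state: [x, y, yaw]
        self.P = np.eye(3)                     # state covariance
        self.Q = np.diag([0.05, 0.05, 0.01])   # process (odometry) noise, assumed
        self.R = np.diag([0.5, 0.5])           # GNSS measurement noise, assumed

    def predict(self, dx, dy, dyaw):
        """Propagate the state with a body-frame pose increment taken from
        LIDAR odometry (e.g. successive LeGO-LOAM poses)."""
        c, s = np.cos(self.x[2]), np.sin(self.x[2])
        self.x[0] += c * dx - s * dy
        self.x[1] += s * dx + c * dy
        self.x[2] += dyaw
        # Jacobian of the motion model with respect to the state
        F = np.array([[1.0, 0.0, -s * dx - c * dy],
                      [0.0, 1.0,  c * dx - s * dy],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update_gnss(self, east, north):
        """Correct x and y with a GNSS fix expressed in local metres."""
        H = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])
        innovation = np.array([east, north]) - H @ self.x
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ innovation
        self.P = (np.eye(3) - K @ H) @ self.P
```

In practice the GNSS update would be applied only when the receiver reports a high-quality fix, so the estimate falls back to pure LIDAR odometry under tree canopy or between tall buildings.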
Item
Using LED Gaze Cues to Enhance Underwater Human-Robot Interaction (2022-05)
Prabhu, Aditya; Fulton, Michael; Sattar, Junaed, Ph.D.

In the underwater domain, conventional methods of communication between divers and Autonomous Underwater Vehicles (AUVs) are heavily impeded. Radio signal attenuation, water turbidity (cloudiness), and low light levels make it difficult for a diver and an AUV to relay information to each other. Current solutions such as underwater tablets, slates, and tags are not intuitive and introduce additional logistical challenges and points of failure. Intuitive human-robot interaction (HRI) is imperative to ensuring seamless collaboration between AUVs and divers. Eye gaze is a natural way for humans to relay information and an underutilized channel of communication for AUVs, while lights help overcome the darkness, turbidity, and signal attenuation that often impair diver-robot collaboration. This research implements eye gazes on LoCO (a low-cost AUV) using RGB LED rings in order to pursue intuitive forms of underwater HRI while overcoming common barriers to communication. To test the intuitiveness of the design, 9 participants with no prior knowledge of LoCO or HRI were asked to recall the meanings of each of 16 gaze indicators during pool trials, having been exposed to the indicators 3 to 4 days earlier. Compared to the baseline text-display communication, which had 100% recall, recall for most eye gaze animations was exceptionally high, with an 80% accuracy score for 11 of the 16 indicators. These results suggest that certain eye indicators convey information more intuitively than others, and that additional training can make gaze indicators a viable method of communication between humans and robots.
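The abstract reports gaze indicators rendered on RGB LED rings, but the listing does not include the animations themselves. As a purely hypothetical illustration of how one gaze cue might be mapped onto a generic LED ring, the sketch below lights a small "pupil" centred on a bearing and adds a simple blink animation; the ring size, colours, and function names are assumptions and are not taken from the LoCO implementation.

```python
import math

NUM_LEDS = 16           # assumed ring size; LoCO's hardware may differ
PUPIL_WIDTH = 3         # number of LEDs lit to suggest a "pupil"
PUPIL_COLOR = (0, 120, 255)
OFF = (0, 0, 0)

def pupil_frame(direction_deg):
    """Return one frame (a list of RGB tuples, one per LED) with a lit
    'pupil' centred on the given bearing, suggesting a gaze direction."""
    centre = round(direction_deg / 360.0 * NUM_LEDS) % NUM_LEDS
    frame = [OFF] * NUM_LEDS
    for offset in range(-(PUPIL_WIDTH // 2), PUPIL_WIDTH // 2 + 1):
        frame[(centre + offset) % NUM_LEDS] = PUPIL_COLOR
    return frame

def blink_animation(num_frames=6):
    """Simple 'blink' indicator: the whole ring dims and brightens again."""
    frames = []
    for i in range(num_frames):
        level = abs(math.cos(math.pi * i / (num_frames - 1)))
        frames.append([(0, int(120 * level), int(255 * level))] * NUM_LEDS)
    return frames

# Example: a gaze toward the robot's right side of the ring (90 degrees).
right_gaze = pupil_frame(90)
```

Each frame would then be pushed to the LED driver at the animation's frame rate; combinations of gaze position, blinks, and colour could distinguish the different indicators tested in the study.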