
Using LED Gaze Cues to Enhance Underwater Human-Robot Interaction


Published Date

2022-05

Type

Presentation

Abstract

In the underwater domain, conventional methods of communication between divers and Autonomous Underwater Vehicles (AUVs) are heavily impeded. Radio signal attenuation, water turbidity (cloudiness), and low light levels make it difficult for a diver and an AUV to relay information to each other. Current solutions such as underwater tablets, slates, and tags are not intuitive and introduce additional logistical challenges and points of failure. Intuitive human-robot interaction (HRI) is imperative to ensuring seamless collaboration between AUVs and divers. Eye gaze is a natural way for humans to relay information, yet it remains an underutilized channel of communication in AUVs, while lights help eliminate concerns of darkness, turbidity, and signal attenuation that often impair diver-robot collaboration. This research implements eye gazes on LoCO (a low-cost AUV) using RGB LED rings in order to pursue intuitive forms of HRI underwater while overcoming common barriers to communication. To test the intuitiveness of the design, 9 participants with no prior knowledge of LoCO or HRI were tasked with recalling the meanings of each of 16 gaze indicators during pool trials, after being exposed to the indicators 3 to 4 days earlier. Compared to the baseline text display communication, which had a recall of 100%, recall for most eye gaze animations was high, with an accuracy score of at least 80% for 11 of the 16 indicators. These results suggest that certain eye indicators convey information more intuitively than others, and that additional training could make gaze indicators a viable method of communication between humans and robots.
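The abstract describes gaze indicators rendered on RGB LED rings. The paper's actual animation code is not part of this record; the following is only a minimal sketch of how a ring-based "pupil" pattern might be computed, where the LED count, colors, and the `gaze_pattern` helper are all illustrative assumptions rather than the authors' implementation.

```python
def gaze_pattern(num_leds, gaze_angle_deg, pupil_width=3):
    """Return per-LED RGB colors for one frame of a ring 'gaze'.

    Lights a contiguous 'pupil' of pupil_width LEDs centered on the
    LED nearest the requested gaze angle; all other LEDs stay dark.
    (Hypothetical helper, for illustration only.)
    """
    # Map the gaze angle onto the nearest LED index around the ring.
    center = round(gaze_angle_deg / 360 * num_leds) % num_leds
    half = pupil_width // 2
    pupil = {(center + off) % num_leds for off in range(-half, half + 1)}
    # White pupil on an unlit ring; a real animation could fade or blink.
    return [(255, 255, 255) if i in pupil else (0, 0, 0)
            for i in range(num_leds)]

# One frame "looking" toward 90 degrees on an assumed 16-LED ring.
frame = gaze_pattern(16, 90)
```

Each frame is just a list of RGB tuples, so the same computation could drive any addressable LED ring; the intuitiveness results reported above concern which such animations divers can recall, not the rendering itself.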

Funding information

This research was supported by the University of Minnesota Undergraduate Research Opportunities Program (UROP). Special thanks to the Interactive Robotics and Vision Laboratory (IRV Lab), whose guidance and expertise in HRI made this UROP possible. For more information on the IRV Lab and the LoCO AUV, visit https://irvlab.cs.umn.edu/

Suggested citation

Prabhu, Aditya; Fulton, Michael; Sattar, Junaed. (2022). Using LED Gaze Cues to Enhance Underwater Human-Robot Interaction. Retrieved from the University Digital Conservancy, https://hdl.handle.net/11299/227303.
