Gesture: A Cyberphysical system to interpret American Sign Language
Authors
Sievert, Scott
Published Date
2015
Publisher
Type
Thesis or Dissertation
Abstract
Language obstacles often exist when hearing-impaired individuals interact with others who are not well-versed in sign language. We propose and develop a system that uses commercially available, wireless armbands capable of detecting acceleration, orientation, and electromyography (muscle activity) data. Our interpretation system is designed to translate gestures in a "word-by-word" manner by assuming that each sign language sign corresponds to a unique English word, which eliminates inter-word dependence and reduces the task to a classification problem. Using convolutional neural networks as our interpretation algorithm, we achieve a high classification accuracy of roughly 90% for a dictionary of 7 words as well as a "null" word.
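The setup described above (fixed-length windows of multichannel armband data mapped by a convolutional network to one of 8 labels: 7 dictionary words plus the "null" word) can be sketched in PyTorch. This is an illustrative sketch, not the thesis's implementation; the channel count, window length, and layer sizes are assumptions chosen for the example.

    # Minimal 1-D CNN sketch for word-by-word sign classification.
    # Assumed (not from the thesis): 15 input channels (3 accelerometer,
    # 4 orientation, 8 EMG), 100-sample windows, and these layer sizes.
    import torch
    import torch.nn as nn

    NUM_CHANNELS = 15   # assumed: 3 accel + 4 orientation + 8 EMG
    WINDOW_LEN = 100    # assumed samples per gesture window
    NUM_CLASSES = 8     # 7 dictionary words + 1 "null" word

    class GestureCNN(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(NUM_CHANNELS, 32, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.MaxPool1d(2),
                nn.Conv1d(32, 64, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.MaxPool1d(2),
            )
            # Two pooling layers shrink the time axis by a factor of 4.
            self.classifier = nn.Linear(64 * (WINDOW_LEN // 4), NUM_CLASSES)

        def forward(self, x):
            # x: (batch, channels, time) window of sensor samples
            h = self.features(x)
            return self.classifier(h.flatten(1))

    # Example: classify one random sensor window.
    model = GestureCNN()
    logits = model(torch.randn(1, NUM_CHANNELS, WINDOW_LEN))
    word_index = logits.argmax(dim=1)  # index into the 8-word dictionary

Because each sign is assumed to map to a unique English word, a single softmax over the dictionary (plus the "null" class for non-signs) suffices; no sequence model across words is needed.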
Suggested citation
Sievert, Scott. (2015). Gesture: A Cyberphysical system to interpret American Sign Language. Retrieved from the University Digital Conservancy, https://hdl.handle.net/11299/173993.