Sievert, Scott
2015-08-21
2015
https://hdl.handle.net/11299/173993
Language obstacles often exist when hearing-impaired individuals interact with others who are not well versed in sign language. We propose and develop a system that utilizes commercially available, wireless armbands capable of detecting acceleration, orientation, and electromyography (muscle activity) data. Our interpretation system is designed to translate gestures in a "word-by-word" manner by assuming that each sign language sign corresponds to a unique English word, which eliminates inter-word dependence and makes this a classification problem. Using convolutional neural networks as our interpretation algorithm, we achieve high classification accuracy of roughly 90% for a dictionary of 7 words as well as a "null" word.
en
Cum Laude
Electrical Engineering
College of Science and Engineering
Gesture: A Cyberphysical system to interpret American Sign Language
Thesis or Dissertation