This paper describes the design and implementation of a communication tool for persons with speech and hearing disabilities. The tool provides the user with a human-computer interface capable of capturing and recognizing gestures belonging to the Mexican Spanish Sign Alphabet. To capture manual expressions, a data-glove was constructed that senses the position of fifteen articulations of one of the user's hands. A location system that detects the position and movements of the hand with respect to the user's body was also constructed. The data-glove and location-system signals are processed by a pair of programmable automatons, whose outputs are sent to a personal computer that performs the gesture recognition and interpretation tasks. Artificial neural network techniques are used to implement the mapping from the space of information generated by the instruments to the interpretation space, where the representations of the gestures are found. Once a gesture is captured and interpreted, it is presented in written form on a screen mounted on the user's clothing, and in verbal form through a speaker.
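As a rough illustration of the neural mapping described above, the sketch below implements a small feed-forward network that maps a feature vector (the fifteen glove articulation readings plus an assumed three-coordinate hand position from the location system) to scores over gesture classes. The layer sizes, the class count, and the use of random untrained weights are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

N_FLEX = 15      # articulation sensors on the data-glove (from the paper)
N_POS = 3        # hand position relative to the body (assumed x, y, z)
N_HIDDEN = 32    # hidden units (assumption)
N_GESTURES = 27  # gesture classes, e.g. alphabet letters (assumption)

# Untrained random weights stand in for the trained network.
rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.1, (N_FLEX + N_POS, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0, 0.1, (N_HIDDEN, N_GESTURES))
b2 = np.zeros(N_GESTURES)

def recognize(features: np.ndarray) -> int:
    """Forward pass: sensor feature vector -> index of the best-scoring gesture."""
    h = np.tanh(features @ W1 + b1)   # hidden layer
    scores = h @ W2 + b2              # one score per gesture class
    return int(np.argmax(scores))

# Simulated, normalized sensor frame from the glove and location system.
sample = rng.uniform(0.0, 1.0, N_FLEX + N_POS)
gesture_id = recognize(sample)
print(gesture_id)
```

In the deployed system, the recognized class index would then drive the on-body text display and the speech output.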