ASL-Robot

From Lofaro Lab Wiki
Revision as of 10:28, 11 May 2016 by Agoldsto (Talk | contribs)


[Image: ASL body.jpg]

The ASL-Robot, or American Sign Language Robot, is a prototype robot designed to provide deaf children with an educational sign language resource. The arms are designed to have 80% of the dexterity of a typical human and to perform signs at a rate of 2.3 signs per second. The robot consists of both 3D-printed parts and machined components made from 6061 aluminum, and the arms are mounted on a wooden stand. The arms and hands are actuated by servos from the Dynamixel series: each hand uses 11 XL-320 servos daisy-chained together, while each arm uses an AX-12A, an MX-28T, an MX-64T, and five MX-106Ts daisy-chained together. The servos receive power from, and communicate with, the MSP432 microcontrollers through a PCB that incorporates all necessary power regulation and digital logic; the board carries movement commands to the servos and servo feedback data back to the controllers.

Through the MSP432s, the main controller has an interface for commanding the motors while it runs other programs, such as the user interface and the collision software. When movement commands arrive, the MSP432s convert the radian values received from the main controller into tick values the motors can understand, then send this data to the motors according to the motors' own protocol. After issuing a movement, they begin reading positional values back from the motors, which can then be sent to the main controller upon its request. Finally, the MSP432s implement an emergency-stop feature that holds the motors in place when the main controller signals an emergency. Together with the collision software on the main controller, this feature allows the system to mitigate self-collision errors.
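The radian-to-tick conversion described above can be sketched as follows. This is a minimal illustration in Python rather than MSP432 firmware code; the function name is ours, and it assumes the 4096-ticks-per-revolution resolution of the MX-series servos (the AX-12A uses a different 1024-tick, 300-degree range, which is not handled here):

```python
import math

def radians_to_ticks(angle_rad, ticks_per_rev=4096):
    """Convert a joint angle in radians to a Dynamixel goal-position tick.

    Assumes an MX-series servo with `ticks_per_rev` positions spanning a
    full revolution, with the tick at half range treated as zero radians.
    The result is clamped to the servo's valid position range.
    """
    center = ticks_per_rev // 2
    ticks = center + round(angle_rad * ticks_per_rev / (2 * math.pi))
    return max(0, min(ticks_per_rev - 1, ticks))
```

For example, an elbow command of pi/2 radians would map to tick 3072 on an MX-106T, which the MSP432 would then pack into a Dynamixel goal-position packet.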

A major goal of this project is to teach the robot signs by mapping motion capture data from an RGB-D sensor. During motion capture, the positions of the upper-body joints and facial features are recorded over time for each sign. The raw data from the sensor is imported into MATLAB and filtered to make the motion smoother. The smoothed data is then mapped onto the robot using its physical measurements.
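One common way to stabilize noisy motion-capture trajectories, as the MATLAB filtering step does, is a centered moving average over each joint coordinate. The sketch below (in Python rather than MATLAB, with a hypothetical `smooth` function; the window size is an assumption for illustration) shows the idea on a single coordinate stream:

```python
def smooth(samples, window=5):
    """Centered moving average over a sequence of joint-position samples.

    Each output value is the mean of up to `window` neighboring samples,
    with the window shrinking near the ends of the sequence so the output
    has the same length as the input.
    """
    half = window // 2
    out = []
    n = len(samples)
    for i in range(n):
        lo = max(0, i - half)
        hi = min(n, i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out
```

In practice this would be applied independently to each recorded coordinate of each joint before the smoothed trajectory is scaled to the robot's dimensions.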


Tutorials

Code

Videos

Hand Speed Test

20 Letters of ASL Alphabet

Dexterity Test

Machining Tutorial

Building the ASL-Robot