ASL-Robot

From Lofaro Lab Wiki
Revision as of 20:49, 10 May 2016

[[File:ASL body.jpg]]

The ASL-Robot, or American Sign Language Robot, is a prototype robot designed to provide deaf children with an educational sign-language resource. The arms are designed to achieve 80% of the dexterity of a typical human arm and to perform signs at a rate of 2.3 signs per second. The robot is built from both 3D-printed parts and machined components made of 6061 aluminum. The arms and hands are actuated by Dynamixel servos: each hand uses 11 XL-320 servos daisy-chained together, while each arm uses an AX-12A, an MX-28T, an MX-64T, and five MX-106Ts daisy-chained together. The arms are mounted on a wooden stand, and the robot is controlled by an MSP432 microcontroller.
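The servo counts above imply a sizable bus: the sketch below tallies the full inventory from the figures in the text. The sequential bus-ID assignment is an assumption for illustration only; the actual ID layout used on the robot is not given in this page.

```python
# Servo inventory implied by the description above.
# Counts come from the text; the bus-ID scheme is a hypothetical example.
HAND = ["XL-320"] * 11                                    # per hand, daisy-chained
ARM = ["AX-12A", "MX-28T", "MX-64T"] + ["MX-106T"] * 5    # per arm, daisy-chained

def build_inventory():
    """Assign sequential bus IDs to every servo on both hands and arms."""
    chain = (HAND + ARM) * 2  # left and right sides
    return {bus_id: model for bus_id, model in enumerate(chain, start=1)}

inventory = build_inventory()
print(len(inventory))  # 38 servos in total: (11 + 8) per side, two sides
```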

A major goal of this project is to teach the robot signs by mapping motion-capture data recorded with an RGB-D sensor. For each sign, motion capture records the positions of the upper-body joints and facial features over time. The raw sensor data is imported into MATLAB, where it is smoothed to reduce noise and stabilize the motion, and the smoothed data is then mapped onto the robot using its physical measurements.
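The smooth-then-map pipeline can be sketched as follows. This is a minimal illustration, not the page's MATLAB code: the moving-average filter, the 0–4095 tick range (typical of Dynamixel MX-series position registers), and the sample joint angles are all assumptions.

```python
# Sketch of the smoothing + mapping pipeline described above.
# Assumptions (not from the source): a centered moving-average filter,
# and servos with a 4096-tick position range over one full revolution.

def moving_average(samples, window=5):
    """Smooth a 1-D sequence of joint positions with a centered moving average."""
    half = window // 2
    smoothed = []
    for i in range(len(samples)):
        lo = max(0, i - half)
        hi = min(len(samples), i + half + 1)
        smoothed.append(sum(samples[lo:hi]) / (hi - lo))
    return smoothed

def angle_to_ticks(angle_deg, ticks_per_rev=4096):
    """Map a joint angle in degrees to a servo goal-position tick."""
    return int(round((angle_deg % 360.0) / 360.0 * ticks_per_rev))

# Hypothetical raw mo-cap elbow angles with one noisy spike (55.0).
elbow = [10.0, 14.0, 55.0, 18.0, 22.0]
smoothed = moving_average(elbow, window=3)
goal_ticks = [angle_to_ticks(a) for a in smoothed]
print(angle_to_ticks(90.0))  # -> 1024 (a quarter turn)
```

Smoothing before mapping matters because a single-frame sensor spike, if sent straight to the servos, becomes a violent jerk in the arm; averaging over a short window trades a little latency for much smoother motion.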


==Tutorials==

==Code==

* [[Embedded System Control and Setup]]
* [[Data Smoothing and Position Mapping]]

==Videos==

==Building the ASL-Robot==