ASL-Robot

From Lofaro Lab Wiki
 
[[File:ASL body.jpg]]
  
The ASL-Robot, or American Sign Language Robot, is a prototype robot designed to provide deaf children with an educational sign language resource. The arms are designed to have 80% of the dexterity of a typical human and to perform signs at a rate of 2.3 signs per second. The robot consists of both 3D-printed parts and machined components made from 6061 aluminum, and the arms are mounted to a wooden stand. The arms and hands are actuated by servos from the Dynamixel series: each hand consists of 11 XL-320 servos daisy-chained together, while each arm consists of an AX-12A, an MX-28T, an MX-64T, and five MX-106Ts daisy-chained together. The servos receive power from, and communicate with, the MSP432 microcontrollers through a PCB that incorporates all necessary power regulation and digital logic, receiving commands and feeding back servo data.
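As an illustration of how daisy-chained servos share a single bus, the sketch below builds a Dynamixel Protocol 1.0 packet that addresses one servo by its bus ID (Protocol 1.0 covers the AX and MX series; the XL-320 hands use Protocol 2.0, which frames packets differently). The framing and control-table address follow the published Dynamixel protocol; the function name and buffer layout are illustrative, not taken from the project's firmware.

```c
#include <stdint.h>
#include <stddef.h>

/* Build a Dynamixel Protocol 1.0 WRITE packet that sets the goal
 * position of the servo with the given bus ID. Daisy-chained servos
 * share one bus; each servo ignores packets addressed to other IDs.
 * Returns the total packet size written into buf. */
size_t build_goal_position_packet(uint8_t id, uint16_t ticks, uint8_t *buf)
{
    const uint8_t GOAL_POSITION_ADDR = 30;      /* AX/MX control table */
    uint8_t params[3] = { GOAL_POSITION_ADDR,
                          (uint8_t)(ticks & 0xFF),   /* low byte  */
                          (uint8_t)(ticks >> 8) };   /* high byte */
    uint8_t length = 2 + sizeof(params);  /* instruction + params + checksum */

    buf[0] = 0xFF; buf[1] = 0xFF;         /* packet header */
    buf[2] = id;
    buf[3] = length;
    buf[4] = 0x03;                        /* WRITE instruction */
    uint8_t sum = (uint8_t)(id + length + 0x03);
    for (size_t i = 0; i < sizeof(params); i++) {
        buf[5 + i] = params[i];
        sum = (uint8_t)(sum + params[i]);
    }
    buf[5 + sizeof(params)] = (uint8_t)~sum;  /* checksum = ~(sum) & 0xFF */
    return 6 + sizeof(params);
}
```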
 
Through the MSP432s, the main controller has an interface for interacting with the motors while it runs other programs, such as the user interface and the collision software. When movement commands arrive at the MSP432s, they convert the radian values received from the main controller into tick values that the motors understand, then send this data to the motors according to the motors' own protocol. After issuing a movement, they begin reading positional values back from the motors, which can be returned to the main controller on request. Finally, the MSP432s implement an emergency-stop feature that holds the motors in place when the main controller signals an emergency. Together with the collision software on the main controller, this emergency-stop feature lets the system mitigate self-collision errors.
A major goal of this project is to teach the robot signs by mapping motion-capture data recorded with an RGB-D sensor. For any particular sign, motion capture records position data for the upper-body joints and facial features over time. The raw data from the sensor is imported into MATLAB and stabilized to make the motion smoother; the smoothed data is then mapped onto the robot using its measurements.
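The stabilization step is performed in MATLAB on the recorded joint positions; as a language-agnostic sketch of the idea, a simple centered moving-average filter looks like this (the window size and the choice of filter are illustrative assumptions, not the project's actual algorithm):

```c
#include <stddef.h>

/* Smooth a sequence of recorded joint positions with a centered
 * moving average. Near the ends of the sequence the window shrinks
 * so that out-of-range samples are never read. */
void smooth_positions(const double *in, double *out,
                      size_t n, size_t half_window)
{
    for (size_t i = 0; i < n; i++) {
        size_t lo = (i >= half_window) ? i - half_window : 0;
        size_t hi = (i + half_window < n) ? i + half_window : n - 1;
        double sum = 0.0;
        for (size_t j = lo; j <= hi; j++)
            sum += in[j];
        out[i] = sum / (double)(hi - lo + 1);
    }
}
```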
==Tutorials==
* [[Kinect Mo-cap Tutorial]]
 
* [[Gazibo Simulation Tutorial|Gazebo Simulation Tutorial]]
* [[Circuit Design Tutorial]]
* [http://wiki.lofarolabs.com/index.php/3D_Printer_Guide 3D Printing Tutorial]
 
* [[Aluminum Machining Tutorial]]
  
 
==Code==
* [https://www.dropbox.com/sh/foj73qtu62fmhnr/AABqVRjED1zy2-upWYiTLh8Ja?dl=0 ASL Robot Part Files]
 
* [[Embedded System Control and Setup]]
* [[Data Smoothing and Position Mapping]]
  
 
==Videos==
[https://www.youtube.com/watch?v=u_AHaE-OjJc Hand Speed Test]

[https://www.youtube.com/watch?v=fpg0jjAH6YE 20 Letters of ASL Alphabet]

[https://www.youtube.com/watch?v=LMCEudppmnU&feature=youtu.be Dexterity Test]

[https://www.youtube.com/watch?v=MZjzHlPi984 Machining Tutorial]
 
==Building the ASL-Robot==
Latest revision as of 11:28, 11 May 2016
