====== Robotic Arm ======
{{tag>project robot}}

The robotic arm is part of the RoboCup project. Its purpose is to detect specific objects, grasp them, and move them to another position.

The arm is composed of four Dynamixel servomotors (two EX-106+, one RX-64 and one MX-28), which communicate via the USB2Dynamixel device.

INSERT ARM PHOTO HERE

The hand was inspired by a real human hand. There are two tendons for each finger, allowing a wide range of movement. It is composed of 12 DS95 servomotors (only 10 are actually used).

INSERT HAND PHOTO HERE
- 
===== Requirements =====

The arm must be connected to a computer running a Linux operating system. The project also needs [[ROS]] installed, with the Indigo distribution.

You must also have these ROS packages installed:
  * [[robotic_arm package]]
  * [[robotic_arm_controler package]]
  * [[objects_detection package]]
  * [[kinect_aux_robotic_arm package]]
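Before starting, it can help to verify that all four packages from the list above are on the ROS package path. The following is only a sketch: the `check_pkgs` helper name is hypothetical, and it assumes the DokuWiki page names above match the actual ROS package names.

```shell
#!/bin/sh
# Hypothetical sanity check: report whether each required ROS package
# (names taken from the list above) can be found by rospack.
check_pkgs() {
    for pkg in robotic_arm robotic_arm_controler objects_detection kinect_aux_robotic_arm; do
        if command -v rospack >/dev/null 2>&1 && rospack find "$pkg" >/dev/null 2>&1; then
            echo "found: $pkg"
        else
            echo "missing: $pkg"
        fi
    done
}

check_pkgs
```

If any package is reported missing, install it (and source your workspace's `setup.bash`) before following the instructions below.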
- 
===== Connections =====

{{::schema_electronique_bras.png?1000|}}

==== Maple Mini connections ====
{{::maple_mini_connections.png|}}
==== Hand connections ====

{{::2016-02-23_11.31.24.jpg?300|}}
{{:2016-02-23_11.31.40.jpg?300|}}
{{:2016-02-23_11.31.51.jpg?300|}}
- 
===== Instructions =====
{{tag>tutorial}}

  * Connect the USB2Dynamixel device and the Maple Mini to the computer.
  * Connect the Kinect to the computer.
  * Launch openni.launch from the [[opennilaunch | openni_launch package]].
  * Run kinect_aux_robotic_arm from the [[kinect_aux_robotic_arm package]].
  * Set the tilt angle of the Kinect to -50° using the [[kinect_aux_robotic_arm package]].
  * Power the arm and the hand. You must use a laptop charger for the arm and a 6 V external supply for the hand (the current can reach 4 A during movement).
  * Launch robotic_arm.launch from the [[robotic_arm package]].
  * Launch robotic_arm.launch from the [[robotic_arm_controler package]].
  * At this point, you can enter the following command to see the objects detected by the Kinect:

  rosrun image_view image_view image:=/object_detection/red_objects

  * If at least one object is detected, you can move it by giving the target coordinates for the object:

  rostopic pub -1 /arm_controler/new_coordinates geometry_msgs/Pose "{position:[X, Y, Z], orientation:[0, 0, 0, 0]}"
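For repeated tests, the move command above can be wrapped in a small shell helper that fills in the target coordinates. This is only a sketch: the topic name and message layout are copied from the command above, and `make_move_cmd` is a hypothetical helper name (it prints the command rather than running it, so you can inspect it first).

```shell
#!/bin/sh
# Hypothetical helper: build the rostopic command that sends target
# coordinates (X, Y, Z) to the arm controller, as in the step above.
make_move_cmd() {
    printf 'rostopic pub -1 /arm_controler/new_coordinates geometry_msgs/Pose "{position:[%s, %s, %s], orientation:[0, 0, 0, 0]}"\n' "$1" "$2" "$3"
}

# Example target (values are illustrative only); pipe to sh to execute.
make_move_cmd 0.10 0.00 0.05
```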
- 
__''**WARNING**''__

During the tests, make sure that none of the Dynamixels' red LEDs are lit and that the current drawn from the external supply does not exceed 6 A. If either happens, unplug the chargers. Also, be careful when unplugging the arm charger (the laptop charger): the arm will fall, so hold it before unplugging.
- 
===== Source code =====
{{tag>software}}

Code for the Maple Mini. It opens or closes the hand each time it receives an instruction over the USB serial port.

{{:maple_mini.zip|}}

Code::Blocks project developed on Ubuntu. It was used to control the arm (old; it is better to use the ROS packages).

{{:bras_robotique_console.zip|}}
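Since the firmware reacts to instructions on the USB serial port, the hand can be toggled from a shell. The following is only a sketch: the device path, baud rate, and instruction byte are all assumptions, so check the firmware sources in maple_mini.zip for the actual protocol before using it.

```shell
#!/bin/sh
# Sketch: send one instruction byte to the Maple Mini firmware above.
# ASSUMPTIONS: device path, 9600 baud, and 'o' as the instruction byte.
DEV="${HAND_SERIAL:-hand_serial_demo.txt}"   # on the robot: e.g. /dev/ttyACM0

# Configure the port only when DEV is a real character device.
[ -c "$DEV" ] && stty -F "$DEV" 9600 raw -echo

# The firmware opens or closes the hand on each instruction it receives.
printf 'o' > "$DEV"
```

With `HAND_SERIAL` unset, the byte is written to a local demo file instead of a serial port, so the sketch can be tried without the hardware attached.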
- 
  
robocup/robotic_arm.txt · Last modified: 2019/04/25 14:08