{{tag>Poppy Project}}
====== Poppy-Kine : S5 project 2015-2016 ======

===== Overview =====

{{http://breddydotorg.files.wordpress.com/2013/07/shoulder-flexion-and-extension.png?100}}
The main objective of this project is to help patients with **functional rehabilitation** exercises.

Patient A's physiotherapist shows him/her movements that A will have to repeat more or less regularly, //at home and without the doctor//.
\\ Therefore, to aid memorization, the movement is demonstrated by a **Poppy robot** (see the [[http://poppy-project.org/|Poppy project]] and the [[poppy|Poppy robot wiki page]]).
\\ First, the moves are performed in front of a **Kinect camera** (v2).
\\ The movements (captured as the positions of the skeleton joints) are saved, then replayed by the robot.
\\ The patient then has to reproduce the movement (in front of the Kinect), and has to repeat it or not, depending on the quality of the repetition.

----
===== How it works =====

  - (Step zero: install the Kinect camera and the //pyKinect// Python library on the computer. Already installed on the lab's computer.)
  - First step: capture the skeleton (Kinect v2) as **Cartesian** coordinates (x, y, z)
  - Second step: convert the skeleton data into the Poppy robot's reference frame (**angular**)
\\ __**NB:**__
  * If you want to **make Poppy rigid**, we wrote a program that puts it in the //zero position// (i.e. all the motor angles are set to zero): **Init_poppy.py**
  * Don't forget to **plug Poppy in**!
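A minimal sketch of such a zero-position script, assuming the //pypot// library that the Poppy software stack is built on (the motor loop and timings are illustrative; the actual **Init_poppy.py** may differ):

<code python>
# Sketch of a "zero position" script for Poppy, using pypot
# (assumption: this is NOT the actual contents of Init_poppy.py).

def zero_positions(motor_names):
    """Build the goal positions: every motor angle set to 0 degrees."""
    return {name: 0.0 for name in motor_names}

if __name__ == "__main__":
    # Requires a connected (and plugged-in!) Poppy Humanoid.
    from pypot.creatures import PoppyHumanoid

    poppy = PoppyHumanoid()
    for motor in poppy.motors:
        motor.compliant = False          # make the motors rigid
        motor.goto_position(0.0, 2.0)    # reach 0 degrees in 2 seconds
</code>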

==== First step: getting the Skeleton ====
{{:kinect_skeleton.png?200 |}}
  * Plug the Kinect camera into the computer **before** turning it on (otherwise you will probably need to restart the computer)
  * Connect the Kinect camera
\\ 
\\ 
\\ 
The Kinect recovers the positions of the following joints:
{{https://naokinect.files.wordpress.com/2012/02/image_thumb_6f4828ec1.png}}
==== Second step: the movement ====
All the programs we wrote are in the **C:\Users\ihsev\Documents\projetS5\poppy-humanoid\software\poppy_humanoid/** folder.
=== First option: manually save, convert, then play the movement ===
  * Save the movement
<hidden> Click **save.bat**
\\ Enter the //name of the exercise// when the shell asks for it.
\\ A new window should appear, showing the video captured by the Kinect camera and the skeleton of the person in front of it (the first time, it can take a while... be patient).
\\ Movements are separated by //space key// presses.
\\ 
\\  For example:
  * //space key//: starts the capture of the first movement,
  * do the movement...
  * //space key//: stops the capture of the first movement,
  * //space key//: starts the capture of the second movement,
  * etc.
The movements are then saved in the **/exercices/<name_of_the_exercise>/** folder, in the <name_of_the_exercise_x>.txt file.
\\ x starts from zero (first movement).
\\ To finish capturing the movements, just close the window (click on the red cross).
</hidden>

  * Convert the angles
<hidden> Click **convert.bat**
\\ Enter the name of the exercise (the name of the folder) when the shell asks for it.
\\ The program will convert all the movements saved in the corresponding folder.
\\ The Cartesian coordinates and the quaternions captured by the Kinect are saved in .txt files.
\\ When the program converts the coordinates into angles for Poppy, it saves new files:
\\ For example, //movement_0.txt// will be converted into a new file, //movement_0_poppy.txt//, in the same folder.
</hidden>

  * Make Poppy play the movement
<hidden> Click **play.bat**
\\ Enter the name of the exercise when the shell asks for it: for example //movement//
\\ Enter the number of the exercise when the shell asks for it: for example //0// if you want to play the first movement
</hidden>
\\ 

=== Second option: save then play the movement ===
<hidden> Click **save_and_play.bat**
\\ Enter the name of the exercise when the shell asks for it.
\\ A new window should appear, showing the video captured by the Kinect camera and the skeleton of the person in front of it (the first time, it can take a while... be patient).
\\ Movements are separated by //space key// presses.
\\ 
\\  For example:
  * //space key//: starts the capture of the first movement,
  * do the movement...
  * //space key//: stops the capture of the first movement,
  * //space key//: starts the capture of the second movement,
  * etc.
To finish capturing the movements, just close the window (click on the red cross).
The movements are then saved in the **/exercices/<name_of_the_exercise>/** folder, in the <name_of_the_exercise_x>.txt file.
\\ x starts from zero (first movement).
\\ The program will then convert the angles into Poppy's reference frame.
\\ Poppy can then play the movement; just specify which one (at the prompt).
</hidden>
\\ 
----
===== How we proceeded =====
All the programs we wrote are stored in the **projets5/poppy_humanoid/software/poppy_humanoid/** folder.
==== First step: getting the Skeleton ====
We used the //pykinect2// library in Python.
\\ The code is written in the **poppy_kinect_save_mouvement.py** file.
\\ For each movement, a .txt file is saved in the **/exercices/<name_of_the_exercise>/** folder, as a <name_of_the_exercise_x>.txt file (x starts from zero, i.e. the first movement).
\\ Each .txt file is a //Python dictionary// stored in //json// format (so it can be read back as a dictionary and not just as plain text, and the data can be processed for the conversion).
\\ 
\\ The **Cartesian coordinates** are written in the **"positions"** subdictionary, and each time unit (which depends on the framerate) has its own dictionary.
\\ The **quaternions** are written in the **"orientations"** subdictionary, and each time unit (which depends on the framerate) has its own dictionary.
\\ 
\\ Basically, the structure of a .txt file (i.e. the dictionary that stores the movement) is:
<code>
{
"framerate": "<framerate>",
"orientations": { "<time1>": { "<joint1>": [x,y,z,w], "<joint2>": [x,y,z,w] }, "<time2>": { "<joint1>": [x,y,z,w], "<joint2>": [x,y,z,w] } },
"positions": { "<time1>": { "<joint1>": [x,y,z], "<joint2>": [x,y,z] }, "<time2>": { "<joint1>": [x,y,z], "<joint2>": [x,y,z] } }
}
</code>
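Such a file can be read back with the standard //json// module. A minimal sketch (the file content, joint name, and time keys below are illustrative):

<code python>
# Sketch: loading a saved movement file and iterating over its frames
# (the raw string stands in for an actual /exercices/.../*.txt file).
import json

raw = '''
{
"framerate": "30",
"positions": {"0": {"ShoulderRight": [0.1, 0.4, 2.0]},
              "1": {"ShoulderRight": [0.1, 0.5, 2.0]}},
"orientations": {"0": {"ShoulderRight": [0.0, 0.0, 0.0, 1.0]},
                 "1": {"ShoulderRight": [0.0, 0.0, 0.0, 1.0]}}
}
'''

movement = json.loads(raw)          # in practice: json.load(open(path))
framerate = float(movement["framerate"])

# Frames are keyed by time unit; sort them numerically to replay in order.
for t in sorted(movement["positions"], key=float):
    for joint, (x, y, z) in movement["positions"][t].items():
        pass  # each joint gives one Cartesian position (x, y, z)
</code>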

==== Second step: convert the movement ====
\\ The Kinect camera captures the Cartesian coordinates of the person doing the movement. However, to make Poppy play a movement, you need to "send" it angles for each of its motors.
\\ 
\\ "Poppy, move your //r_shoulder_x// motor by 30 degrees (please)!"
\\ 
\\ So, you need to **convert the Cartesian coordinates into angles**. How?!
<hidden>
{{:anglesbras.png?200 |}}
\\ Take 4 points: they define 2 vectors.
\\ You can then calculate the angle between the 2 vectors using Euclidean geometry (the Python functions needed are defined in the **calculangles.py** file).
\\ Here, the main difficulty is not calculating the angles, but finding __which__ vectors to choose. Indeed, we need to consider all the possible configurations in 3D space and find the right formula.</hidden>
\\ You can find our approximations in the **Kinect_to_Poppy_angles.py** file.
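As an illustration of the vector approach, a sketch with a hypothetical function name (the project's own helpers are in **calculangles.py**):

<code python>
# Sketch: angle between the two vectors defined by 4 skeleton points.
import math

def angle_between(p1, p2, p3, p4):
    """Angle (degrees) between vectors p1->p2 and p3->p4."""
    u = [b - a for a, b in zip(p1, p2)]
    v = [b - a for a, b in zip(p3, p4)]
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    # Clamp to [-1, 1] to avoid math domain errors from rounding.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
</code>

For example, with the shoulder at the origin, the vector to the elbow along x and the vector to the hip along y give an angle of 90 degrees.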
\\ 
\\ For each movement, a .txt file is saved in the **/exercices/<name_of_the_exercise>/** folder, as a **<name_of_the_exercise_x>_poppy.txt** file (x starts from zero, i.e. the first movement).
\\ Each .txt file is a //Python dictionary// stored in //json// format (so it can be read back as a dictionary and not just as plain text, and processed for Poppy to play the movement).
\\ Basically, the structure of a .txt file (i.e. the dictionary that stores the movement) is:
<code>
{
"framerate": "<framerate>",
"positions": { "<time1>": { "<motor1>": [angle,0], "<motor2>": [angle,0] }, "<time2>": { "<motor1>": [angle,0], "<motor2>": [angle,0] } }
}
</code>
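Replaying such a file amounts to sending each motor its angle, frame by frame. A minimal sketch, assuming the //pypot// library and its **PoppyHumanoid** class (the function names and the handling of the second value in each [angle,0] pair are illustrative, not the actual play script):

<code python>
# Sketch: replaying a converted *_poppy.txt file on the robot
# (assumption: pypot-based; not the actual play.bat script).
import json
import time

def frame_period(framerate):
    """Seconds to wait between two frames."""
    return 1.0 / float(framerate)

def play(path):
    # Requires a connected (and plugged-in!) Poppy Humanoid.
    from pypot.creatures import PoppyHumanoid
    poppy = PoppyHumanoid()

    with open(path) as f:
        movement = json.load(f)
    period = frame_period(movement["framerate"])

    # Send each motor its target angle, one frame per time unit.
    for t in sorted(movement["positions"], key=float):
        for motor_name, (angle, _unused) in movement["positions"][t].items():
            getattr(poppy, motor_name).goal_position = angle
        time.sleep(period)
</code>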
\\ 
<hidden> Another way to get the angles: use the **quaternions**
\\ A more accurate way to get the angles is to use the quaternions captured by the Kinect camera.
\\ It requires strong geometrical skills, and unfortunately we did not manage to do it successfully.
</hidden>
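As a starting point for that quaternion route, the standard conversion from a unit quaternion to Euler angles is sketched below (Kinect orientations are stored as [x, y, z, w]; mapping the resulting angles onto Poppy's motors remains the hard part mentioned above):

<code python>
# Sketch: standard unit quaternion -> Euler angles (roll, pitch, yaw).
import math

def quaternion_to_euler(x, y, z, w):
    """Convert a unit quaternion [x, y, z, w] to Euler angles in degrees."""
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # Clamp the asin argument to [-1, 1] against rounding errors.
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return tuple(math.degrees(a) for a in (roll, pitch, yaw))
</code>

For example, the identity quaternion [0, 0, 0, 1] gives (0, 0, 0), and a 90-degree rotation about the z axis gives a yaw of 90 degrees.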
----
\\ 
===== For further improvements =====
The angles we calculated are unfortunately not accurate enough for complex movements. That is why using quaternions could be another way to improve the conversion.
\\ Moreover, GMMs (Gaussian mixture models) and GMR (Gaussian mixture regression) could be used to synthesize samples of the same movement, and to have Poppy, and then the patient, reproduce it.