
Poppy-Kine: S5 project 2015-2016

The main objective of this project is to help patients with functional rehabilitation exercises.

Patient A's physiotherapist shows them movements that A will need to repeat regularly, at home and without the doctor present.
To help A memorize them, the movements are performed by a Poppy robot (see the Poppy project and the Poppy robot wiki page).
First, the movements are performed in front of a Kinect v2 camera.
The movements (captured as skeleton joint positions) are saved, then replayed by the robot.
The patient then has to reproduce the movement in front of the Kinect; depending on the quality of the repetition, it may or may not have to be repeated.


  1. (Step zero: installing the Kinect camera and the pykinect2 Python library on the computer. Already installed on the lab's computer.)
  2. First step: capturing the skeleton (Kinect v2) as Cartesian coordinates (x, y, z)
  3. Second step: converting the skeleton data into the Poppy robot's angular frame of reference


NB :

  • If you want to make Poppy rigid, we wrote a program that puts it in the zero position (i.e. all the motors' angles are set to zero): Init_poppy.py
  • Don't forget to plug in Poppy!
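The idea behind Init_poppy.py can be sketched as follows (a minimal sketch, not the actual contents of the script; the motor names and the pypot calls shown in comments are assumptions):

```python
# Sketch of the zero-position idea behind Init_poppy.py (assumed, not the
# actual script): build a {motor: angle} command sending every motor to 0.

def zero_position(motor_names):
    """Return a command dict setting every motor's goal angle to 0 degrees."""
    return {name: 0.0 for name in motor_names}

# Illustrative motor names (Poppy's real motors follow this naming pattern):
commands = zero_position(["r_shoulder_x", "r_shoulder_y", "l_elbow_y"])

# On the real robot this would be applied with pypot, roughly:
#   for motor in robot.motors:
#       motor.compliant = False      # make Poppy rigid
#       motor.goal_position = 0.0    # zero degrees
```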

First step: getting the Skeleton

  • Plug the Kinect camera into the computer before turning the computer on (otherwise you will probably need to restart it)
  • Connect the Kinect camera




The Kinect captures the positions of the skeleton's joints.

Second step: the movement

All the programs we wrote are in the C:\Users\ihsev\Documents\projetS5\poppy-humanoid\software\poppy_humanoid\ folder

First option: manually save, convert, then play the movement

  • Save the movement


Run save.bat.
Enter the name of the exercise when the shell asks for it.
A new window should appear, showing the video captured by the Kinect camera with the skeleton of the person in front of it (the first time, it can take a while… be patient).
Movements are delimited by presses of the space key.

For example:

  • space key: starts the capture of the first movement,
  • do the movement…
  • space key: stops the capture of the first movement,
  • space key: starts the capture of the second movement,
  • etc.

The movements are then saved in the /exercices/<name_of_the_exercise>/ folder, as <name_of_the_exercise_x>.txt files.
x starts at zero (first movement).
To finish capturing the movements, just close the window (click the red cross).
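The space-key behavior described above can be sketched as a simple toggle (a sketch of the described behavior, not the actual capture code):

```python
# Sketch of the space-key logic described above: each press toggles capture
# on/off, and every completed capture becomes movement x (x starting at 0).

def completed_movements(n_space_presses):
    """Return the indices of fully captured movements after n space presses."""
    completed = []
    capturing = False
    next_index = 0
    for _ in range(n_space_presses):
        if capturing:                      # this press stops the current capture
            completed.append(next_index)
            next_index += 1
        capturing = not capturing
    return completed

# Four presses = start, stop, start, stop -> movements 0 and 1 are saved.
```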

  • Convert the coordinates into angles


Run convert.bat.
Enter the name of the exercise (the name of the folder) when the shell asks for it.
The program will convert all the movements saved in the corresponding folder.
The Cartesian coordinates and the quaternions captured by the Kinect are saved in .txt files.
When the program converts the coordinates into angles for Poppy, it saves new files:
for example, movement_0.txt is converted into a new file movement_0_poppy.txt in the same folder.
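The naming rule can be sketched with pathlib (the rule itself comes from the text above; the helper name is ours):

```python
from pathlib import Path

# Sketch of the naming rule described above: movement_0.txt becomes
# movement_0_poppy.txt in the same folder (the helper name is illustrative).

def converted_name(movement_file):
    """Return the path of the converted file next to the original one."""
    p = Path(movement_file)
    return p.with_name(p.stem + "_poppy" + p.suffix)
```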

  • Make Poppy play the movement


Run play.bat.
Enter the name of the exercise when the shell asks for it: for example, movement.
Enter the number of the movement when the shell asks for it: for example, 0 to play the first movement.


Second option: save, then play the movement


Run save_and_play.bat.
Enter the name of the exercise when the shell asks for it.
A new window should appear, showing the video captured by the Kinect camera with the skeleton of the person in front of it (the first time, it can take a while… be patient).
Movements are delimited by presses of the space key.

For example:

  • space key: starts the capture of the first movement,
  • do the movement…
  • space key: stops the capture of the first movement,
  • space key: starts the capture of the second movement,
  • etc.

To finish capturing the movements, just close the window (click the red cross). The movements are then saved in the /exercices/<name_of_the_exercise>/ folder, as <name_of_the_exercise_x>.txt files.
x starts at zero (first movement).
The program then converts the coordinates into angles in Poppy's frame of reference.
Poppy can then play a movement; just specify which one at the prompt.



All the programs we wrote are stored in the projets5/poppy_humanoid/software/poppy_humanoid/ folder.

First step: getting the Skeleton

We used the pykinect2 Python library.
The code is in the poppy_kinect_save_mouvement.py file.
For each movement, a .txt file is saved in the /exercices/<name_of_the_exercise>/ folder, as <name_of_the_exercise_x>.txt (x starts at zero for the first movement).
Each .txt file is a Python dictionary stored in JSON format (so that it can be loaded back as a dictionary rather than plain text, and processed for the conversion).

The Cartesian coordinates are stored in the "positions" sub-dictionary, where each time step (depending on the framerate) has its own dictionary.
The quaternions are stored in the "orientations" sub-dictionary in the same way.

Basically, the structure of a .txt file (i.e. the dictionary that stores the movement) is:

{
  "framerate": "<framerate>",
  "orientations": {
    "<time1>": { "<joint1>": [x, y, z, w], "<joint2>": [x, y, z, w] },
    "<time2>": { "<joint1>": [x, y, z, w], "<joint2>": [x, y, z, w] }
  },
  "positions": {
    "<time1>": { "<joint1>": [x, y, z], "<joint2>": [x, y, z] },
    "<time2>": { "<joint1>": [x, y, z], "<joint2>": [x, y, z] }
  }
}
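Writing and reading such a file is a plain JSON round-trip (the joint name and values below are made up for illustration):

```python
import json

# Sketch: a movement record following the structure above, written and read
# back with json so it stays a dictionary (all values here are made up).
record = {
    "framerate": "30",
    "orientations": {"0": {"Head": [0.0, 0.0, 0.0, 1.0]}},
    "positions": {"0": {"Head": [0.1, 0.5, 2.0]}},
}
text = json.dumps(record)       # what the save step writes to the .txt file
data = json.loads(text)         # loading gives back a dict, not plain text
x, y, z = data["positions"]["0"]["Head"]
```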

Second step: convert the movement


The Kinect camera captures the Cartesian coordinates of the person performing the movement. However, to make Poppy play a movement, you need to "send" it an angle for each of its motors.

"Poppy, move your r_shoulder_x motor by 30 degrees (please)!"

So the Cartesian coordinates need to be converted into angles. How?



From four points we can define two vectors.
The angle between the two vectors can then be calculated using Euclidean geometry (the Python functions needed are defined in the calculangles.py file).
Here, the main difficulty is not calculating the angles but choosing the right vectors: we need to consider all the possible configurations in 3D space and find the right formula.


You can find our approximations in the Kinect_to_Poppy_angles.py file.
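For illustration, here is the kind of helper calculangles.py could contain (a sketch assuming a simple dot-product formula; the joint coordinates are made up, and the actual functions may differ):

```python
import math

# Sketch of an angle computation between two 3D vectors, each defined by a
# pair of joint positions (the real calculangles.py functions may differ).

def vector(p, q):
    """Vector from point p to point q."""
    return [q[i] - p[i] for i in range(3)]

def angle_between(u, v):
    """Angle between vectors u and v, in degrees."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(a * a for a in v))
    cos_angle = max(-1.0, min(1.0, dot / (norm_u * norm_v)))  # clamp rounding
    return math.degrees(math.acos(cos_angle))

# Made-up example: shoulder, elbow and wrist positions define the elbow angle.
shoulder, elbow, wrist = [0, 0, 0], [0, -1, 0], [1, -1, 0]
elbow_angle = angle_between(vector(elbow, shoulder), vector(elbow, wrist))
# here the arm is bent at a right angle, so elbow_angle is 90.0
```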

For each movement, a .txt file is saved in the /exercices/<name_of_the_exercise>/ folder, as <name_of_the_exercise_x>_poppy.txt (x starts at zero for the first movement).
Each .txt file is a Python dictionary stored in JSON format (so that it can be loaded back as a dictionary rather than plain text, for Poppy to play the movement).
Basically, the structure of a .txt file (i.e. the dictionary that stores the movement) is:

{
  "framerate": "<framerate>",
  "positions": {
    "<time1>": { "<motor1>": [angle, 0], "<motor2>": [angle, 0] },
    "<time2>": { "<motor1>": [angle, 0], "<motor2>": [angle, 0] }
  }
}
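Playing such a file back amounts to iterating over the frames in time order (a sketch; the motor name is illustrative and the pypot call appears only as a comment):

```python
import json

# Sketch: turn a converted *_poppy.txt file into an ordered playback plan.
# Each step is (time, {motor: angle}); on the real robot each angle would be
# assigned to the corresponding motor's goal_position via pypot.

def playback_steps(converted_json):
    """Return [(time, {motor: angle}), ...] sorted by time."""
    data = json.loads(converted_json)
    frames = data["positions"]
    return [(float(t), {m: pair[0] for m, pair in frames[t].items()})
            for t in sorted(frames, key=float)]

sample = ('{"framerate": "30", "positions": '
          '{"1": {"r_elbow_y": [45.0, 0]}, "0": {"r_elbow_y": [30.0, 0]}}}')
steps = playback_steps(sample)
# steps are in time order even though "1" came first in the file
```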



Another way to get the angles: use the quaternions
A more accurate way to get the angles would be to use the quaternions captured by the Kinect camera.
This requires solid geometric skills, and unfortunately we did not manage to make it work.



The angles we calculated are unfortunately not accurate enough for complex movements, which is why using quaternions could improve the conversion.
Moreover, GMMs (Gaussian mixture models) and GMR (Gaussian mixture regression) could be used to synthesize several samples of the same movement into one, which Poppy, and then the patient, would reproduce.
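As a much simpler stand-in for GMM/GMR, several demonstrations of the same movement could be merged by averaging the angle trajectories frame by frame (a sketch only; a real GMM/GMR would also model the variability between demonstrations):

```python
# Sketch: frame-by-frame average of several equal-length demonstrations of
# one motor's angle trajectory (a crude stand-in for GMM/GMR synthesis).

def average_trajectory(demos):
    """demos: list of equal-length angle lists -> mean trajectory."""
    n = len(demos)
    return [sum(frame) / n for frame in zip(*demos)]

# Two demonstrations of the same two-frame movement:
mean = average_trajectory([[0.0, 10.0], [10.0, 20.0]])  # -> [5.0, 15.0]
```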

  • poppy-kine/poppy-kine-2015-s5.1490963374.txt.gz
  • Last modified: 2019/04/25 14:08
  • (external edit)