Getting started
This tutorial builds upon the pyomyo Getting started tutorial; I recommend following it to play the Chrome Dino Game with the Myo first.
- Plug the Myo dongle into your PC, make sure the Myo is charged, and wear it to let it warm up.
- Git clone this repo and run `pip install -r requirements.txt`.
- Git clone pyomyo locally.
- Go into the cloned directory and use `pip uninstall pyomyo` then `pip install .` to install your local developer version of pyomyo.
- Install OpenGloves on Steam. OpenGloves overrides SteamVR inputs, so remember to disable it when you want to use your normal controllers.
- Open OpenGloves and change the Communication Method to NamedPipe, making sure to enable OpenGloves for the hand you want to control with the Myo. (If using the left, you may have to change the pipe in code from right to left.)
- Start SteamVR and a game with finger tracking support, such as Half-Life: Alyx. Make sure Myo Connect is not open and your dongle is plugged in.
- Put your Myo on and let it warm up.
- Confirm OpenGloves is receiving the data by running `python opengloves_tester.py`, which lets you control the SteamVR hands using a GUI.
- To try proportional grasp control, run `python predictor_grasp.py`, which closes your hand depending on values from the Myo. It won't work well, as we need to train a model for complex tasks like finger classification.
- To try a finger classifier, run `python predictor_basic.py` and follow the instructions below.
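The proportional grasp idea can be sketched in a few lines. This is a hypothetical, simplified version (the real `predictor_grasp.py` may use different features and scaling); it assumes raw 8-channel Myo EMG samples roughly in the -128..127 range, and the `scale` constant is an illustrative guess you would tune per user:

```python
import numpy as np

def grasp_strength(emg_window, scale=50.0):
    """Map a window of raw 8-channel EMG samples to a 0..1 grasp value.

    emg_window: array of shape (n_samples, 8), raw Myo EMG (-128..127).
    scale: hypothetical activation level treated as a full fist; tune per user.
    """
    # Mean absolute value (MAV) is a standard EMG amplitude feature.
    mav = np.mean(np.abs(emg_window))
    # Clamp into [0, 1] so it can drive a finger-curl value directly.
    return float(min(mav / scale, 1.0))

rest = np.random.randint(-3, 3, size=(50, 8))      # near-silent muscles
fist = np.random.randint(-120, 120, size=(50, 8))  # strong activation
print(grasp_strength(rest), grasp_strength(fist))
```

A single amplitude feature like this collapses all eight channels into one number, which is exactly why it cannot separate individual fingers; that is the job of the trained classifier below.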
The video above shows quick training of a finger classifier model using `predictor_basic.py`. You'll notice it's a modified `dino_jump.py`, which is why you should start there.
Each number key on the keyboard corresponds to both a class and a finger, so resting your hand and pressing 0 labels the incoming EMG data as resting. The pose handler takes the predicted class number and sends the appropriate signal to SteamVR.
`predictor_basic.py` assumes the mapping:
Class 0 - Rest
Class 1 - Thumb curl
Class 2 - Index curl
Class 3 - Middle curl
Class 4 - Ring curl
Class 5 - Pinky curl
Class 6 - Close fist
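A pose handler for this mapping can be sketched as a lookup from class number to per-finger curls, encoded for the named pipe. The pipe path and the 5-float message layout below are assumptions, not the confirmed OpenGloves format; mirror whatever `opengloves_tester.py` actually sends:

```python
import struct

# Curl values per class: (thumb, index, middle, ring, pinky); 0 = open, 1 = fully curled.
POSE_MAP = {
    0: (0.0, 0.0, 0.0, 0.0, 0.0),  # rest
    1: (1.0, 0.0, 0.0, 0.0, 0.0),  # thumb curl
    2: (0.0, 1.0, 0.0, 0.0, 0.0),  # index curl
    3: (0.0, 0.0, 1.0, 0.0, 0.0),  # middle curl
    4: (0.0, 0.0, 0.0, 1.0, 0.0),  # ring curl
    5: (0.0, 0.0, 0.0, 0.0, 1.0),  # pinky curl
    6: (1.0, 1.0, 1.0, 1.0, 1.0),  # close fist
}

def encode_pose(predicted_class):
    """Pack the five finger curls as little-endian floats.

    The real message OpenGloves expects may include more fields (joystick,
    buttons); treat this as a placeholder for the repo's actual encoding.
    """
    return struct.pack("<5f", *POSE_MAP[predicted_class])

# Hypothetical handler: write the encoded pose to the OpenGloves named pipe.
# PIPE_PATH = r"\\.\pipe\vrapplication\input\glove\v1\right"  # "left" for left hand
# with open(PIPE_PATH, "wb") as pipe:
#     pipe.write(encode_pose(6))
```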
Start by resting one hand and using your other hand to label the data as resting (press 0).
Curl your thumb and press 1; if you have Half-Life: Alyx and OpenGloves running, you should see your virtual thumb curl as your classifier predicts class 1.
Continue labelling using the above mapping.
If you see no finger movement in Alyx, try running `python opengloves_tester.py` and confirm OpenGloves is receiving commands over the NamedPipe.
If not, try the Troubleshooting steps.
Feel free to ask for further help in the Discord here.
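The label-while-you-pose workflow above amounts to tagging each incoming EMG sample with the currently pressed class key, then fitting a classifier. A minimal sketch with a nearest-centroid model on synthetic data; the real script likely uses a different model and real Myo samples, so everything below the labels is a stand-in:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a labelled recording: while you hold a pose and press its
# number key, each incoming 8-channel EMG sample is tagged with that class.
def fake_emg(n, offset):
    return rng.normal(loc=offset, scale=1.0, size=(n, 8))

X = np.vstack([fake_emg(200, c * 5.0) for c in range(7)])  # classes 0..6
y = np.repeat(np.arange(7), 200)

# Nearest-centroid "training": one mean feature vector per class.
centroids = np.array([X[y == c].mean(axis=0) for c in range(7)])

def predict(sample):
    """Return the class whose centroid is closest to this EMG sample."""
    return int(np.argmin(np.linalg.norm(centroids - sample, axis=1)))

# A live loop would run predict() on each new sample and hand the class
# to the pose handler, which forwards the curls to SteamVR.
print(predict(fake_emg(1, 30.0)[0]))
```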
Yes, but you need to gather labelled regression data and then train a neural network.
`predictor_basic.py` was made as a simplified abstraction. There are many examples of gesture classification, which is much easier than gesture regression. One problem is the difficulty of gathering regression labels; my initial solution was to use the Leap Motion, which started the NeuroLeap project.
The Leap Motion provides a high-dimensional data stream representing the hand and can therefore produce a high-dimensional prediction (if you can get past the curse of dimensionality and all the other machine learning problems involved in using 8 channels to predict 22). However, is this what people actually want or need now? In rehabilitation, boolean grab detection has proved greatly useful. Seeing how effective LucidGloves are convinced me that just predicting the curl of each finger was good enough; this is why NeuroGloves was made.
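Per-finger curl regression (8 EMG channels in, 5 curl values out) can be sketched with plain ridge regression on synthetic data; the NeuroGloves notebooks may well use a neural network instead, and the data here is fabricated purely for shape:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a NeuroGloves recording: 8 EMG channels in,
# 5 finger-curl values (0..1, as read off the dataglove) out.
n = 1000
X = rng.normal(size=(n, 8))
true_W = rng.normal(size=(8, 5))
Y = np.clip(X @ true_W * 0.1 + 0.5, 0.0, 1.0)

# Ridge regression via the normal equations: W = (X^T X + lam*I)^-1 X^T Y.
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(8), X.T @ Y)

# Predicted curls for one sample, clamped to the valid curl range.
curls = np.clip(X[:1] @ W, 0.0, 1.0)
print(curls.shape)
```

The point of the sketch is the output dimensionality: five continuous curls per prediction rather than one class label, which is why labelled regression data from a dataglove is needed.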
To use NeuroGloves to make a model:

0. Put on a Lucid Glove; I use the 3.1 prototype with an Arduino Nano.
1. Set the COM port inside of `data_collector.py` to the COM port of the dataglove. (Check Device Manager on Windows.) Also set the Myo mode, filename and duration to best represent your current experiment. Note that the mode and duration are key; too long a duration and you will fatigue.
2. Make sure you have closed down all the programs using that COM port, or you will get a serial error.
3. Run `data_collector.py` to gather EMG data combined with serial data from the dataglove.
4. Wait until the experiment is over and the data is saved as a CSV.
5. Make a backup of that CSV and analyse it in a Jupyter notebook, like this one.
6. Select the correct preprocessing steps and make a good model.
7. Save the model and any scalers used.
8. Replicate the preprocessing, then open and run the chosen scalers and model in a prediction script like this one.
9. If it doesn't predict accurately live but does in the notebook, check for errors in the prediction script and for inconsistencies between your training and testing environments, fix them, and go back to step 3.
10. If it works for you now, great. Wait a day and try again; solving cross-session validation is key. You may want to look into transfer learning, or into calibrating each time you use it.
11. If it still works, even better: now give it to someone else and see if it works. Extra points for trying it outside in the snow, where the metal electrodes will act differently than they do inside. If it works, awesome; you are well on your way to changing HCI history!
12. Try it on someone else; extra points if they have a thinner or thicker wrist than you. If you only train your model on data from people who are like you, it will only work for them, not anyone else. Making a solution that works for everyone is a very, very hard problem. You may be tempted to gather this biological data from everyone you know; however, this data could be used to uniquely identify you, or worse, we don't know what wonderful or horrible things it could be used for. This is why I did not gather data on anyone else.
13. If you've managed to make something that works well on anyone without a huge amount of data gathering, please get in touch! (Especially if you want a job. 😅)
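The train-then-deploy consistency the steps above stress can be made concrete: fit your scaler once on the training data, save it alongside the model, and reload both in the prediction script so the live preprocessing matches the notebook exactly. A minimal sketch using pickle and a tiny hand-rolled scaler (the notebooks may use scikit-learn's `StandardScaler` and joblib instead):

```python
import pickle
import numpy as np

class MeanStdScaler:
    """Tiny stand-in for a saved preprocessing scaler."""
    def fit(self, X):
        self.mean_ = X.mean(axis=0)
        self.std_ = X.std(axis=0) + 1e-8
        return self
    def transform(self, X):
        return (X - self.mean_) / self.std_

rng = np.random.default_rng(2)
X_train = rng.normal(loc=10.0, scale=3.0, size=(500, 8))

# Notebook side: fit on the training session and persist the exact scaler
# (in practice you would pickle.dump() it to a file next to the model).
scaler = MeanStdScaler().fit(X_train)
blob = pickle.dumps(scaler)

# Prediction-script side: reload rather than re-fitting on live data;
# re-fitting on a different session is a classic train/test inconsistency.
live_scaler = pickle.loads(blob)
live_sample = rng.normal(loc=10.0, scale=3.0, size=(1, 8))
print(live_scaler.transform(live_sample).shape)
```

If the live predictions disagree with the notebook, a mismatch between these two sides (different scaler state, different feature order, different windowing) is the first thing to rule out, which is what step 9 above is about.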