Home
Welcome to the NeuroGloves wiki! This repo is the culmination of my undergraduate dissertation: a year spent trying to predict finger positions from EMG data using the Myo armband.
Over the last year I have made:
- pyomyo - A pure Python, multithreaded, cross-platform driver for the Thalmic Myo, and the first to use all three Myo modes, including the secret one (see the sketch after this list).
- NeuroBreakout - An example showing how to use the Myo for 1D regression.
- MatplotLeap - Higher-dimensional regression needs good labels, and the Leap Motion was one source I tried. MatplotLeap shows how to use the Leap with Python.
- NeuroLeap - Models and tools for predicting Leap Motion data from EMG.
- NeuroGloves (this repo) - Models and tools for predicting 5-finger curl from EMG, using LucidGloves for labelling.
- pygloves - The parts of NeuroGloves I thought would be useful for LucidGloves users.
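As a quick taste of the driver, here is a minimal sketch of streaming raw EMG with pyomyo, modelled on the pattern its examples use. Treat the handler signature and the `emg_mode.RAW` constant as assumptions and defer to the pyomyo README if they have changed.

```python
import multiprocessing

from pyomyo import Myo, emg_mode

def emg_worker(q):
    # Connect to the armband and push every 8-channel EMG sample onto a queue.
    m = Myo(mode=emg_mode.RAW)
    m.connect()
    m.add_emg_handler(lambda emg, movement: q.put(emg))
    while True:
        m.run()  # pump one packet from the dongle

if __name__ == "__main__":
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=emg_worker, args=(q,), daemon=True)
    p.start()
    try:
        while True:
            print(q.get())  # one sample: 8 sEMG channel values
    except KeyboardInterrupt:
        pass
```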
Training a model using the Leap Motion has a few problems:
- Some assumptions made by the Leap Motion will be learned by your model without you knowing.
- The Leap Motion API has been deprecated for Python since the V4 SDK; it's closed source and a pain to work with.
- The Leap Motion gathers high-dimensional data. This may be what you want, but it makes the ML harder and requires you to gather more data or get clever (a rough count of the output dimensions follows this list).
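To put rough numbers on that last point: the Leap's hand model reports several bones per finger, each with 3-D joint positions, so even a pared-down target vector is an order of magnitude wider than a glove's 5 curls. A back-of-the-envelope count, assuming one 3-D joint per bone (the exact targets you regress may differ):

```python
# Rough per-frame output dimensionality of the two labelling approaches.
fingers = 5
bones_per_finger = 4     # metacarpal, proximal, intermediate, distal
coords_per_joint = 3     # x, y, z

leap_dims = fingers * bones_per_finger * coords_per_joint
glove_dims = fingers     # one curl value per finger

print(leap_dims, "Leap values vs", glove_dims, "glove values per frame")  # 60 vs 5
```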
For many reasons, including these, I switched to using datagloves for labelling, and so moved from NeuroLeap to NeuroGloves. This approach also has some problems:
- The glove is different for each user.
- By default, the gloves calibrate themselves, which complicates the ML and analysis.
- It can take time and money to make a glove, but buying a Leap Motion is easy and possibly cheaper.
- The dataglove only captures the flexion of the 5 fingers, which is enough for SteamVR interactions, but more detailed interactions may need richer labels.
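Whichever labelling device is used, the learning problem stays the same: regress from windows of 8-channel Myo EMG to the 5 curl values. The sketch below is purely illustrative, with placeholder random data and a plain scikit-learn ridge regressor standing in for the models in this repo.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

# Placeholder data: in practice X would come from the Myo (via pyomyo) and
# y from the glove's 5 curl values, resampled onto a common clock.
n_samples, n_channels, window = 2000, 8, 50
rng = np.random.default_rng(0)
emg_windows = rng.normal(size=(n_samples, n_channels, window))
curls = rng.uniform(0.0, 1.0, size=(n_samples, 5))   # 5 finger curls in [0, 1]

# Simple hand-crafted feature: mean absolute value per channel per window.
X = np.abs(emg_windows).mean(axis=2)                  # shape (n_samples, 8)

X_train, X_test, y_train, y_test = train_test_split(X, curls, test_size=0.2, random_state=0)
model = Ridge(alpha=1.0).fit(X_train, y_train)        # one regressor, 5 outputs
print("R^2 on held-out windows:", model.score(X_test, y_test))
```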
Related work using a dataglove:
A. Pallotti, G. Orengo, and G. Saggio, “Measurements comparison of finger joint angles in hand postures between an sEMG armband and a sensory glove,” Biocybernetics and Biomedical Engineering, vol. 41, no. 2, pp. 605–616, Apr. 2021. - Paper link
Related work using a Leap Motion:
NeuroPose: 3D Hand Pose Tracking using EMG Wearables - Paper link
F. Quivira, T. Koike-Akino, Y. Wang, and D. Erdogmus, “Translating sEMG signals to continuous hand poses using recurrent neural networks,” in 2018 IEEE EMBS International Conference on Biomedical & Health Informatics (BHI), Mar. 2018, pp. 166–169. - Paper link
I. Sosin, D. Kudenko, and A. Shpilman, “Continuous Gesture Recognition from sEMG Sensor Data with Recurrent Neural Networks and Adversarial Domain Adaptation,” in 2018 15th International Conference on Control, Automation, Robotics and Vision (ICARCV), Nov. 2018, pp. 1436–1441. - Paper link