Machine learning technology is the key to a new robotic hand that responds to the wearer’s thoughts. The 3-D-printed hand was developed by researchers at Hiroshima University in Japan in the hope that it could one day help people who have lost limbs.
Electrodes in the hand pick up electromyogram (EMG) signals from the wearer’s skin — electrical signals produced by small, involuntary muscle contractions that reflect the movement the brain is thinking about making. A machine learning algorithm interprets these signals and then tells the motors in the hand which movement to make. The entire process takes about five milliseconds.
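To make the pipeline concrete, here is a minimal sketch of how an EMG-to-gesture step might look. This is not the Hiroshima team’s actual algorithm; the feature (per-channel RMS), the nearest-centroid classifier, and all numbers below are illustrative assumptions.

```python
import math

def rms_features(window):
    """window: list of per-channel EMG sample lists -> one RMS value per channel."""
    return [math.sqrt(sum(s * s for s in ch) / len(ch)) for ch in window]

def nearest_centroid(features, centroids):
    """centroids: {gesture: feature_vector}; return the gesture closest to features."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda g: dist(features, centroids[g]))

# Toy two-channel gesture "centroids" (fabricated values for illustration).
centroids = {
    "rock":     [0.8, 0.1],   # strong channel-1 activity
    "paper":    [0.1, 0.7],   # strong channel-2 activity
    "scissors": [0.5, 0.5],   # mixed activity
}

# One fresh two-channel window of raw EMG samples (fabricated values).
window = [[0.9, -0.7, 0.8, -0.85], [0.1, -0.05, 0.12, -0.1]]
gesture = nearest_centroid(rms_features(window), centroids)
print(gesture)  # channel 1 dominates this window, so it classifies as "rock"
```

A real system would run this classification continuously on short sliding windows and forward each predicted gesture to the hand’s motor controller, which is how a millisecond-scale response becomes possible.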
To demonstrate the hand’s abilities, the researchers had it play rock-paper-scissors, pick up water bottles, and shake someone’s hand. It still has a long way to go before it could come to market (for example, its maximum grasping force is far below the human average), but the team is looking for a commercial partner to help develop it further.