Patients who have lost a hand in an accident can use robotic prosthetic hands, but moving them naturally is difficult. Robotic hands perform only preprogrammed motions and cannot handle unexpected situations, which makes it hard to grasp fragile objects such as eggs or paper cups with the right amount of force.
Scientists solved this problem by having humans and artificial intelligence (AI) share information. A human and AI each control the robotic hand, with whichever side is better taking the lead while the other steps back or assists. With AI's help, users safely picked up eggs with the robotic hand without prior training and gently held paper cups filled with water.
◇ Combining electromyography control and AI simulation
Jacob George, a professor in the Department of Electrical and Computer Engineering at the University of Utah, and Marshall Trout, along with their research team, said on the 10th in the international journal Nature Communications that they "implemented natural movements with a robotic hand by mounting distance and pressure sensors on the fingertips of a commercially available prosthetic hand and combining them with AI's control capabilities."
The team tested the new robotic hand control technology with four patients who had lost a hand in accidents. For the robotic prosthetic hand, they used a product from TASKA Prosthetics, a New Zealand prosthetics company. TASKA's robotic prosthetic hand operates by sensing electromyography (EMG) electrical signals generated in the user's remaining arm muscles. Thanks to collaboration with AI, users safely grasped various objects without prior training.
TASKA's robotic hand senses patterns of the user's muscles contracting or relaxing through sensors. With this information, it infers what movement the user wants and moves the prosthetic fingers in a preprogrammed way accordingly. Although it moves as the user intends, the robotic hand's motions and forces do not have many variations. For example, it is hard to change finger shapes and force to suit an egg versus a baseball.
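The pattern-matching idea described above can be sketched in a few lines. This is a hypothetical illustration, not the TASKA firmware: the channel layout, template values, and nearest-template rule are all assumptions standing in for the real decoder.

```python
import math

def rms(channel):
    """Root-mean-square amplitude of one EMG channel window."""
    return math.sqrt(sum(x * x for x in channel) / len(channel))

# Assumed, made-up template activations (per-channel RMS) for two motions.
TEMPLATES = {
    "open_hand":  [0.8, 0.1, 0.2],
    "close_hand": [0.1, 0.9, 0.7],
}

def decode_intent(emg_window):
    """Pick the preprogrammed motion whose template best matches the signal."""
    features = [rms(ch) for ch in emg_window]
    def dist(template):
        return sum((f - t) ** 2 for f, t in zip(features, template))
    return min(TEMPLATES, key=lambda name: dist(TEMPLATES[name]))

# A window dominated by activity on channels 2 and 3 maps to "close_hand".
window = [[0.1, -0.1, 0.1, -0.1],
          [0.9, -0.9, 0.9, -0.9],
          [0.7, -0.7, 0.7, -0.7]]
print(decode_intent(window))  # → close_hand
```

Because the decoder only matches a fixed set of templates, it can trigger the right preprogrammed motion but cannot vary the shape or force of that motion, which is the limitation the researchers addressed.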
The researchers solved this problem with AI. First, they mounted an infrared sensor that measures distance and a pressure sensor that detects an object's firmness on the prosthetic's fingertips. The distance sensor detects objects from 0 to 1.5 cm away. Thanks to this, it can even notice a nearly weightless cotton ball slipping from the grip. The pressure sensor measures forces up to 35 newtons (N), roughly the weight of a 3.5 kg object.
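The sensor ranges reported above can be captured in a small sketch. The units and the saturating behavior are assumptions for illustration; only the 1.5 cm and 35 N limits come from the article.

```python
# Fingertip sensor ranges from the article (scaling behavior is assumed).
DIST_RANGE_CM = (0.0, 1.5)   # infrared distance sensor working range
FORCE_MAX_N = 35.0           # pressure sensor ceiling (~weight of 3.5 kg)

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def read_fingertip(raw_distance_cm, raw_force_n):
    """Return (distance_cm, force_N) limited to the sensors' working ranges."""
    distance = clamp(raw_distance_cm, *DIST_RANGE_CM)
    force = clamp(raw_force_n, 0.0, FORCE_MAX_N)
    return distance, force

# Readings beyond a sensor's range saturate at its limit.
print(read_fingertip(2.3, 40.0))  # → (1.5, 35.0)
```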
In advance, the AI simulated scenarios of grasping various objects based on data from the fingertip sensors. After machine learning, the AI could shape the fingers to match the distance to an object and grasp it precisely. It wrapped all five fingers around a ball, held an egg with three fingers, and pinched paper between the thumb and index finger.
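The object-specific grips described above can be sketched as a lookup standing in for the learned model. The grip table, force caps, and preshaping rule are illustrative assumptions; only the three example grips and the 1.5 cm sensing range come from the article.

```python
# Hypothetical grip table standing in for the trained model's output.
GRIPS = {
    "ball":  {"fingers": ["thumb", "index", "middle", "ring", "little"],
              "max_force_n": 20.0},
    "egg":   {"fingers": ["thumb", "index", "middle"], "max_force_n": 2.0},
    "paper": {"fingers": ["thumb", "index"],           "max_force_n": 0.5},
}

def plan_grasp(object_class, distance_cm):
    """Choose a finger shape for the object; preshape once it is in sensing range."""
    grip = GRIPS[object_class]
    return {
        "fingers": grip["fingers"],
        "max_force_n": grip["max_force_n"],          # assumed per-object cap
        "preshape": distance_cm <= 1.5,              # sensor range, from the article
    }

plan = plan_grasp("egg", 1.0)
print(plan["fingers"], plan["max_force_n"])  # three-finger grip, gentle force cap
```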
◇ "Aiming to implement touch and control with thought"
The user and AI do not compete for control of the robotic hand. Trout said, "What we do not want is the user and the machine fighting over control of the robotic hand," adding, "In this study, the machine increased the user's control precision over the robotic hand while also making the task easier."
For example, if the AI creates the optimal finger shape to grasp an object without applying any force, the user cooperates by controlling the appropriate force to hold it. Otherwise, there could be a conflict, such as the user trying to extend the fingers to drop an item while the AI insists on gripping it no matter what.
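The division of labor above can be sketched as a simple arbitration rule. This is an assumed sketch, not the paper's actual controller: the AI proposes finger positions, the user supplies the grip force, and a user "open" command always overrides the AI so the conflict described above cannot occur.

```python
def shared_control(ai_finger_positions, user_force_n, user_wants_open):
    """Blend AI finger shaping with user-controlled force; the user can always release."""
    if user_wants_open:
        # Yield control: open the hand regardless of the AI's grip plan.
        return {"positions": [0.0] * len(ai_finger_positions), "force_n": 0.0}
    # Normal case: AI sets the shape, the user sets the squeeze.
    return {"positions": ai_finger_positions, "force_n": user_force_n}

# AI shapes the fingers, user squeezes gently.
print(shared_control([0.6, 0.5, 0.4], user_force_n=1.5, user_wants_open=False))
# Release: user intent wins even if the AI would keep gripping.
print(shared_control([0.6, 0.5, 0.4], user_force_n=1.5, user_wants_open=True))
```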
George said, "This study is part of the larger vision of the University of Utah's NeuroRobotics Lab to help patients with amputations improve their quality of life," adding, "We are also researching control of a robotic prosthetic hand with thought alone."
For patients whose limbs are paralyzed, the idea is to decode neural signals from the brain rather than from muscles to control a prosthetic hand. George added, "We also plan to develop an implantable neural interface that senses and transmits touch so that an intelligent prosthetic hand moves as intended while even restoring tactile sensation."
References
Nature Communications (2025), DOI: https://doi.org/10.1038/s41467-025-65965-9