Biomedical engineers at Newcastle University created a prosthetic limb that uses AI to respond to the environment, adjusting its grip and movements as needed.
Already being introduced to patients at Newcastle’s Freeman Hospital, the vision-equipped bionic hand uses artificial intelligence (AI) rather than electrical signals from muscles to control its movements. While prostheses have improved considerably thanks to advances in materials, the methods for controlling them have changed far less. The new prosthesis designed by Newcastle University engineers has been taught to recognize objects using neural networks.
By learning which objects require which type of grasp – depending on size, orientation and shape – the smart limb can decide immediately which grasp is needed. A camera fitted to the top of the prosthesis views an object, the AI system determines the appropriate grasp, and the hand is told to move accordingly. The entire process takes only milliseconds, roughly ten times faster than the conventional approach, in which the user must concentrate on the object and work the muscles connected to the limb to achieve the desired action.
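To make the camera-to-grasp pipeline concrete, here is a minimal sketch in Python of the kind of system described: a small neural network that maps a single camera frame to one of a handful of grasp types. Everything in it – the grasp labels, the network architecture, the image size – is an illustrative assumption, not a detail of the Newcastle team's actual design.

```python
# Illustrative sketch only: a tiny CNN that maps a camera frame to a grasp type.
# Grasp labels, network size and image resolution are assumptions for the example,
# not the Newcastle system's specification.
import torch
import torch.nn as nn

GRASP_TYPES = ["pinch", "tripod", "palmar_neutral", "palmar_pronated"]  # assumed labels

class GraspClassifier(nn.Module):
    def __init__(self, num_classes: int = len(GRASP_TYPES)):
        super().__init__()
        # Two small convolutional stages extract shape/size cues from the frame.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # A linear layer scores each candidate grasp type.
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: batch of RGB frames, shape (N, 3, 64, 64)
        h = self.features(x)
        return self.classifier(h.flatten(1))

model = GraspClassifier().eval()

# A random 64x64 RGB frame stands in for the camera's view of an object.
frame = torch.rand(1, 3, 64, 64)
with torch.no_grad():
    logits = model(frame)
grasp = GRASP_TYPES[logits.argmax(dim=1).item()]
print(f"Selected grasp: {grasp}")  # the hand would then be driven to this preshape
```

A forward pass like this is a single matrix-heavy computation, which is why the grasp decision can be made in milliseconds rather than waiting for the user to generate and sustain the right muscle signals.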
Other projects working to improve prostheses include a solar-powered skin that could return the sensation of touch to amputees, and a connected wristband that gives amputees the fine motor movements needed to work on a computer. How else could smart cities incorporate these ideas and new technologies to help make public services more accessible?