| Literature DB >> 28261002 |
Joseph DeGol, Aadeel Akhtar, Bhargava Manja, Timothy Bretl.
Abstract
In this paper, we demonstrate how automatic grasp selection can be achieved by placing a camera in the palm of a prosthetic hand and training a convolutional neural network on images of objects with corresponding grasp labels. Our labeled dataset is built from common graspable objects curated from the ImageNet dataset and from images captured by our own camera placed in the hand. We achieve a grasp classification accuracy of 93.2% and show through real-time grasp selection that using a camera to augment current electromyography-controlled prosthetic hands may be useful.
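The pipeline the abstract describes (palm-camera image in, grasp label out via a trained CNN) can be sketched at a toy scale. The label set, network size, and weights below are illustrative assumptions, not the paper's actual architecture or grasp taxonomy; a minimal conv → ReLU → global-average-pool → softmax classifier in NumPy:

```python
import numpy as np

# Hypothetical grasp classes; the paper does not list its exact label set here.
GRASP_LABELS = ["power", "pinch", "tripod", "tool", "key"]

def conv2d(img, kernel):
    """Valid 2-D cross-correlation of a single-channel image with one kernel."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def classify_grasp(img, kernels, weights, bias):
    """Conv -> ReLU -> global average pool -> linear -> softmax over grasps."""
    feats = np.array([conv2d(img, k).clip(min=0).mean() for k in kernels])
    logits = weights @ feats + bias
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return GRASP_LABELS[int(np.argmax(probs))], probs

# Random stand-ins for a palm-camera frame and trained parameters.
rng = np.random.default_rng(0)
img = rng.random((16, 16))                                  # fake camera frame
kernels = [rng.standard_normal((3, 3)) for _ in range(4)]   # 4 conv filters
weights = rng.standard_normal((len(GRASP_LABELS), 4))
bias = np.zeros(len(GRASP_LABELS))
label, probs = classify_grasp(img, kernels, weights, bias)
```

In the real system the parameters would come from training on the labeled ImageNet-derived and in-hand image dataset, and the predicted label would drive the prosthetic hand's grasp preshape.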
Year: 2016 PMID: 28261002 PMCID: PMC5325038 DOI: 10.1109/EMBC.2016.7590732
Source DB: PubMed Journal: Annu Int Conf IEEE Eng Med Biol Soc ISSN: 2375-7477