139 related articles for PubMed ID 33312264

  • 1. HANDS: a multimodal dataset for modeling toward human grasp intent inference in prosthetic hands.
    Han M; Günay SY; Schirner G; Padır T; Erdoğmuş D
    Intell Serv Robot; 2020 Jan; 13(1):179-185. PubMed ID: 33312264

  • 2. Multimodal fusion of EMG and vision for human grasp intent inference in prosthetic hand control.
    Zandigohar M; Han M; Sharif M; Günay SY; Furmanek MP; Yarossi M; Bonato P; Onal C; Padır T; Erdoğmuş D; Schirner G
    Front Robot AI; 2024; 11():1312554. PubMed ID: 38476118

  • 3. Vision-aided grasp classification: design and evaluation of compact CNN for prosthetic hands.
    Sharma U; Vasamsetti S; Chander SA; Datta B
    Biomed Phys Eng Express; 2024 May; 10(4):. PubMed ID: 38697026

  • 4. Deep learning-based artificial vision for grasp classification in myoelectric hands.
    Ghazaei G; Alameer A; Degenaar P; Morgan G; Nazarpour K
    J Neural Eng; 2017 Jun; 14(3):036025. PubMed ID: 28467317

  • 5. Automatic Grasp Selection using a Camera in a Hand Prosthesis.
    DeGol J; Akhtar A; Manja B; Bretl T
    Annu Int Conf IEEE Eng Med Biol Soc; 2016 Aug; 2016():431-434. PubMed ID: 28261002

  • 6. A Multimodal Intention Detection Sensor Suite for Shared Autonomy of Upper-Limb Robotic Prostheses.
    Gardner M; Mancero Castillo CS; Wilson S; Farina D; Burdet E; Khoo BC; Atashzar SF; Vaidyanathan R
    Sensors (Basel); 2020 Oct; 20(21):. PubMed ID: 33120959

  • 7. Cognitive vision system for control of dexterous prosthetic hands: experimental evaluation.
    Dosen S; Cipriani C; Kostić M; Controzzi M; Carrozza MC; Popović DB
    J Neuroeng Rehabil; 2010 Aug; 7():42. PubMed ID: 20731834

  • 8. Decoding of unimanual and bimanual reach-and-grasp actions from EMG and IMU signals in persons with cervical spinal cord injury.
    Wolf M; Rupp R; Schwarz A
    J Neural Eng; 2024 Apr; 21(2):. PubMed ID: 38471169

  • 9. Designing Prosthetic Hands With Embodied Intelligence: The KIT Prosthetic Hands.
    Weiner P; Starke J; Rader S; Hundhausen F; Asfour T
    Front Neurorobot; 2022; 16():815716. PubMed ID: 35355833

  • 10. Improving Robotic Hand Prosthesis Control With Eye Tracking and Computer Vision: A Multimodal Approach Based on the Visuomotor Behavior of Grasping.
    Cognolato M; Atzori M; Gassert R; Müller H
    Front Artif Intell; 2021; 4():744476. PubMed ID: 35146422

  • 11. A Multimodal Bracelet to Acquire Muscular Activity and Gyroscopic Data to Study Sensor Fusion for Intent Detection.
    Andreas D; Hou Z; Tabak MO; Dwivedi A; Beckerle P
    Sensors (Basel); 2024 Sep; 24(19):. PubMed ID: 39409254

  • 12. Exploiting Robot Hand Compliance and Environmental Constraints for Edge Grasps.
    Bimbo J; Turco E; Ghazaei Ardakani M; Pozzi M; Salvietti G; Bo V; Malvezzi M; Prattichizzo D
    Front Robot AI; 2019; 6():135. PubMed ID: 33501150

  • 13. Learning the signatures of the human grasp using a scalable tactile glove.
    Sundaram S; Kellnhofer P; Li Y; Zhu JY; Torralba A; Matusik W
    Nature; 2019 May; 569(7758):698-702. PubMed ID: 31142856

  • 14. EMG-driven shared human-robot compliant control for in-hand object manipulation in hand prostheses.
    Khadivar F; Mendez V; Correia C; Batzianoulis I; Billard A; Micera S
    J Neural Eng; 2022 Dec; 19(6):. PubMed ID: 36384035

  • 15. Recognition of Upper Limb Action Intention Based on IMU.
    Cui JW; Li ZG; Du H; Yan BY; Lu PD
    Sensors (Basel); 2022 Mar; 22(5):. PubMed ID: 35271101

  • 16. Gaze, visual, myoelectric, and inertial data of grasps for intelligent prosthetics.
    Cognolato M; Gijsberts A; Gregori V; Saetta G; Giacomino K; Hager AM; Gigli A; Faccio D; Tiengo C; Bassetto F; Caputo B; Brugger P; Atzori M; Müller H
    Sci Data; 2020 Feb; 7(1):43. PubMed ID: 32041965

  • 17. Decoding the grasping intention from electromyography during reaching motions.
    Batzianoulis I; Krausz NE; Simon AM; Hargrove L; Billard A
    J Neuroeng Rehabil; 2018 Jun; 15(1):57. PubMed ID: 29940991

  • 18. Quantitative Investigation of Hand Grasp Functionality: Hand Joint Motion Correlation, Independence, and Grasping Behavior.
    Liu Y; Zeng B; Zhang T; Jiang L; Liu H; Ming D
    Appl Bionics Biomech; 2021; 2021():2787832. PubMed ID: 34899980

  • 19. Upper-Limb Electromyogram Classification of Reaching-to-Grasping Tasks Based on Convolutional Neural Networks for Control of a Prosthetic Hand.
    Kim KT; Park S; Lim TH; Lee SJ
    Front Neurosci; 2021; 15():733359. PubMed ID: 34712114

  • 20. Multichannel haptic feedback unlocks prosthetic hand dexterity.
    Abd MA; Ingicco J; Hutchinson DT; Tognoli E; Engeberg ED
    Sci Rep; 2022 Feb; 12(1):2323. PubMed ID: 35149695
