Applied Sciences, Vol. 13, No. 9 (2023)
ARTICLE
TITLE

Early Predictability of Grasping Movements by Neurofunctional Representations: A Feasibility Study

Eike Jakubowitz, Thekla Feist, Alina Obermeier, Carina Gempfer, Christof Hurschler, Henning Windhagen and Max-Heinrich Laves

Abstract

Human grasping is a relatively fast process, and control signals for upper-limb prosthetics cannot be generated and processed in a sufficiently timely manner. The aim of this study was to examine whether discriminating between different grasping movements at a cortical level can provide information prior to the actual grasping process, allowing for more intuitive prosthetic control. EEG datasets were captured from 13 healthy subjects who repeatedly performed 16 activities of daily living. Common classifiers were trained on features extracted from the waking-state frequency and total-frequency time domains. Different training scenarios were used to investigate whether classifiers can be pre-trained as base networks and then fine-tuned with data from a target person. A support vector machine algorithm using spatial covariance matrices as EEG signal descriptors, based on Riemannian geometry, showed the highest balanced accuracy (0.91 ± 0.05 SD) in discriminating five grasping categories according to the Cutkosky taxonomy in an interval from 1.0 s before to 0.5 s after the initial movement. Fine-tuning did not improve any classifier. No significant accuracy differences between the two frequency domains were apparent (p > 0.07). Neurofunctional representations enabled highly accurate discrimination of five different grasping movements. Our results indicate that, for upper-limb prosthetics, these representations can be used in a sufficiently timely manner to predict the respective grasping task as a discrete category and thus kinematically prepare the prosthetic hand.

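The classification approach described in the abstract (spatial covariance matrices as EEG descriptors, handled with Riemannian geometry and fed to a support vector machine) follows a well-established pattern in EEG decoding. The sketch below shows one way such a pipeline could be assembled with the pyriemann and scikit-learn libraries; it is not the authors' implementation, and the data shapes, epoch window, and hyperparameters are assumptions for illustration only.

# Minimal sketch of a Riemannian-geometry EEG classification pipeline, assuming
# epochs cut from 1.0 s before to 0.5 s after movement onset (as in the study).
# Hypothetical data; this is not the authors' code.
import numpy as np
from pyriemann.estimation import Covariances
from pyriemann.tangentspace import TangentSpace
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Assumed shapes: 400 epochs, 32 EEG channels, 1.5 s sampled at 256 Hz.
rng = np.random.default_rng(0)
X = rng.standard_normal((400, 32, 384))   # EEG epochs (epochs x channels x samples)
y = rng.integers(0, 5, size=400)          # five grasp categories (Cutkosky taxonomy)

clf = make_pipeline(
    Covariances(estimator="oas"),         # one spatial covariance matrix per epoch
    TangentSpace(metric="riemann"),       # map SPD matrices to the Riemannian tangent space
    SVC(kernel="linear", C=1.0),          # support vector machine on tangent-space vectors
)

# Balanced accuracy as the evaluation metric (the study reports 0.91 ± 0.05;
# random data here yields chance-level scores).
scores = cross_val_score(clf, X, y, cv=5, scoring="balanced_accuracy")
print(f"balanced accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")

In practice, X would come from epoched EEG recordings of the grasping trials rather than random numbers, and the tangent-space projection is what lets a standard linear SVM operate on covariance matrices while respecting their geometry.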