Ahmed Eid and Friedhelm Schwenker
Hand gestures are an essential part of human-to-human communication and interaction and are therefore also relevant to technical applications. The aim is increasingly to achieve interaction between humans and computers that is as natural as possible, for example, by mea...
Wensheng Chen, Yinxi Niu, Zhenhua Gan, Baoping Xiong and Shan Huang
Enhancing information representation in electromyography (EMG) signals is pivotal for interpreting human movement intentions. Traditional methods often concentrate on specific aspects of EMG signals, such as the time or frequency domains, while overlooki...
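As an illustration of the general idea of combining more than one view of the signal, here is a minimal sketch (not taken from the paper; the window length, sampling rate, and feature choices are assumptions) that extracts common time-domain and frequency-domain descriptors from a single EMG window with NumPy:

```python
import numpy as np

def emg_features(window, fs=1000.0):
    """Combine simple time- and frequency-domain descriptors of one EMG window.

    window : 1-D array of raw EMG samples
    fs     : sampling rate in Hz (assumed value)
    """
    # Time domain: mean absolute value, root mean square, zero crossings
    mav = np.mean(np.abs(window))
    rms = np.sqrt(np.mean(window ** 2))
    zc = np.sum(np.abs(np.diff(np.sign(window))) > 0)

    # Frequency domain: mean and median frequency of the power spectrum
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    mean_freq = np.sum(freqs * spectrum) / np.sum(spectrum)
    cum = np.cumsum(spectrum)
    median_freq = freqs[np.searchsorted(cum, cum[-1] / 2.0)]

    return np.array([mav, rms, zc, mean_freq, median_freq])

# Example: a 200 ms window sampled at 1 kHz
features = emg_features(np.random.randn(200), fs=1000.0)
```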
Mahmoud Elmezain, Majed M. Alwateer, Rasha El-Agamy, Elsayed Atlam and Hani M. Ibrahim
Automatic key gesture detection and recognition are difficult tasks in Human–Computer Interaction due to the need to spot the start and the end points of the gesture of interest. By integrating Hidden Markov Models (HMMs) and Deep Neural Networks (DNNs),...
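One common way to combine the two model families is to let a DNN emit per-frame class posteriors and to run Viterbi decoding over an HMM whose states include a "non-gesture" filler, so that the most likely state path marks the start and end points of a gesture. The following is a hedged sketch of that idea, not the authors' exact architecture; the state layout and transition values are assumptions:

```python
import numpy as np

def viterbi(log_emissions, log_trans, log_init):
    """Most likely state path given per-frame log posteriors from a DNN.

    log_emissions : (T, S) frame-wise log P(state | frame), e.g. log-softmax DNN outputs
    log_trans     : (S, S) log transition matrix of the HMM
    log_init      : (S,)   log initial state distribution
    """
    T, S = log_emissions.shape
    delta = np.full((T, S), -np.inf)
    back = np.zeros((T, S), dtype=int)
    delta[0] = log_init + log_emissions[0]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_trans          # scores[i, j]: from state i to j
        back[t] = np.argmax(scores, axis=0)
        delta[t] = scores[back[t], np.arange(S)] + log_emissions[t]
    path = np.zeros(T, dtype=int)
    path[-1] = np.argmax(delta[-1])
    for t in range(T - 2, -1, -1):
        path[t] = back[t + 1, path[t + 1]]
    return path

# Assumed state layout: 0 = non-gesture (filler), 1 = gesture of interest
log_trans = np.log(np.array([[0.95, 0.05],
                             [0.05, 0.95]]))
log_init = np.log(np.array([0.9, 0.1]))
# dnn_log_posteriors: (T, 2) array produced by any frame-wise classifier
dnn_log_posteriors = np.log(np.full((100, 2), 0.5))
path = viterbi(dnn_log_posteriors, log_trans, log_init)
# Frames where path == 1 delimit the spotted start and end of the gesture.
```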
Roberto De Fazio, Vincenzo Mariano Mastronardi, Matteo Petruzzi, Massimo De Vittorio and Paolo Visconti
Human–machine interaction (HMI) refers to systems enabling communication between machines and humans. Systems for human–machine interfaces have advanced significantly in terms of materials, device design, and production methods. Energy supply units, logi...
Amrutha K, Prabu P and Ramesh Chandra Poonia
Sign language is a natural, structured, and complete form of communication to exchange information. Non-verbal communicators, also referred to as hearing impaired and hard of hearing (HI&HH), consider sign language an elemental mode of communication ...
Pablo Sarabia, Alvaro Araujo, Luis Antonio Sarabia and María de la Cruz Ortiz
Surface electromyography (sEMG) plays a crucial role in several applications, such as prosthetic control, human–machine interfaces (HMI), rehabilitation, and disease diagnosis. These applications usually run in real time, so the classifier...
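To make the real-time constraint concrete, here is a minimal sketch of classifying a continuous sEMG stream with a sliding window so that a decision is produced every few tens of milliseconds. The window and stride sizes, the feature function, and the pretrained classifier object are placeholders, not the paper's pipeline:

```python
import numpy as np
from collections import deque

WINDOW = 200   # samples per decision window (assumed, e.g. 200 ms at 1 kHz)
STRIDE = 50    # new samples between consecutive decisions (assumed)

def extract_features(window):
    # Placeholder feature vector: mean absolute value and RMS of the window
    return np.array([np.mean(np.abs(window)), np.sqrt(np.mean(window ** 2))])

def stream_classify(sample_source, classifier):
    """Classify an sEMG stream online; `classifier` is any fitted model with .predict()."""
    buffer = deque(maxlen=WINDOW)
    since_last = 0
    for sample in sample_source:          # e.g. samples arriving from an acquisition device
        buffer.append(sample)
        since_last += 1
        if len(buffer) == WINDOW and since_last >= STRIDE:
            since_last = 0
            features = extract_features(np.asarray(buffer))
            yield classifier.predict(features.reshape(1, -1))[0]
```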
Oliver Ohneiser, Marcus Biella, Axel Schmugler and Matt Wallace
Current Air Traffic Controller working positions (CWPs) are reaching their capacity owing to increasing levels of air traffic. The multimodal CWP prototype TriControl combines automatic speech recognition, multitouch gestures, and eye-tracking, aiming fo...
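For a rough picture of how three input modalities can be merged into a single controller input, the following sketch fuses a gaze-selected aircraft, a recognized spoken command, and a confirming touch gesture. This is illustrative of multimodal fusion in general; the event names, fields, and confirmation rule are assumptions, not the TriControl design:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GazeEvent:
    callsign: str          # aircraft currently fixated by the controller

@dataclass
class SpeechEvent:
    command: str           # e.g. "descend"
    value: str             # e.g. "FL120"

@dataclass
class TouchEvent:
    confirmed: bool        # multitouch gesture used as confirmation

def fuse(gaze: Optional[GazeEvent],
         speech: Optional[SpeechEvent],
         touch: Optional[TouchEvent]) -> Optional[str]:
    """Combine the three modalities into one clearance once all parts are present."""
    if gaze and speech and touch and touch.confirmed:
        return f"{gaze.callsign}: {speech.command} {speech.value}"
    return None

clearance = fuse(GazeEvent("DLH123"), SpeechEvent("descend", "FL120"), TouchEvent(True))
# -> "DLH123: descend FL120"
```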
Dinh-Son Tran, Ngoc-Huynh Ho, Hyung-Jeong Yang, Eu-Tteum Baek, Soo-Hyung Kim and Gueesang Lee
Using hand gestures is a natural method of interaction between humans and computers. We use gestures to express meaning and thoughts in our everyday conversations. Gesture-based interfaces are used in many applications in a variety of fields, such as sma...
Che-Ming Chang, Chern-Sheng Lin, Wei-Cheng Chen, Chung-Ting Chen and Yu-Liang Hsu
The human–machine interface with head control can be applied in many domains. This technology is particularly valuable for people who cannot use their hands, enabling them to use a computer or to speak. This study combines several image processin...
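As a rough illustration of the general approach (a sketch under assumptions, not this study's pipeline: the Haar cascade, smoothing factor, and screen mapping are placeholders), the face position in a webcam frame can be mapped to a cursor position with OpenCV:

```python
import cv2
import numpy as np

# Assumed: the default frontal-face Haar cascade shipped with OpenCV
cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
SCREEN_W, SCREEN_H = 1920, 1080   # assumed screen resolution

def head_to_cursor(frame, prev=None, alpha=0.3):
    """Map the detected face centre to screen coordinates with simple smoothing."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return prev                                   # keep the previous cursor position
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # take the largest detected face
    fh, fw = gray.shape
    cx = (x + w / 2) / fw * SCREEN_W
    cy = (y + h / 2) / fh * SCREEN_H
    if prev is None:
        return np.array([cx, cy])
    # Exponential smoothing to reduce jitter in the cursor movement
    return alpha * np.array([cx, cy]) + (1 - alpha) * np.asarray(prev)
```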
Miguel Pfitscher, Daniel Welfer, Evaristo José do Nascimento, Marco Antonio de Souza Leite Cuadros and Daniel Fernando Tello Gamarra
Pages 121–134
In this paper, we use data from the Microsoft Kinect sensor, processing the captured image of a person and extracting the joints information in every frame. We then propose the creation of an image derived from all the sequential frames of a gest...
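The core idea of turning a joint sequence into a single image can be sketched as follows. This is a hedged sketch with assumed shapes (20 Kinect joints with x, y, z coordinates per frame); the resampling, normalization, and image layout are illustrative, not necessarily the paper's encoding:

```python
import numpy as np

def gesture_to_image(joint_frames, target_frames=64):
    """Encode a Kinect joint sequence as one image.

    joint_frames : (T, J, 3) array of x, y, z coordinates for J joints over T frames
    Returns a (target_frames, J, 3) uint8 image: rows = time, columns = joints, channels = x/y/z.
    """
    frames = np.asarray(joint_frames, dtype=np.float64)
    T = frames.shape[0]
    # Resample the sequence to a fixed number of rows so every gesture yields the same size
    idx = np.linspace(0, T - 1, target_frames).round().astype(int)
    frames = frames[idx]
    # Normalize each coordinate to the 0..255 range of an 8-bit image
    mins = frames.min(axis=(0, 1), keepdims=True)
    maxs = frames.max(axis=(0, 1), keepdims=True)
    img = (frames - mins) / (maxs - mins + 1e-9) * 255.0
    return img.astype(np.uint8)

# Example: a gesture recorded as 90 frames of 20 joints with (x, y, z) each
image = gesture_to_image(np.random.rand(90, 20, 3))   # -> shape (64, 20, 3)
```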