Thomas Kopalidis, Vassilios Solachidis, Nicholas Vretos and Petros Daras
Recent technological developments have enabled computers to identify and categorize facial expressions to determine a person's emotional state in an image or a video. This process, called "Facial Expression Recognition (FER)", has become one of the most ...
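As a rough illustration of the FER pipeline described above (not any specific model from the surveyed work), the sketch below classifies a 48x48 grayscale face crop into one of seven basic emotions with a small convolutional network; the input size, emotion set, and layer sizes are assumptions loosely following FER-2013-style datasets.

```python
# Minimal FER sketch: a small CNN that maps a 48x48 grayscale face crop
# to one of seven basic emotions. Architecture and sizes are illustrative
# assumptions, not a model from the surveyed papers.
import torch
import torch.nn as nn

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

class TinyFERNet(nn.Module):
    def __init__(self, num_classes: int = len(EMOTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 48 -> 24
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 24 -> 12
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 12 * 12, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Usage: one fake 48x48 grayscale face crop -> predicted emotion label.
model = TinyFERNet().eval()
with torch.no_grad():
    logits = model(torch.rand(1, 1, 48, 48))
print(EMOTIONS[int(logits.argmax(dim=1))])
```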
Ahmed J. Obaid and Hassanain K. Alrammahi
Recognizing facial expressions plays a crucial role in various multimedia applications, such as human-computer interactions and the functioning of autonomous vehicles. This paper introduces a hybrid feature extraction network model to bolster the discrim...
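The abstract is cut off before the hybrid feature extractor is specified, so the sketch below is only an assumed illustration of the general pattern: two complementary face descriptors (here HOG for shape and an LBP histogram for texture) are concatenated into one feature vector and fed to a standard classifier.

```python
# Hybrid feature extraction sketch (assumed illustration, not the paper's
# exact network): fuse two complementary face descriptors (HOG + LBP
# histogram) into one vector and classify emotions with an SVM.
import numpy as np
from skimage.feature import hog, local_binary_pattern
from sklearn.svm import SVC

def hybrid_features(face48: np.ndarray) -> np.ndarray:
    """face48: 48x48 grayscale face crop with values in [0, 1]."""
    shape_desc = hog(face48, orientations=9, pixels_per_cell=(8, 8),
                     cells_per_block=(2, 2))                        # gradient/shape cues
    lbp = local_binary_pattern(face48, P=8, R=1, method="uniform")  # texture cues
    texture_desc, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    return np.concatenate([shape_desc, texture_desc])               # fused descriptor

# Toy usage with random "faces" and cyclic labels, just to show the flow.
rng = np.random.default_rng(0)
X = np.stack([hybrid_features(rng.random((48, 48))) for _ in range(20)])
y = np.arange(20) % 7                      # 7 emotion classes (placeholder labels)
clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict(X[:3]))
```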
Zouheir Trabelsi, Fady Alnajjar, Medha Mohan Ambali Parambil, Munkhjargal Gochoo and Luqman Ali
Effective classroom instruction requires monitoring student participation and interaction during class, identifying cues to stimulate their attention. The ability of teachers to analyze and evaluate students' classroom behavior is becoming a crucial crite...
Alexandros Rouchitsas and Håkan Alm
Pedestrians base their street-crossing decisions on vehicle-centric as well as driver-centric cues. In the future, however, drivers of autonomous vehicles will be preoccupied with non-driving related activities and will thus be unable to provide pedestri...
Aayushi Chaudhari, Chintan Bhatt, Achyut Krishna and Pier Luigi Mazzeo
Automated emotion recognition has proven to be a highly powerful tool in several fields. Mapping different facial expressions to their respective emotional states is the main objective of facial emotion recognition (FER). In this study, fac...
Gayathri Soman, M. V. Vivek, M. V. Judy, Elpiniki Papageorgiou and Vassilis C. Gerogiannis
Focusing on emotion recognition, this paper addresses emotion classification and its accuracy by investigating the capabilities of a distributed ensemble model using precision-based weighted blending. Research on ...
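The abstract names precision-based weighted blending without detailing it; the sketch below shows one plausible reading (an assumption, not the authors' formulation): each base model contributes class probabilities weighted by its validation precision.

```python
# Sketch of precision-weighted blending (assumed reading of the abstract,
# not the authors' exact method): blend each base model's class
# probabilities using weights proportional to its validation precision.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=600, n_features=20, n_classes=3,
                           n_informative=10, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

models = [LogisticRegression(max_iter=1000), RandomForestClassifier(random_state=0)]
weights, probs = [], []
for m in models:
    m.fit(X_tr, y_tr)
    # Weight = macro-averaged precision on the held-out validation split.
    weights.append(precision_score(y_val, m.predict(X_val), average="macro"))
    probs.append(m.predict_proba(X_val))

weights = np.array(weights) / np.sum(weights)          # normalise weights to sum to 1
blended = sum(w * p for w, p in zip(weights, probs))   # weighted average of probabilities
y_pred = blended.argmax(axis=1)
print("blend weights:", np.round(weights, 3))
print("validation accuracy of blend:", np.mean(y_pred == y_val))
```

In practice the blend weights would be computed on a split separate from the final evaluation data; one validation split is reused here only to keep the sketch short.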
Ali Bou Nassif, Ismail Shahin, Mohammed Lataifeh, Ashraf Elnagar and Nawel Nemmour
Speech signals carry various bits of information relevant to the speaker such as age, gender, accent, language, health, and emotions. Emotions are conveyed through modulations of facial and vocal expressions. This paper conducts an empirical comparison o...
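As a hedged companion to the empirical comparison the abstract describes, the sketch below cross-validates a few standard classifiers on MFCC-style feature vectors; the random features, label set, and classifier list are placeholders rather than the paper's actual experimental setup.

```python
# Sketch comparing standard classifiers on speech-emotion feature vectors
# (e.g. MFCC statistics). The random features below stand in for real
# acoustic features; classifiers and metric are assumptions, not the
# paper's actual experimental setup.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 39))          # 200 utterances x 39-dim MFCC-style features
y = rng.integers(0, 4, size=200)        # 4 emotion classes (placeholder labels)

candidates = {
    "SVM (RBF)": SVC(kernel="rbf"),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "MLP": MLPClassifier(hidden_layer_sizes=(64,), max_iter=500),
}
for name, clf in candidates.items():
    scores = cross_val_score(clf, X, y, cv=5)      # 5-fold cross-validated accuracy
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```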
Yukuan Sun, Hangming Zhang, Shengjiao Yang and Jianming Wang
Sarcasm often manifests itself in implicit language and exaggerated expressions, for instance an elongated word, a sarcastic phrase, or a change of tone. Most research on sarcasm detection has recently been based on text and image information. In t...
Jong-Gyu Shin, Ga-Young Choi, Han-Jeong Hwang and Sang-Ho Kim
With the development of artificial intelligence technology, voice-based intelligent systems (VISs), such as AI speakers and virtual assistants, are intervening in human life. VISs are emerging in a new way, called human-AI interaction, which is different...
Chang-Min Kim, Ellen J. Hong, Kyungyong Chung and Roy C. Park
As people communicate with each other, they use gestures and facial expressions as a means to convey and understand emotional state. Non-verbal means of communication are essential to understanding, as they provide external clues to a person's emotional state. ...