Abstract:
Facial emotion recognition has received increasing attention in recent years due to its
potential applications in various fields such as human-computer interaction, security,
and healthcare. In this context, the orientation of a face has been identified as an
important factor affecting the accuracy of facial emotion recognition.
Emotions can be expressed in different forms; this thesis focuses on facial emotional expressions.
Facial emotion recognition is important in daily life because it can be used to
help people in emergencies or for rapid crime prevention.
Facial emotion recognition could also be used in a variety of applications, such
as HCI, driver warning systems, automated tutoring systems, image and video
retrieval, and smart environments.
Two methodological approaches are used in this research: a baseline model and
a proposed model. Both models classify face orientation directions and facial
emotions. The models use Hopenet to estimate the head pose angles pitch, yaw,
and roll, which are then mapped to one of five directions: forward, left,
right, up, and down.
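The mapping from the continuous pitch, yaw, and roll angles to the five discrete directions could be sketched as simple thresholding. The 15-degree threshold and the sign conventions below are illustrative assumptions, not values taken from the thesis:

```python
def pose_to_direction(pitch, yaw, roll, threshold=15.0):
    """Map head pose Euler angles (in degrees) to a coarse direction label.

    Assumptions (illustrative only): positive yaw means the head is turned
    left, positive pitch means it is tilted up, and any angle within the
    threshold of zero counts as facing forward. Roll is accepted for
    completeness but is not needed to separate these five directions.
    """
    if yaw > threshold:
        return "left"
    if yaw < -threshold:
        return "right"
    if pitch > threshold:
        return "up"
    if pitch < -threshold:
        return "down"
    return "forward"

print(pose_to_direction(pitch=2.0, yaw=30.0, roll=0.0))   # strong yaw
print(pose_to_direction(pitch=1.0, yaw=-3.0, roll=0.0))   # near-zero angles
```

Under these assumed conventions, the first call yields "left" and the second "forward"; a real implementation would calibrate the thresholds and signs against the head pose estimator's output.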
Pre-trained models such as MobileNetV3-Small, ResNet-18, GoogLeNet, and others
are used to classify emotions and to investigate the relationship between
facial emotion classification and head pose orientation.