Emotion Detection Through Facial Feature Analysis
Motivation
For humans, understanding and identifying emotions is both interesting and useful: genuine emotions are at most partially controllable and often reveal their presence through the facial expressions of the person experiencing them. A person's emotions can sometimes be distinct and obvious, and at other times transient and difficult to notice; as long as their cues are visually present, however, it should be possible for a computer to process an image of the face and classify the expression. There are many applications, ranging from entertainment and social media to criminal justice and healthcare, where the automated ability to detect a person's emotion has functional benefits. For example, content providers could gauge a person's authentic and immediate emotional response and tune their product accordingly, and health-tracking apps could monitor a user's emotional stability and fluctuation.
Goal
The goal of this project is to implement an algorithm that processes pictures of faces, matches each expression against a predetermined list of the most distinct and common emotions, and classifies it with reasonable accuracy. The image processing will likely involve steps to clean, color-balance, and segment the image. Certain features, such as the eyes, lips, and teeth, will need to be detected; these can be isolated through segmentation, and inter-feature distances, angles, and curvatures can be measured. The algorithm will be trained on readily available facial images from online databases; new test images will then be processed and classified, and the algorithm's accuracy can be measured and tuned.
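As an illustrative sketch only (the project itself will be written in MATLAB), the last two stages of the pipeline described above, turning detected facial landmarks into geometric features and classifying them against trained examples, might look like the following. The landmark names, the two features chosen, and the nearest-centroid classifier are all assumptions made for illustration, not the project's actual design:

```python
import math

def extract_features(landmarks):
    """Compute scale-invariant geometric features from named landmarks.

    `landmarks` maps hypothetical names such as 'left_eye', 'right_eye',
    'mouth_left', 'mouth_right', 'mouth_top', 'mouth_bottom' to (x, y)
    coordinates produced by the earlier segmentation/detection steps.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    # Normalize by the inter-eye distance so features are scale-invariant.
    eye_gap = dist(landmarks['left_eye'], landmarks['right_eye'])
    mouth_width = dist(landmarks['mouth_left'], landmarks['mouth_right']) / eye_gap
    mouth_open = dist(landmarks['mouth_top'], landmarks['mouth_bottom']) / eye_gap
    return [mouth_width, mouth_open]

def train_centroids(samples):
    """Train by averaging: samples is a list of (feature_vector, label).

    Returns one mean feature vector (centroid) per emotion label.
    """
    sums, counts = {}, {}
    for vec, label in samples:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def classify(vec, centroids):
    """Assign the emotion whose centroid is nearest in Euclidean distance."""
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(vec, centroids[lab])))
```

In practice the feature vector would include many more measurements (eyebrow angles, lip curvature, etc.), and the classifier would be trained on a labeled database rather than hand-placed landmarks; this sketch only shows how geometric measurements can feed a simple trained classifier.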
Tools
This project will be implemented in MATLAB; no Android development is required.