Facial Emotion Recognition

Two different methods to detect facial emotions of a human face

Prathibha Perera
Level Up Coding


GIF by Author

Facial emotion is a form of non-verbal communication. People express their feelings through words, but facial expressions can enrich that communication. Facial emotion recognition is a trendy topic these days. Seven basic facial emotions are commonly identified: Happy, Sad, Angry, Neutral, Fear, Disgust, and Surprise.

Human facial emotion recognition can be performed in real time or on a static image. This article shows you how to detect facial emotion in a static image using two major methods. I used Google Colab to run the code below.

1. FER library

FER is a Python library built on a deep neural network using the TensorFlow and Keras libraries; its model was trained on the dataset from Kaggle's competition Challenges in Representation Learning: Facial Expression Recognition Challenge. FER requires Python > 3.6, OpenCV >= 3.2, and TensorFlow >= 1.7.0 as dependencies. If you have not installed OpenCV and TensorFlow yet, use the commands below.

pip install opencv-contrib-python
pip install tensorflow

Then install the FER library using:

pip install fer

Then try the following code snippet.

from fer import FER
import matplotlib.pyplot as plt
%matplotlib inline
test_image = plt.imread("/content/Merlin_1.jpg")

# Faces are detected with OpenCV's Haar Cascade classifier by default.
# To use the more accurate MTCNN network, pass the parameter mtcnn=True.
emo_detector = FER(mtcnn=True)
# Capture all the emotions on the image
captured_emotions = emo_detector.detect_emotions(test_image)
# Print all captured emotions with the image
print(captured_emotions)
plt.imshow(test_image)
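detect_emotions returns a list with one entry per detected face, where each entry holds the face's bounding box and a score for each of the seven emotions. Here is a minimal sketch of parsing that structure; the scores and box coordinates below are made up for illustration, not real detector output:

```python
# Illustrative shape of FER's detect_emotions output (values are made up).
sample_result = [
    {
        "box": [48, 32, 120, 120],  # x, y, width, height of the detected face
        "emotions": {
            "angry": 0.01, "disgust": 0.00, "fear": 0.02, "happy": 0.86,
            "sad": 0.03, "surprise": 0.05, "neutral": 0.03,
        },
    }
]

# Take the first detected face and inspect its scores.
face = sample_result[0]
print(face["box"])                 # [48, 32, 120, 120]
print(face["emotions"]["happy"])   # 0.86
```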

You will get a result like below.

Image by Author

Further, you can use the following code to view the emotions extracted from the image:

captured_emotions
Image by Author-captured emotions

Then use the following code snippet to extract the top emotion from your image:

print(emo_detector.top_emotion(test_image))
Image by Author-top most emotion
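The top emotion is simply the one with the highest score. If you already have the emotions dictionary from detect_emotions, you can compute the same result yourself with Python's built-in max; the scores below are illustrative:

```python
# Illustrative emotion scores (not real detector output).
emotions = {
    "angry": 0.01, "disgust": 0.00, "fear": 0.02, "happy": 0.86,
    "sad": 0.03, "surprise": 0.05, "neutral": 0.03,
}

# max over the (emotion, score) pairs, keyed on the score.
top_emotion, top_score = max(emotions.items(), key=lambda item: item[1])
print(top_emotion, top_score)  # happy 0.86
```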

2. DeepFace

DeepFace is a lightweight facial recognition and facial attribute analysis framework for Python. It wraps several recognition models, including the DeepFace model created by a research group at Facebook, which reported an accuracy of 97.35% ± 0.25%. Using DeepFace, we can analyze facial attributes such as age, gender, emotion, and race. Let's focus on how to use DeepFace in Python.

pip install deepface

After installing deepface, use the following code to load and plot the image.

from deepface import DeepFace
import cv2
import matplotlib.pyplot as plt

img_path = 'Merlin_1.jpg'
img = cv2.imread(img_path)

# OpenCV reads images in BGR order; reverse the channels to RGB for matplotlib.
plt.imshow(img[:, :, ::-1])
Image by Author
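The slice img[:, :, ::-1] works because OpenCV stores images with the color channels in BGR order, while matplotlib expects RGB; reversing the last axis swaps them. A small NumPy sketch of that reversal on a single pixel:

```python
import numpy as np

# A 1x1 "image" with one pixel stored in BGR order: blue=10, green=20, red=30.
bgr = np.array([[[10, 20, 30]]], dtype=np.uint8)

# Reversing the channel (last) axis yields RGB order.
rgb = bgr[:, :, ::-1]
print(rgb[0, 0].tolist())  # [30, 20, 10]
```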

Then, using the following commands, you can find demographic information for the image.

demography = DeepFace.analyze(img_path)
demography
Image by Author-demography information of the image
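DeepFace.analyze returns a dictionary covering the analyzed attributes (recent library versions return a list of such dictionaries, one per face, and the exact keys vary between versions). Here is an illustrative sketch of that shape with made-up values, showing how the later commands index into it:

```python
# Illustrative shape of a DeepFace.analyze result (all values are made up,
# and key names can differ between deepface versions).
demography = {
    "age": 31,
    "dominant_emotion": "happy",
    "emotion": {
        "angry": 0.5, "disgust": 0.0, "fear": 1.2, "happy": 92.1,
        "sad": 1.7, "surprise": 2.0, "neutral": 2.5,
    },
}

print(demography["dominant_emotion"])    # happy
print(demography["emotion"]["happy"])    # 92.1
```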

Then we can extract only the emotion data by using the following command.

demography['emotion']
Image by Author-emotion data of the image

Then, if you want to extract the most prominent emotion in the image, use the following code.

demography['dominant_emotion']
Image by Author-dominant emotion

Done Guys… 😀

We have discussed two methods that you can use to recognize the facial emotions of a human face.

I hope you picked up some useful facts about recognizing facial emotions and expressions using FER and DeepFace in Python.

A future article will focus on how facial emotion recognition differs in the hearing-impaired community and how to train a dataset for real-time facial emotion recognition.

Thank you for reading, and hit the clap button if you enjoyed this article. 🙌
