Cheng Qiu’s scientific contributions

What is this page?


This page lists works of an author who doesn't have a ResearchGate profile or hasn't added the works to their profile yet. It is automatically generated from public (personal) data to further our legitimate goal of comprehensive and accurate scientific recordkeeping. If you are this author and want this page removed, please let us know.

Publications (1)


Figure 1: Saliency maps for trained models on Fer2013 and MaskFer
Figure 2: Framework for the Perturb Scheme
Figure 3: Distribution of classes in the Fer2013 dataset
Figure 4: Facial image after applying clustering (n = 3) based on the attention
Figure 5: Performance for different numbers of clusters for the VGG16 architecture

Leaving Some Facial Features Behind
  • Preprint
  • File available

October 2024 · Cheng Qiu

Facial expressions are crucial to human communication, offering insights into emotional states. This study examines how specific facial features influence emotion classification, using facial perturbations on the Fer2013 dataset. As expected, models trained on data with certain important facial features removed experienced up to an 85% accuracy drop relative to baseline for emotions such as happy and surprise. Surprisingly, for the emotion disgust, there appears to be a slight improvement in classifier accuracy after the mask is applied. Building on this observation, we applied a training scheme that masks out facial features during training, motivating our proposed Perturb Scheme. This scheme, with three phases (attention-based classification, pixel clustering, and feature-focused training), demonstrates improvements in classification accuracy. The experimental results suggest there are some benefits to removing individual facial features in emotion recognition tasks.
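The pixel-clustering and masking step of the scheme can be illustrated with a minimal sketch. This is not the paper's code: the function name, the simple 1-D k-means over attention values, the quantile initialization, and the choice to zero out the highest-attention cluster are all illustrative assumptions, standing in for whatever attention map and clustering the authors actually use.

```python
import numpy as np

def perturb_mask(image, attention, n_clusters=3, n_iters=10):
    """Illustrative sketch (not the paper's implementation): cluster pixels
    by attention value with a simple 1-D k-means, then zero out the pixels
    in the highest-attention cluster to 'remove' a facial feature."""
    values = attention.ravel().astype(float)
    # Deterministic init: spread centroids over the attention distribution.
    centroids = np.quantile(values, np.linspace(0.0, 1.0, n_clusters))
    for _ in range(n_iters):
        # Assign each pixel to its nearest centroid.
        labels = np.abs(values[:, None] - centroids[None, :]).argmin(axis=1)
        # Update each non-empty cluster's centroid.
        for k in range(n_clusters):
            if np.any(labels == k):
                centroids[k] = values[labels == k].mean()
    labels = labels.reshape(attention.shape)
    top = centroids.argmax()          # cluster with the highest mean attention
    masked = image.copy()
    masked[labels == top] = 0         # mask out the most attended region
    return masked, labels
```

On a Fer2013-sized 48x48 image, applying this with n_clusters=3 leaves low-attention pixels untouched and zeroes only the region the attention map singles out, which is the kind of perturbation whose accuracy effect the study measures.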
