FER+ Emotion Recognition

Description

This model is a deep convolutional neural network for emotion recognition in faces.

Model

|Model|Checksum|Download (with sample test data)|ONNX version|Opset version|
|-----|--------|--------------------------------|------------|-------------|
|Emotion FERPlus|MD5|32.4 MB|1.0|2|
|Emotion FERPlus|MD5|32.4 MB|1.2|7|

Paper

"Training Deep Networks for Facial Expression Recognition with Crowd-Sourced Label Distribution" arXiv:1608.01041

Dataset

The model is trained on the FER+ annotations for the standard Emotion FER dataset, as described in the above paper.

Source

The model is trained in CNTK, using the cross entropy training mode. You can find the source code here.

Inference

Input

The model expects a grayscale face image as input, with shape (Nx1x64x64), where N is the batch size.

Preprocessing

Given a numpy array img representing the image you would like to score:

import numpy as np

def preprocess(img):
    # img is expected to be a 64x64 grayscale face image as a numpy array
    input_shape = (1, 64, 64)
    img = np.resize(img, input_shape)    # add the channel dimension
    img = np.expand_dims(img, axis=0)    # add the batch dimension -> (1, 1, 64, 64)
    return img
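
As an illustrative usage sketch (assuming Pillow for image loading; the file name face.png is hypothetical), a face crop can be converted to the expected grayscale 64x64 input like this:

import numpy as np
from PIL import Image   # assumption: Pillow is used for image loading

img = Image.open("face.png").convert("L")    # "face.png" is a hypothetical file; convert to grayscale
img = img.resize((64, 64))                   # match the 64x64 model input
input_tensor = preprocess(np.array(img).astype(np.float32))
print(input_tensor.shape)                    # (1, 1, 64, 64)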

Output

The model outputs a (1x8) array of scores corresponding to the 8 emotion classes, where the labels map as follows:
emotion_table = {'neutral':0, 'happiness':1, 'surprise':2, 'sadness':3, 'anger':4, 'disgust':5, 'fear':6, 'contempt':7}
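
As a minimal scoring sketch, assuming onnxruntime is installed, the model file is available locally (the name emotion-ferplus.onnx is illustrative), and input_tensor comes from the preprocessing step above:

import onnxruntime as ort   # assumption: onnxruntime is used for scoring

session = ort.InferenceSession("emotion-ferplus.onnx")      # illustrative model file name
input_name = session.get_inputs()[0].name
scores = session.run(None, {input_name: input_tensor})[0]   # shape (1, 8)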

Postprocessing

Pass the model output through a softmax function to map the raw network activations to probabilities across the 8 classes.

import numpy as np

def softmax(scores):
    # numerically stable softmax
    e = np.exp(scores - np.max(scores))
    return e / e.sum()

def postprocess(scores):
    '''
    This function takes the scores generated by the network and returns the class IDs in
    decreasing order of probability.
    '''
    prob = softmax(scores)
    prob = np.squeeze(prob)
    classes = np.argsort(prob)[::-1]
    return classes
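
For example, the class IDs can be mapped back to emotion names with the emotion_table above (a small illustrative snippet, reusing the scores from the inference sketch):

class_ids = postprocess(scores)
id_to_emotion = {v: k for k, v in emotion_table.items()}
print(id_to_emotion[class_ids[0]])   # most probable emotion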

Sample test data

Sets of sample input and output files are provided in

  • serialized protobuf TensorProtos (.pb), which are stored in the folders test_data_set_*/ (see the loading sketch below).
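
These files can be read back into numpy arrays with the onnx package, roughly as follows (the path below is illustrative):

import onnx
from onnx import numpy_helper

tensor = onnx.TensorProto()
with open("test_data_set_0/input_0.pb", "rb") as f:   # illustrative path within the sample data
    tensor.ParseFromString(f.read())
sample_input = numpy_helper.to_array(tensor)
print(sample_input.shape)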

License

MIT