The BoLD (BOdy Language Dataset) Challenge is a competition aimed at studying and understanding in-the-wild human emotion from body language. We have constructed a large-scale, fully annotated dataset, along with a model evaluation and benchmarking platform for comparing model performance on this task.
Computerized recognition of bodily emotion expression has the potential to enable many innovative applications, including information management and retrieval, public safety, patient care, and social media. In this challenge, we invite researchers to develop models that enable computers to understand human emotion from the spatiotemporal and human-pose data in BoLD, the Body Language Dataset.
Given a selected entity in a short video sequence, the model must predict both the 26 discrete emotion categories (Peace, Affection, Esteem, Anticipation, Engagement, Confidence, Happiness, Pleasure, Excitement, Surprise, Sympathy, Doubt/Confusion, Disconnection, Fatigue, Embarrassment, Yearning, Disapproval, Aversion, Annoyance, Anger, Sensitivity, Sadness, Disquietment, Fear, Pain, Suffering) and the continuous emotion dimensions (Valence, Arousal, and Dominance).
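As a rough illustration of the prediction layout per sample (26 categorical confidence scores plus 3 continuous values), here is a minimal sketch; the array shapes and variable names are assumptions for illustration, not the official submission format:

```python
import numpy as np

# The 26 discrete emotion categories and the 3 continuous dimensions.
CATEGORIES = [
    "Peace", "Affection", "Esteem", "Anticipation", "Engagement", "Confidence",
    "Happiness", "Pleasure", "Excitement", "Surprise", "Sympathy",
    "Doubt/Confusion", "Disconnection", "Fatigue", "Embarrassment", "Yearning",
    "Disapproval", "Aversion", "Annoyance", "Anger", "Sensitivity", "Sadness",
    "Disquietment", "Fear", "Pain", "Suffering",
]
CONTINUOUS = ["Valence", "Arousal", "Dominance"]

# Hypothetical predictions for a handful of video clips: one row per clip,
# one confidence score per category and one regression value per dimension.
num_clips = 4
rng = np.random.default_rng(0)
categorical_scores = rng.random((num_clips, len(CATEGORIES)))  # shape (N, 26)
continuous_values = rng.random((num_clips, len(CONTINUOUS)))   # shape (N, 3)
```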
We evaluate all test-set predictions submitted to our evaluation server by comparing them against our held-out ground truth.
For the categorical outputs, we use the Average Precision (AP, the area under the precision-recall curve) and the Area Under the Receiver Operating Characteristic curve (ROC AUC).
For the continuous outputs, we use the \(R^2\) (coefficient of determination) metric on the regression values.
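As a sketch of how these per-output metrics could be computed offline with scikit-learn (the function and variable names here are illustrative assumptions, not the official evaluation code):

```python
import numpy as np
from sklearn.metrics import average_precision_score, r2_score, roc_auc_score

def per_output_metrics(cat_true, cat_scores, cont_true, cont_pred):
    """Compute mAP, mRA, and mR^2 over all outputs.

    cat_true, cat_scores: (N, 26) arrays of binary labels and confidence scores.
    cont_true, cont_pred: (N, 3) arrays of ground-truth and predicted values.
    """
    # AP and ROC AUC per emotion category, then averaged (mAP, mRA).
    ap = [average_precision_score(cat_true[:, k], cat_scores[:, k])
          for k in range(cat_true.shape[1])]
    ra = [roc_auc_score(cat_true[:, k], cat_scores[:, k])
          for k in range(cat_true.shape[1])]
    # R^2 per continuous dimension, then averaged (mR^2).
    r2 = [r2_score(cont_true[:, d], cont_pred[:, d])
          for d in range(cont_true.shape[1])]
    return float(np.mean(ap)), float(np.mean(ra)), float(np.mean(r2))
```

Each categorical column is scored independently and then averaged, so a category missing from the ground truth of a given evaluation split would need to be handled (or excluded) before calling these functions.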
Overall model performance is compared using the Emotion Recognition Score (ERS), which is defined in the paper.
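For orientation only, here is a hedged sketch of an ERS-style combination of the mean metrics; the exact weighting below is an assumption, so please consult the paper for the authoritative definition:

```python
def emotion_recognition_score(m_ap, m_ra, m_r2):
    # Assumed form: equal weight between the regression score (mR^2) and the
    # classification score (the mean of mAP and mRA). Verify against the
    # definition in the BoLD paper before relying on this.
    return 0.5 * m_r2 + 0.25 * (m_ap + m_ra)
```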
For submissions to qualify for the BEEU challenge, teams must observe and adhere to the following rules:
If you are interested in joining the challenge, please register here: