Facial Expressions and Emotion Database



The FG-NET Database with Facial Expressions and Emotions was recorded in a project at the Technical University of Munich. It is an image database containing face images of a number of subjects performing the six basic emotions defined by Ekman & Friesen. The database was developed to assist researchers who investigate the effects of different facial expressions. It was recorded in conjunction with the master's thesis project of Michael Hawellek and has been generated as part of the European Union project FG-NET (Face and Gesture Recognition Research Network). The following document describes the contents of the database and provides information on its use and distribution.

Background and Setup

One of the dominant problems when gathering a database of emotions and facial expressions is the phenomenon that acted emotions differ from natural ones. One of the underlying paradigms of this database is therefore to let the observed people react as naturally as possible. Consequently, the aim was to evoke genuine emotions by playing video clips or still images after a short introduction phase, instead of asking the person to play a role. Head movements in all directions are therefore also allowed. The covered states are 1. Happiness, 2. Disgust, 3. Anger, 4. Fear, 5. Sadness, 6. Surprise and 7. Neutral.

To obtain a simple but well-defined environment, a camera was placed on top of a regular 19'' computer screen. With a dual-headed workstation it is then possible to play a video clip while simultaneously starting the capture process at the moment the emotion is expected to begin. A sketch of the setup is depicted below.

The left video below shows a short introduction and a description of how the emotion typically looks. The second clip is then played with the intention of provoking spontaneous behaviour in the test person.

Below are some preprocessed and cropped example images, together with an animated image sequence of a person reacting to the initiated animation sequence, which in this case is happiness. As in the example below, all acquired sequences start from the neutral state and pass into the emotional state.




The images were acquired using a Sony XC-999P camera equipped with an 8 mm COSMICAR 1:1.4 television lens. A BTTV 878 framegrabber card was used to grab the images at a size of 640x480 pixels, a colour depth of 24 bits and a framerate of 25 frames per second. For capacity reasons, the images were converted into 8-bit JPEG-compressed images with a size of 320x240 pixels.
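The conversion step can be sketched as follows. This is an illustrative snippet, not the tool used for the original recordings: the function name and the use of the Pillow library are assumptions, and "8 bit" is interpreted here as single-channel greyscale.

```python
from PIL import Image

def convert_frame(src, dst):
    """Hypothetical sketch: turn a raw 640x480, 24-bit captured frame
    into the stored format, i.e. an 8-bit, 320x240, JPEG-compressed image."""
    img = Image.open(src)          # raw RGB frame from the framegrabber
    img = img.convert("L")         # 24-bit colour -> 8-bit greyscale (assumed)
    img = img.resize((320, 240))   # halve each dimension
    img.save(dst, "JPEG")          # JPEG compression for storage
```

Halving each dimension and dropping colour reduces a 900 KB raw frame to a fraction of that size before JPEG compression is even applied.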

So far, the database contains material gathered from 18 different individuals; it is intended to expand this gallery in the future. Each individual performed all six desired emotions three times. Additionally, three sequences showing no expression at all were recorded. Altogether this gives a total of 399 sequences. Depending on the kind of emotion, a single recorded sequence can last up to several seconds.

Structure of the Database

The database will be distributed in the form of a password-protected ZIP archive and a collection of MPEG-compressed movies. After extraction, the images are stored separately in subdirectories as follows: {anger,disgs,fears,happy,neutr,sadns,surpr}/%.4d_[123]/p_%.3d.pgm. A transcription with the metadata of the start, apex, and hold frames can be found here. (Thanks to Christoph Mayer!) A slightly more detailed description can be found in this document.
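Given that layout, the extracted archive can be traversed as in the following sketch. The helper name and the use of Python are assumptions; the patterns simply follow the printf-style scheme above (four-digit sequence id, take number 1-3, three-digit frame index).

```python
import re
from pathlib import Path

# Emotion subdirectories as named in the archive layout above.
EMOTIONS = ["anger", "disgs", "fears", "happy", "neutr", "sadns", "surpr"]
SEQ_DIR = re.compile(r"^(\d{4})_([123])$")   # %.4d_[123]: sequence id, take
FRAME = re.compile(r"^p_(\d{3})\.pgm$")      # p_%.3d.pgm: frame index

def list_sequences(root):
    """Yield (emotion, sequence id, take, frame paths) for every sequence
    directory found under the extracted archive root (hypothetical helper)."""
    root = Path(root)
    for emotion in EMOTIONS:
        emo_dir = root / emotion
        if not emo_dir.is_dir():
            continue
        for seq in sorted(emo_dir.iterdir()):
            m = SEQ_DIR.match(seq.name)
            if not m:
                continue
            frames = sorted(p for p in seq.iterdir() if FRAME.match(p.name))
            yield emotion, int(m.group(1)), int(m.group(2)), frames
```

Sorting the frame paths lexicographically is sufficient here because the zero-padded `p_%.3d` names sort in frame order.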

Distribution, Conditions and Acknowledgements

The database will be made available to members of the FG-NET consortium or upon request. Researchers wishing to obtain the database should send an email to frank.wallhoff@jade-hs.de. This form should be completed and sent as an attachment together with the email.

The database may only be used for research purposes. Researchers using the database may publish results obtained with it in scientific journals and conference proceedings. The database may not be used for commercial purposes. Users of the database are expected to acknowledge the FG-NET consortium. Usage must be indicated by citing this article:

Frank Wallhoff, Björn Schuller, Michael Hawellek, Gerhard Rigoll: Efficient Recognition of Authentic Dynamic Facial Expressions on the Feedtum Database. In Proc. IEEE ICME, pages 493-496, IEEE Computer Society, 2006.

@inproceedings{WallhoffSHR06,
  author = {Wallhoff, Frank and Schuller, Bj{\"o}rn and Hawellek, Michael and Rigoll, Gerhard},
  booktitle = {ICME},
  crossref = {conf/icmcs/2006},
  ee = {http://doi.ieeecomputersociety.org/10.1109/ICME.2006.262433},
  isbn = {1-4244-0367-7},
  keywords = {dblp},
  pages = {493-496},
  publisher = {IEEE Computer Society},
  title = {Efficient Recognition of Authentic Dynamic Facial Expressions on the Feedtum Database.},
  url = {http://dblp.uni-trier.de/db/conf/icmcs/icme2006.html#WallhoffSHR06},
  year = {2006}
}