Micro-Expression Analysis using Transformer Models
This project was carried out under the supervision of Prof. Tanmay Verlekar, in collaboration with APPCAIR.
Abstract: Micro-expressions are subtle, involuntary facial movements that typically last less than half a second. Because they are difficult to suppress, they are widely regarded as revealing a person's genuine emotions; they convey emotional leakage and are useful for lie detection. In this project we explored transformer models for micro-expression analysis, testing how they perform on this complex, fine-grained task. Because micro-expression datasets are small and challenging, we experimented with image-to-video transfer learning on transformers and, with our modifications to the architecture and loss function, achieved near state-of-the-art results (about 68% six-class classification accuracy).
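The sketch below illustrates one common way to do image-to-video transfer learning with transformers: reuse an image-pretrained ViT as a per-frame feature extractor and add a small temporal transformer on top for six-class classification. The backbone choice, layer sizes, and class count here are illustrative assumptions and not the exact architecture or loss modifications used in this project.

```python
# Minimal, illustrative sketch of image-to-video transfer learning for
# micro-expression classification. The backbone, temporal module, and
# hyperparameters are assumptions for illustration, not this project's
# exact architecture or loss function.
import torch
import torch.nn as nn
from torchvision.models import vit_b_16, ViT_B_16_Weights


class VideoFromImageViT(nn.Module):
    """Applies an image-pretrained ViT to each frame, then aggregates over time."""

    def __init__(self, num_classes: int = 6):
        super().__init__()
        # Image-pretrained backbone: the "image" half of image-to-video transfer.
        self.backbone = vit_b_16(weights=ViT_B_16_Weights.IMAGENET1K_V1)
        self.backbone.heads = nn.Identity()  # keep 768-d frame embeddings
        # Lightweight temporal transformer over the sequence of frame embeddings.
        encoder_layer = nn.TransformerEncoderLayer(d_model=768, nhead=8, batch_first=True)
        self.temporal = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.classifier = nn.Linear(768, num_classes)

    def forward(self, clips: torch.Tensor) -> torch.Tensor:
        # clips: (batch, frames, 3, 224, 224)
        b, t, c, h, w = clips.shape
        frame_feats = self.backbone(clips.reshape(b * t, c, h, w))  # (b*t, 768)
        frame_feats = frame_feats.reshape(b, t, -1)
        temporal_feats = self.temporal(frame_feats).mean(dim=1)  # pool over time
        return self.classifier(temporal_feats)  # six-class logits


if __name__ == "__main__":
    model = VideoFromImageViT()
    dummy_clip = torch.randn(2, 16, 3, 224, 224)  # two clips of 16 frames each
    print(model(dummy_clip).shape)  # torch.Size([2, 6])
```

Freezing most of the pretrained backbone and fine-tuning only the temporal layers and classifier is a common way to cope with the small size of micro-expression datasets.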