FRU-Adapter: Frame Recalibration Unit Adapter for Dynamic Facial Expression Recognition

Myungbeom Her, Hamza Ghulam Nabi, Ji Hyeong Han

Research output: Contribution to journal › Article › peer-review

Abstract

Dynamic facial expression recognition (DFER) is one of the most important challenges in computer vision, as it plays a crucial role in human–computer interaction. Recently, adapter-based approaches have been introduced into DFER and have achieved remarkable success. However, these adapters still suffer from two problems: they overlook irrelevant frames, and they interfere with pre-trained information. In this paper, we propose a frame recalibration unit adapter (FRU-Adapter), which combines the strengths of a frame recalibration unit (FRU) and temporal self-attention (T-SA) to address these issues. The FRU first recalibrates the frames by emphasizing important frames and suppressing less relevant ones. The recalibrated frames are then fed into T-SA to capture the correlations between meaningful frames. As a result, the FRU-Adapter captures enhanced temporal dependencies while accounting for the irrelevant frames in a clip. Furthermore, we propose attaching the FRU-Adapter to each encoder layer in parallel to reduce the loss of pre-trained information. Notably, the FRU-Adapter uses only 2% of the total training parameters per task while achieving improved accuracy. Extensive experiments on DFER tasks show that the proposed FRU-Adapter not only outperforms state-of-the-art models but also exhibits parameter efficiency. The source code will be made publicly available.
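The pipeline described above can be sketched in a few lines of NumPy. This is a minimal, hypothetical illustration only: the abstract does not specify the FRU's internal architecture, so the per-frame sigmoid gating, the single-head attention, the dimensions, and all function names (`fru`, `temporal_self_attention`) here are assumptions, not the authors' implementation. The final residual addition mirrors the paper's idea of attaching the adapter in parallel to a frozen encoder stream.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fru(frames, w):
    """Hypothetical frame recalibration unit: score each frame with a
    learned vector w, squash to (0, 1) with a sigmoid, and rescale frames
    so that important frames are emphasized and irrelevant ones suppressed."""
    scores = 1.0 / (1.0 + np.exp(-(frames @ w)))      # (T,)
    return frames * scores[:, None], scores           # (T, D), (T,)

def temporal_self_attention(frames, wq, wk, wv):
    """Single-head self-attention across the T frames of a clip."""
    q, k, v = frames @ wq, frames @ wk, frames @ wv
    attn = softmax(q @ k.T / np.sqrt(k.shape[-1]), axis=-1)  # (T, T)
    return attn @ v                                          # (T, D)

T, D = 8, 16                         # illustrative clip length / feature dim
frames = rng.standard_normal((T, D)) # per-frame features from a frozen encoder
w = rng.standard_normal(D)
wq, wk, wv = (rng.standard_normal((D, D)) for _ in range(3))

recal, scores = fru(frames, w)                        # recalibrate frames
out = temporal_self_attention(recal, wq, wk, wv)      # correlate meaningful frames
adapted = frames + out   # parallel attachment: adapter output added to the stream
```

Because the adapter branch is additive, the frozen encoder's features pass through unchanged, which is the mechanism the abstract credits with reducing the loss of pre-trained information.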

Original language: English
Article number: 978
Journal: Electronics (Switzerland)
Volume: 14
Issue number: 5
State: Published - Mar 2025

Keywords

  • dynamic facial expression recognition
  • frame recalibration unit
  • frame recalibration unit adapter

