A multi-modal emotion recogniser based on the integration of multiple fusion methods
| Author: | |
|---|---|
| Format: | undergraduate thesis |
| Publication Date: | 2021 |
| Institution: | Universidad Católica San Pablo |
| Repository: | UCSP-Institucional |
| Language: | English |
| OAI Identifier: | oai:repositorio.ucsp.edu.pe:20.500.12590/16940 |
| Resource Link: | https://hdl.handle.net/20.500.12590/16940 |
| Access Level: | open access |
| Subject: | Emotion recognition; Multi-modal method; Multiple fusion methods; https://purl.org/pe-repo/ocde/ford#1.02.01 |
| Summary: | People naturally express emotions in several ways at once, so multi-modal methods are becoming popular for emotion recognition and for analysing reactions to many aspects of daily life. This research presents a multi-modal method for emotion recognition from images. The method analyses facial expressions, body gestures, and characteristics of the body and the environment to determine an emotional state, processing each modality with a specialised deep learning model and then applying the proposed fusion method. The fusion method, called EmbraceNet+, is a branched architecture that integrates the EmbraceNet fusion method with other fusion methods. Tests on an adaptation of the EMOTIC dataset show that the proposed multi-modal method is effective: it improves on the results obtained by processing each modality individually and is competitive with other state-of-the-art methods. The proposed method has many areas of application because it seeks to recognise emotions in any situation; likewise, the proposed fusion method can be used in any multi-modal deep learning-based model. |
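The summary describes a branched fusion architecture built on EmbraceNet, whose core idea is that each modality's features are projected ("docked") to a common embedding size and then each dimension of the fused vector is drawn from exactly one randomly chosen modality. The sketch below illustrates only that EmbraceNet-style stochastic selection step in NumPy; the function name and uniform modality probabilities are assumptions for illustration, not the thesis's EmbraceNet+ implementation, which additionally branches into other fusion methods.

```python
import numpy as np

def embracenet_fuse(features, rng=None):
    """Simplified EmbraceNet-style fusion (illustrative sketch).

    features: list of m arrays, each of shape (d,), i.e. modality
              outputs already projected to a common embedding size d.
    Returns a fused vector of shape (d,) in which every component
    comes from exactly one modality, chosen uniformly at random.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    stacked = np.stack(features)            # shape (m, d)
    m, d = stacked.shape
    # For each embedding dimension, pick which modality contributes it.
    choice = rng.integers(0, m, size=d)
    return stacked[choice, np.arange(d)]
```

In the full method, the inputs would be the outputs of the specialised deep models for faces, body gestures, and context, and the per-modality selection probabilities could be weighted rather than uniform; the random per-dimension selection also gives some robustness when one modality is missing or unreliable.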
Important note:
The information contained in this record is the sole responsibility of the institution that manages the institutional repository holding this document or dataset. CONCYTEC is not responsible for the contents (publications and/or data) accessible through the National Digital Repository of Open Access Science, Technology and Innovation (ALICIA).