A multi-modal emotion recogniser based on the integration of multiple fusion methods
Article description
People naturally express emotions in several different ways at the same time. Multi-modal methods are therefore becoming popular for emotion recognition and for analysing reactions to many aspects of daily life. This research work presents a multi-modal method for emotion recognition from images. The multi-modal method...
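For orientation only, the sketch below illustrates the kind of branched multi-modal fusion the abstract describes: one deep network per modality (face, body, context), an EmbraceNet-style stochastic fusion branch combined with a plain concatenation branch, and a shared emotion classifier head. It is a minimal PyTorch sketch under assumed module names, feature sizes and fusion details, not the thesis's actual EmbraceNet+ implementation.

```python
# Hypothetical sketch of a branched multi-modal emotion classifier.
# All names, dimensions and the exact way the branches are merged are
# assumptions made here for illustration; only the overall idea (per-modality
# encoders + EmbraceNet-style fusion + another fusion branch) comes from the
# abstract.
import torch
import torch.nn as nn


class EmbraceFusion(nn.Module):
    """Simplified EmbraceNet-style fusion: dock each modality's features into a
    common embedding, then, for every embedding index, sample which modality
    contributes that coordinate."""

    def __init__(self, in_dims, embed_dim):
        super().__init__()
        self.docks = nn.ModuleList([nn.Linear(d, embed_dim) for d in in_dims])

    def forward(self, features):
        # (batch, n_modalities, embed_dim) after docking each modality.
        docked = torch.stack(
            [dock(f) for dock, f in zip(self.docks, features)], dim=1
        )
        batch, n_mod, embed_dim = docked.shape
        # Uniform choice of contributing modality per embedding index.
        probs = torch.full((batch, n_mod), 1.0 / n_mod, device=docked.device)
        choice = torch.multinomial(probs, embed_dim, replacement=True)
        mask = nn.functional.one_hot(choice, n_mod).permute(0, 2, 1).float()
        return (docked * mask).sum(dim=1)  # (batch, embed_dim)


class BranchedMultiModalClassifier(nn.Module):
    """Hypothetical branched architecture: an EmbraceNet-style branch plus a
    concatenation branch, merged before the emotion classifier head."""

    def __init__(self, in_dims=(512, 256, 256), embed_dim=128, n_classes=26):
        # n_classes=26 follows the EMOTIC discrete emotion categories.
        super().__init__()
        self.embrace = EmbraceFusion(in_dims, embed_dim)
        self.concat_proj = nn.Linear(sum(in_dims), embed_dim)
        self.head = nn.Sequential(nn.ReLU(), nn.Linear(2 * embed_dim, n_classes))

    def forward(self, face_feat, body_feat, context_feat):
        feats = [face_feat, body_feat, context_feat]
        fused = torch.cat(
            [self.embrace(feats), self.concat_proj(torch.cat(feats, dim=1))], dim=1
        )
        return self.head(fused)


# Usage with random per-modality features standing in for the outputs of
# hypothetical face, body and context backbone networks.
model = BranchedMultiModalClassifier()
logits = model(torch.randn(4, 512), torch.randn(4, 256), torch.randn(4, 256))
print(logits.shape)  # torch.Size([4, 26])
```

The two-branch merge shown here is only one plausible reading of "integrates the EmbraceNet fusion method with other fusion methods"; the thesis may combine its branches differently.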
| Field | Value |
|---|---|
| Author: | Heredia Parillo, Juanpablo Andrew |
| Format: | undergraduate thesis |
| Publication date: | 2021 |
| Institution: | Universidad Católica San Pablo |
| Repository: | UCSP-Institucional |
| Language: | English |
| OAI identifier: | oai:repositorio.ucsp.edu.pe:20.500.12590/16940 |
| Resource link: | https://hdl.handle.net/20.500.12590/16940 |
| Access level: | open access |
| Subject: | Emotion recognition; Multi-modal Method; Multiple Fusion Methods; https://purl.org/pe-repo/ocde/ford#1.02.01 |
| Field | Value |
|---|---|
| id | UCSP_ed92398d3590ef938301bbfc0162f2e0 |
| oai_identifier_str | oai:repositorio.ucsp.edu.pe:20.500.12590/16940 |
| network_acronym_str | UCSP |
| network_name_str | UCSP-Institucional |
| repository_id_str | 3854 |
| dc.title.es_PE.fl_str_mv | A multi-modal emotion recogniser based on the integration of multiple fusion methods |
| title | A multi-modal emotion recogniser based on the integration of multiple fusion methods |
| spellingShingle | A multi-modal emotion recogniser based on the integration of multiple fusion methods; Heredia Parillo, Juanpablo Andrew; Emotion recognition; Multi-modal Method; Multiple Fusion Methods; https://purl.org/pe-repo/ocde/ford#1.02.01 |
| title_short | A multi-modal emotion recogniser based on the integration of multiple fusion methods |
| title_full | A multi-modal emotion recogniser based on the integration of multiple fusion methods |
| title_fullStr | A multi-modal emotion recogniser based on the integration of multiple fusion methods |
| title_full_unstemmed | A multi-modal emotion recogniser based on the integration of multiple fusion methods |
| title_sort | A multi-modal emotion recogniser based on the integration of multiple fusion methods |
| author | Heredia Parillo, Juanpablo Andrew |
| author_facet | Heredia Parillo, Juanpablo Andrew |
| author_role | author |
| dc.contributor.advisor.fl_str_mv | Ticona Herrera, Regina Paola |
| dc.contributor.author.fl_str_mv | Heredia Parillo, Juanpablo Andrew |
| dc.subject.es_PE.fl_str_mv | Emotion recognition; Multi-modal Method; Multiple Fusion Methods |
| topic | Emotion recognition; Multi-modal Method; Multiple Fusion Methods; https://purl.org/pe-repo/ocde/ford#1.02.01 |
| dc.subject.ocde.es_PE.fl_str_mv | https://purl.org/pe-repo/ocde/ford#1.02.01 |
| description | People naturally express emotions in several different ways at the same time. Multi-modal methods are therefore becoming popular for emotion recognition and for analysing reactions to many aspects of daily life. This research work presents a multi-modal method for emotion recognition from images. The multi-modal method analyses facial expressions, body gestures and the characteristics of the body and the environment to determine an emotional state, processing each modality with a specialised deep learning model and then applying the proposed fusion method. The fusion method, called EmbraceNet+, consists of a branched architecture that integrates the EmbraceNet fusion method with other fusion methods. The tests carried out on an adaptation of the EMOTIC dataset show that the proposed multi-modal method is effective, improves on the results obtained by processing each modality individually, and is competitive with other state-of-the-art methods. The proposed method has many areas of application because it seeks to recognise emotions in any situation. Likewise, the proposed fusion method can be used in any multi-modal deep learning-based model. |
| publishDate | 2021 |
| dc.date.accessioned.none.fl_str_mv | 2021-11-29T03:04:43Z |
| dc.date.available.none.fl_str_mv | 2021-11-29T03:04:43Z |
| dc.date.issued.fl_str_mv | 2021 |
| dc.type.none.fl_str_mv | info:eu-repo/semantics/bachelorThesis |
| dc.type.version.es_PE.fl_str_mv | info:eu-repo/semantics/publishedVersion |
| format | bachelorThesis |
| status_str | publishedVersion |
| dc.identifier.other.none.fl_str_mv | 1073589 |
| dc.identifier.uri.none.fl_str_mv | https://hdl.handle.net/20.500.12590/16940 |
| identifier_str_mv | 1073589 |
| url | https://hdl.handle.net/20.500.12590/16940 |
| dc.language.iso.es_PE.fl_str_mv | eng |
| language | eng |
| dc.relation.ispartof.fl_str_mv | SUNEDU |
| dc.rights.es_PE.fl_str_mv | info:eu-repo/semantics/openAccess |
| dc.rights.uri.es_PE.fl_str_mv | https://creativecommons.org/licenses/by/4.0/ |
| eu_rights_str_mv | openAccess |
| rights_invalid_str_mv | https://creativecommons.org/licenses/by/4.0/ |
| dc.format.es_PE.fl_str_mv | application/pdf |
| dc.publisher.es_PE.fl_str_mv | Universidad Católica San Pablo |
| dc.publisher.country.es_PE.fl_str_mv | PE |
| dc.source.es_PE.fl_str_mv | Universidad Católica San Pablo; Repositorio Institucional - UCSP |
| dc.source.none.fl_str_mv | reponame:UCSP-Institucional; instname:Universidad Católica San Pablo; instacron:UCSP |
| instname_str | Universidad Católica San Pablo |
| instacron_str | UCSP |
| institution | UCSP |
| reponame_str | UCSP-Institucional |
| collection | UCSP-Institucional |
| bitstream.url.fl_str_mv | https://repositorio.ucsp.edu.pe/backend/api/core/bitstreams/1f104511-897c-426d-a7c9-c45d48bf4e75/download; https://repositorio.ucsp.edu.pe/backend/api/core/bitstreams/8e4ecfb7-6ebb-49bc-840b-b082bce1abc9/download; https://repositorio.ucsp.edu.pe/backend/api/core/bitstreams/9366cdf2-d7ca-4395-b189-eb6fef68ab65/download; https://repositorio.ucsp.edu.pe/backend/api/core/bitstreams/a981c5da-c0f7-4c6e-ae8a-8a8987a2d644/download |
| bitstream.checksum.fl_str_mv | d2c4e66ea09fefa795da1acb31ae7654; 8a4605be74aa9ea9d79846c1fba20a33; f202742bc5b8fec37f9413d702c57a92; 69a9eea7c97861ff846d530099484fb3 |
| bitstream.checksumAlgorithm.fl_str_mv | MD5; MD5; MD5; MD5 |
| repository.name.fl_str_mv | Repositorio Institucional de la Universidad Católica San Pablo |
| repository.mail.fl_str_mv | dspace@ucsp.edu.pe |
| _version_ | 1851053029721112576 |
| spelling | Ticona Herrera, Regina Paola; Heredia Parillo, Juanpablo Andrew; Licenciado en Ciencia de la Computación; Universidad Católica San Pablo, Departamento de Ciencia de la Computación; Título Profesional; Ciencia de la Computación; Programa Profesional de Ciencia de la Computación; 75151804; https://orcid.org/0000-0002-2605-5718; 40207170; https://purl.org/pe-repo/renati/type#tesis; https://purl.org/pe-repo/renati/level#tituloProfesional; 611016; Jose Eduardo Ochoa Luna; Yessenia Deysi Yari Ramos; HEREDIA_PARILLO_JUA_EMO.pdf (application/pdf); license.txt; HEREDIA_PARILLO_JUA_EMO.pdf.txt (extracted text); HEREDIA_PARILLO_JUA_EMO.pdf.jpg (generated thumbnail) |
| score | 13.472619 |
Important note:
The information contained in this record is the sole responsibility of the institution that manages the institutional repository holding this document or dataset. CONCYTEC is not responsible for the contents (publications and/or data) accessible through the National Digital Repository of Open Access Science, Technology and Innovation (ALICIA).